Cluster Analysis in Python

Course Content

1. K-Means Algorithm
2. K-Medoids Algorithm
3. Hierarchical Clustering
4. Spectral Clustering

Peculiarity of Spectral Clustering

The result of the last chapter was great! Spectral clustering correctly figured out the structure of the clusters, unlike the K-Means and K-Medoids algorithms.

Thus, spectral clustering is very useful when clusters intersect or overlap, or when cluster means and center points are not meaningful representatives of the clusters.

Let's explore such a case. Consider the 2-D training set of points whose scatter plot is shown below.

It looks like 4 circles, and therefore 4 clusters, doesn't it? But here is what K-Means shows us.
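The referenced K-Means plot is not reproduced on this page, but the effect is easy to recreate. Below is a minimal sketch that uses a synthetic stand-in for the course data (two noisy concentric circles from make_circles, since the original four-circle dataset is not included here) and clusters it with K-Means. Because K-Means assigns points to the nearest centroid, it slices the rings into wedges instead of separating them.

```python
import matplotlib.pyplot as plt
from sklearn.cluster import KMeans
from sklearn.datasets import make_circles

# Synthetic stand-in for the ring-shaped data (two circles instead of four)
X, _ = make_circles(n_samples=500, factor=0.4, noise=0.05, random_state=42)

# K-Means looks for compact clusters around centroids, so it cannot
# follow the ring shapes and instead cuts them into sectors
kmeans = KMeans(n_clusters=2, n_init=10, random_state=42)
labels = kmeans.fit_predict(X)

plt.scatter(X[:, 0], X[:, 1], c=labels)
plt.title('K-Means on concentric circles')
plt.show()
```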

Not what we expected to see. Let's find out how spectral clustering deals with this data.

Please note that the spectral clustering algorithm may take a long time to run, since it relies on computationally heavy linear algebra: it builds a pairwise affinity matrix over all samples and eigendecomposes the corresponding graph Laplacian.
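As a rough illustration of where that cost comes from, here is a minimal sketch (reusing the synthetic make_circles stand-in from above, since the course dataset is not included here) that fits a SpectralClustering model and inspects the stored affinity matrix, which has one row and one column per sample:

```python
from sklearn.cluster import SpectralClustering
from sklearn.datasets import make_circles

# Synthetic stand-in: 1000 points on two noisy concentric circles
X, _ = make_circles(n_samples=1000, factor=0.4, noise=0.05, random_state=42)

model = SpectralClustering(n_clusters=2)
model.fit(X)

# The fitted model keeps the full pairwise affinity matrix, so memory use
# and eigendecomposition time grow rapidly with the number of samples
print(model.affinity_matrix_.shape)  # (1000, 1000)
```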

Task

For the given set of 2-D points data, perform spectral clustering. Follow these steps (a sketch of a possible solution is shown after the list):

  1. Import the SpectralClustering class from sklearn.cluster.
  2. Create a SpectralClustering model with 4 clusters.
  3. Fit the model to the data and predict the labels. Save the predicted labels in the 'prediction' column of data.
  4. Build a scatter plot with the 'x' column on the x-axis and the 'y' column on the y-axis for each value of 'prediction' (a separate color for each value). Do not forget to display the plot.
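
Below is a minimal sketch of one possible solution, assuming data is a pandas DataFrame with numeric 'x' and 'y' columns as described in the task; the course platform may expect slightly different code.

```python
import matplotlib.pyplot as plt
# 1. Import SpectralClustering from sklearn.cluster
from sklearn.cluster import SpectralClustering

# 2. Create a SpectralClustering model with 4 clusters
model = SpectralClustering(n_clusters=4)

# 3. Fit the model and store the predicted labels in the 'prediction' column
data['prediction'] = model.fit_predict(data[['x', 'y']])

# 4. One scatter call per predicted label, so each cluster gets its own color
for label in sorted(data['prediction'].unique()):
    subset = data[data['prediction'] == label]
    plt.scatter(subset['x'], subset['y'], label=f'cluster {label}')

plt.legend()
plt.show()
```

If the default RBF affinity does not separate the rings cleanly, passing affinity='nearest_neighbors' to SpectralClustering is a common adjustment for this kind of data.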

Section 4. Chapter 2