Cluster Analysis in Python

Course Content

1. K-Means Algorithm
2. K-Medoids Algorithm
3. Hierarchical Clustering
4. Spectral Clustering

Section 4. Chapter 2
Peculiarity of Spectral Clustering

The result from the last chapter was great! Spectral clustering correctly recovered the structure of the clusters, unlike the K-Means and K-Medoids algorithms.

Thus, spectral clustering is very useful when clusters intersect or overlap, or when cluster means and centers are not meaningful representatives.

Let's explore such a case. Consider the 2-D training set of points shown in the scatter plot below.

It looks like 4 circles, and therefore 4 clusters, doesn't it? But this is what K-Means shows us.

Not what we expected to see. Let's see how spectral clustering deals with this data.

Please note that the spectral clustering algorithm may take a long time to run, since it has to build an affinity matrix over all points and compute its eigendecomposition.
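
To see this behavior in code, here is a minimal sketch. The course's actual dataset is not reproduced here, so synthetic ring-shaped data stands in for it, and the affinity='nearest_neighbors' setting is one possible choice, not necessarily the one used by the course:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.cluster import KMeans, SpectralClustering

# Four concentric rings stand in for the "4 circles" data set
rng = np.random.default_rng(0)
rings = []
for radius in (1, 2, 3, 4):
    angles = rng.uniform(0, 2 * np.pi, 300)
    rings.append(np.column_stack((radius * np.cos(angles), radius * np.sin(angles))))
points = np.vstack(rings) + rng.normal(scale=0.05, size=(1200, 2))

# K-Means looks for compact, roughly spherical groups, so it cuts the rings into pie slices
kmeans_labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(points)

# Spectral clustering with a nearest-neighbors affinity follows the connected ring shapes
spectral_labels = SpectralClustering(
    n_clusters=4, affinity='nearest_neighbors', random_state=0
).fit_predict(points)

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
axes[0].scatter(points[:, 0], points[:, 1], c=kmeans_labels, s=10)
axes[0].set_title('K-Means')
axes[1].scatter(points[:, 0], points[:, 1], c=spectral_labels, s=10)
axes[1].set_title('Spectral clustering')
plt.show()
```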

Task

For the given set of 2-D points data, perform spectral clustering. Follow these steps:

  1. Import the SpectralClustering class from sklearn.cluster.
  2. Create a SpectralClustering model with 4 clusters.
  3. Fit the data and predict the labels. Save the predicted labels in the 'prediction' column of data.
  4. Build a scatter plot with the 'x' column on the x-axis and the 'y' column on the y-axis, using a separate color for each value of 'prediction'. Do not forget to display the plot.
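
A minimal sketch of these steps is shown below, assuming data is a pandas DataFrame that already contains the 'x' and 'y' columns described above:

```python
import matplotlib.pyplot as plt
from sklearn.cluster import SpectralClustering

# 1-2. Create a SpectralClustering model with 4 clusters
model = SpectralClustering(n_clusters=4)

# 3. Fit the model and store the predicted labels in the 'prediction' column
data['prediction'] = model.fit_predict(data[['x', 'y']])

# 4. Scatter plot: a separate color for each predicted cluster
for label in sorted(data['prediction'].unique()):
    cluster = data[data['prediction'] == label]
    plt.scatter(cluster['x'], cluster['y'], label=f'Cluster {label}')
plt.xlabel('x')
plt.ylabel('y')
plt.legend()
plt.show()
```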

