Cluster Centers | K-Medoids Algorithm
Cluster Analysis in Python

Cluster Centers

In the previous chapter, we compared the K-Means and K-Medoids algorithms. It was mentioned that, unlike K-Means, K-Medoids uses actual data points as cluster centers (medoids). Let's visualize the difference.
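To see why a medoid is always a data point while a mean usually isn't, a tiny hand computation helps (the three points below are made up for illustration):

```python
import numpy as np

# Three made-up 1-D cluster members (illustrative values)
points = np.array([[1.0], [2.0], [6.0]])

# K-Means center: the mean, generally not one of the points
mean_center = points.mean(axis=0)          # [3.0]

# K-Medoids center: the member minimizing the total distance
# to all other members
dists = np.abs(points - points.T).sum(axis=1)
medoid_center = points[dists.argmin()]     # [2.0]
```

Here the mean, 3.0, is not in the dataset, while the medoid, 2.0, is one of the original points.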

For example, for a training dataset with three clusters, we can visualize the clusters and their centers for both K-Means and K-Medoids. The results are displayed below.

If you look closely at the right chart and compare it with the central one, you will notice that the marked points are actual data points, unlike those on the left chart. These points are marked with green arrows. Let's illustrate the difference with a more obvious example: the result of clustering with the K-Means algorithm, with the cluster centers labeled 10, is displayed below.

Task

For the same dataset of points data, run the K-Medoids algorithm and display the cluster centers. Follow these steps:

  1. Import the KMedoids class from the sklearn_extra.cluster module.
  2. Create a KMedoids model with 2 clusters.
  3. Fit data to the model.
  4. Add a 'prediction' column to data containing the labels predicted by the model.
  5. Use the .cluster_centers_ attribute to extract the array of cluster centers.
  6. Visualize the clusters (already done) and the centers. In the second .scatterplot call, set hue to 10 and s to 150.


Section 2. Chapter 2
