MSE and RMSE | Metrics to Evaluate the Model
Explore the Linear Regression Using Python
MSE and RMSE

How else can you get rid of the minus sign? We can square the differences: take the predictions, subtract the true values from them, and square the result, obtaining the squared deviations. Summing all the squared deviations and dividing by the number of objects gives a value that shows how far, on average, our predictions deviate from the truth, measured in squared units. It is called the mean squared error, or MSE. When the MSE equals zero, all the predictions match the true values perfectly.
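
In formula form, with ŷᵢ denoting the i-th prediction, yᵢ the corresponding true value, and n the number of objects:

MSE = (1/n) · Σᵢ (ŷᵢ − yᵢ)²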

We can calculate it:

MSE = (residuals**2).mean()

Or, as in the previous chapter, use the ready-made function from scikit-learn:

from sklearn.metrics import mean_squared_error
print(mean_squared_error(Y_test, y_test_predicted))

But there is a small snag here. The units of MSE do not match the original units of the predicted target value: we are predicting the amount of phenols squared, which is sometimes difficult to interpret. You can get rid of the square quite simply by taking the square root of MSE. The result is called the root mean squared error (RMSE). It is therefore common to use the MSE loss to train a regression model and the RMSE to evaluate and report its performance.

import math
RMSE = math.sqrt((residuals**2).mean())

The formula:

RMSE = √MSE = √( (1/n) · Σᵢ (ŷᵢ − yᵢ)² )
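
Putting the pieces together, here is a small self-contained sketch. The numbers are made up and merely stand in for Y_test and y_test_predicted from the earlier chapters:

import math
from sklearn.metrics import mean_squared_error

# Toy values standing in for the real test targets and model predictions
Y_test = [3.1, 2.4, 2.9, 3.5]
y_test_predicted = [3.0, 2.6, 2.7, 3.6]

MSE = mean_squared_error(Y_test, y_test_predicted)  # average squared deviation
RMSE = math.sqrt(MSE)                               # back to the original units

print(MSE)   # ≈ 0.025
print(RMSE)  # ≈ 0.158

Newer versions of scikit-learn also include a root_mean_squared_error function in sklearn.metrics that returns RMSE directly; check your installed version before relying on it.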

Ideally, all the metrics we have considered (MSE, RMSE, MAE) tend to zero when all the true values are very close to the predictions. But these metrics have no upper bound beyond which we would call a model useless. Is our prediction good if MSE = 13, or not?

Let’s look at an example of two samples whose target values are on very different scales:

In both cases MSE = 13. For the first table, this is a very large value: our forecasts are off by more than a factor of two. For the second sample, however, it is a fairly good estimate.
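
The original tables are not reproduced here, so the following purely illustrative sketch uses made-up numbers to show how the same MSE can be huge relative to small target values and negligible relative to large ones:

from sklearn.metrics import mean_squared_error

# Hypothetical sample 1: small target values, so MSE = 13 means the
# forecasts are badly wrong relative to the values themselves
y_true_small = [2, 5, 3, 6]
y_pred_small = [6, 1, 7, 4]

# Hypothetical sample 2: large target values, so the same MSE corresponds
# to a relative error well under one percent
y_true_large = [1002, 1005, 1003, 1006]
y_pred_large = [1006, 1001, 1007, 1004]

print(mean_squared_error(y_true_small, y_pred_small))  # 13.0
print(mean_squared_error(y_true_large, y_pred_large))  # 13.0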

Task

Let’s continue calculating metrics for our dataset. Find MSE and RMSE:

  1. [Line #32] Import mean_squared_error from sklearn.metrics for calculating metrics.
  2. [Line #33] Use the function mean_squared_error() to find MSE and assign it to the variable MSE.
  3. [Line #34] Print the variable MSE.
  4. [Line #36] Import the library math.
  5. [Line #37] Find RMSE and assign it to the variable RMSE.
  6. [Line #38] Print the variable RMSE.
