Linear Regression with n Features

n-feature Linear Regression Equation

As we have seen, adding a new feature to the linear regression model is as simple as adding it, together with a new parameter, to the model's equation.
We can add far more than two features this way.
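
Written out, using β₀, β₁, …, βₙ for the parameters and x₁, …, xₙ for the features, the model takes the general form

$$\hat{y} = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \dots + \beta_n x_n$$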

Note

Consider n to be a whole number greater than two.

The only problem is visualization.
With two features we need to build a 3D plot, but with more than two features the plot would have to be more than three-dimensional.
Since we live in a three-dimensional world, we cannot picture such higher-dimensional plots.
Fortunately, visualizing the result is unnecessary. We only need to find the parameters that make the model work, and luckily they are relatively easy to find.
The good old Normal Equation helps us with that.

Normal Equation

Compared to the Normal Equation we used for Simple Linear Regression, only the matrix has changed.
Again, we will not dive into the math behind it; the point is that finding parameters for n-feature Linear Regression uses the same approach as Simple Linear Regression, only at a slightly higher computational cost.
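
In matrix form, the Normal Equation for n features is the standard

$$\beta = (X^T X)^{-1} X^T y$$

where X is the matrix of feature values with a leading column of ones for the intercept, y is the vector of targets, and β collects all n + 1 parameters.

Below is a minimal NumPy sketch of this computation. The feature matrix and target values are made-up example data, and the variable names are illustrative rather than taken from the course code.

```python
import numpy as np

# Made-up example data: 5 samples, n = 3 features
X_features = np.array([
    [1.0, 2.0, 3.0],
    [2.0, 1.0, 0.5],
    [3.0, 4.0, 2.0],
    [4.0, 3.0, 1.5],
    [5.0, 5.0, 4.0],
])
y = np.array([10.0, 7.0, 16.0, 15.0, 24.0])

# Prepend a column of ones so the first parameter acts as the intercept
X = np.column_stack([np.ones(len(X_features)), X_features])

# Normal Equation: beta = (X^T X)^{-1} X^T y
# (solving the linear system instead of inverting the matrix explicitly)
beta = np.linalg.solve(X.T @ X, X.T @ y)

print(beta)      # intercept followed by one weight per feature
print(X @ beta)  # the model's predictions for the training samples
```

The same approach works for any number of features: only the number of columns in X changes, not the formula.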
