
Linear Regression with n Features

n-feature Linear Regression Equation

As we have seen, adding a new feature to a linear regression model is as simple as adding it, together with a new parameter, to the model's equation.
We can add many more than two features this way.

Note

Consider n to be a whole number greater than two.
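For concreteness, with features x₁, …, xₙ and one parameter per feature plus an intercept, the model can be written as follows (the β notation here is just one common convention; the course may use different symbols):

$$\hat{y} = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \dots + \beta_n x_n$$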

The only problem is visualization.
With two features we need a 3D plot, but with more than two features the plot would have more than three dimensions.
Since we live in a three-dimensional world, we cannot picture higher-dimensional plots.
Fortunately, visualizing the result is unnecessary. We only need to find the parameters that make the model work, and luckily they are relatively easy to find.
The good old Normal Equation helps us with it.

Normal Equation

Compared to the Normal Equation we used for Simple Linear Regression, only the X matrix (the matrix of features) has changed.
Again, we will not dive into the math behind it; the point is that finding the parameters for n-feature Linear Regression uses the same approach as for Simple Linear Regression, just a little more computationally expensive.
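In the usual formulation, X gains one column per feature plus a column of ones for the intercept, while the equation itself keeps its familiar form:

$$\vec{\beta} = (X^T X)^{-1} X^T \vec{y}$$

Below is a minimal NumPy sketch of this computation. The variable names and the toy data are assumptions for illustration only, not code from the course:

```python
import numpy as np

# Toy data: 4 samples, 3 features (assumed for illustration)
X_features = np.array([
    [1.0, 2.0, 3.0],
    [2.0, 1.0, 0.5],
    [3.0, 4.0, 2.0],
    [4.0, 3.0, 1.5],
])
y = np.array([10.0, 7.0, 16.0, 15.0])

# Add a column of ones for the intercept term (beta_0)
X = np.column_stack([np.ones(len(X_features)), X_features])

# Normal Equation: beta = (X^T X)^(-1) X^T y
beta = np.linalg.inv(X.T @ X) @ X.T @ y

print(beta)  # [beta_0, beta_1, ..., beta_n]
```

In practice, `np.linalg.lstsq` or `np.linalg.pinv` are safer choices than an explicit inverse when X^T X is ill-conditioned, but the explicit form above mirrors the equation directly.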
