Linear Regression with n Features | Multiple Linear Regression
Linear Regression with n Features

n-feature Linear Regression Equation

As we have seen, adding a new feature to a linear regression model is as simple as adding it, together with a new parameter, to the model's equation.
We can add any number of features that way.
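
In the usual notation (assumed here for illustration), with features x_1, ..., x_n and parameters β_0, ..., β_n, an n-feature model can be written as:

$$\hat{y} = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \dots + \beta_n x_n$$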

Note

Consider n to be a whole number greater than two, i.e., a model with more than two features.

The only remaining problem is visualization.
With two features, we need to build a 3D plot, and with more than two features, the plot would have to be more than three-dimensional.
However, we live in a three-dimensional world and cannot picture higher-dimensional plots.
Fortunately, visualizing the result is unnecessary. We only need to find the parameters that make the model work, and luckily they are relatively easy to find.
The good old Normal Equation helps us with that.

Normal Equation

Compared to the Normal Equation we used for Simple Linear Regression, only the matrix has changed: it now contains a column for each of the n features (plus the column of ones for the intercept).
Again, we will not dive into the maths behind it; the point is that finding the parameters of n-feature Linear Regression uses the same approach as Simple Linear Regression, just a little more computationally expensive.
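
In the standard form (notation assumed here: X̃ is the matrix whose rows are the training samples with a leading 1 for the intercept, and y is the vector of targets), the Normal Equation reads:

$$\vec{\beta} = \left(\tilde{X}^{T}\tilde{X}\right)^{-1}\tilde{X}^{T}\vec{y}$$

As a minimal sketch of how this could be computed with NumPy (the toy data and variable names below are made up for illustration, and `np.linalg.pinv` is used instead of a plain inverse purely for numerical robustness):

```python
import numpy as np

# Toy data: 6 samples with n = 3 features each (values are illustrative)
X = np.array([
    [1.0, 2.0, 0.5],
    [2.0, 0.5, 1.0],
    [3.0, 1.5, 2.0],
    [4.0, 3.0, 1.5],
    [5.0, 2.5, 0.5],
    [6.0, 1.0, 2.5],
])                                                 # shape (m, n)
y = np.array([7.0, 6.0, 11.0, 15.0, 15.0, 16.0])   # shape (m,)

# Prepend a column of ones so the intercept beta_0 is learned as well
X_tilde = np.c_[np.ones(X.shape[0]), X]            # shape (m, n + 1)

# Normal Equation: beta = (X~^T X~)^(-1) X~^T y
beta = np.linalg.pinv(X_tilde.T @ X_tilde) @ X_tilde.T @ y

print("Parameters beta_0 ... beta_n:", beta)

# Predicting for a new sample: y_hat = [1, x_1, ..., x_n] @ beta
x_new = np.array([1.0, 2.0, 1.0, 1.5])             # leading 1 is the intercept term
print("Prediction:", x_new @ beta)
```

The only thing that changes as features are added is the width of `X`; the computation itself stays the same, which is exactly the point made above.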
