Linear Regression for ML
Polynomial Regression
In the previous chapter, we explored quadratic regression, whose graph is a parabola.
In the same way, we can add an x³ term to the equation to get Cubic Regression, which has a more complex graph.
We could also add x⁴ and so on...
The degree of a Polynomial Regression
The equation below is generally called the polynomial equation, and it is the equation of Polynomial Regression.
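Continuing this pattern of adding higher powers of x, the equation presumably takes the form (β denotes the parameters; the symbol is assumed):

y = β₀ + β₁x + β₂x² + β₃x³ + β₄x⁴ + ...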
The highest power of x in the equation defines the degree of a Polynomial Regression. Here is an example:
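As an illustrative example (the coefficients are made up):

y = 9 - 2x + 5x⁴

The highest power of x here is 4, so this is a 4-degree Polynomial Regression.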
n-degree Polynomial Regression
Considering n to be a whole number greater than two, we can write down the equation of an n-degree Polynomial Regression:
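y = β₀ + β₁x + β₂x² + ... + βₙxⁿ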
Normal Equation
And as always, parameters are found using the Normal Equation:
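Written in the X̃ notation used below, with β̃ as the parameter vector (the symbols are assumed here):

β̃ = (X̃ᵀX̃)⁻¹X̃ᵀy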
Polynomial Regression is actually a special case of Multiple Linear Regression.
But instead of several entirely different features (x₁, x₂, ..., xₙ), we use features obtained by raising x to different powers (x, x², ..., xⁿ).
Therefore, in the Normal Equation, the columns of X̃ are the values of x raised to different powers.
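As a minimal sketch of this idea (the data points are made up for illustration), here is how X̃ can be built and the Normal Equation solved with NumPy:

```python
import numpy as np

# Toy 1-D data (made-up values, for illustration only)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 2.2, 5.1, 10.3, 17.2])

n = 2  # degree of the polynomial

# Design matrix X̃: columns are x⁰, x¹, ..., xⁿ
X_tilde = np.vander(x, N=n + 1, increasing=True)

# Normal Equation: β̃ = (X̃ᵀX̃)⁻¹X̃ᵀy
beta = np.linalg.inv(X_tilde.T @ X_tilde) @ X_tilde.T @ y
print(beta)  # [β₀, β₁, ..., βₙ]
```

In practice, np.linalg.lstsq is preferred over an explicit matrix inverse for numerical stability, but the version above mirrors the formula directly.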
Polynomial Regression with multiple features
You can use Polynomial Regression with more than one feature to model even more complex shapes.
But even with two features, a 2-degree Polynomial Regression has quite a long equation:
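For features x₁ and x₂, it includes all products of features up to the second power (the coefficient numbering here is assumed):

y = β₀ + β₁x₁ + β₂x₂ + β₃x₁² + β₄x₂² + β₅x₁x₂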
Using Polynomial Regression with many features drastically increases the number of columns in the dataset.
This causes many problems (more parameters to fit, a higher risk of overfitting), so for wide datasets (those with many features), Polynomial Regression is a poor choice. Other ML models usually handle such datasets better.
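To make this growth concrete, here is a small sketch using scikit-learn's PolynomialFeatures (the 10-feature input is arbitrary) that counts how many columns the polynomial expansion produces at different degrees:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

# One sample with 10 features (values are arbitrary)
X = np.random.rand(1, 10)

for degree in (2, 3, 4):
    poly = PolynomialFeatures(degree=degree)
    n_out = poly.fit_transform(X).shape[1]
    print(f"degree {degree}: {n_out} features")

# degree 2:   66 features
# degree 3:  286 features
# degree 4: 1001 features
```

Ten original features already become a thousand at degree 4, which illustrates why wide datasets and Polynomial Regression combine poorly.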