PolynomialFeatures | Polynomial Regression
Linear Regression for ML
PolynomialFeatures

As discussed earlier, Polynomial Regression is a special case of Multiple Linear Regression.
We can therefore reuse the LinearRegression class, since it is also solved using the Normal Equation.
However, it now requires some preprocessing.

Under the hood, the LinearRegression class solves the Normal Equation.
Let's look at it once again; more specifically, pay attention to the matrix of features.
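As a sketch, the Normal Equation for Polynomial Regression can be written as follows, where each row of the design matrix X̃ holds the powers of one observation's feature value (the leading column of 1s is what LinearRegression adds by default):

```latex
\hat{\beta} = \left(\tilde{X}^{\top}\tilde{X}\right)^{-1}\tilde{X}^{\top} y,
\qquad
\tilde{X} =
\begin{bmatrix}
1 & x_1 & x_1^2 & \cdots & x_1^n \\
1 & x_2 & x_2^2 & \cdots & x_2^n \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
1 & x_m & x_m^2 & \cdots & x_m^n
\end{bmatrix}
```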

The LinearRegression class adds the column of 1s by default.
Also, we have the X column, which is our feature column from a dataset.
Adding the features X², ..., Xⁿ is not handled by default, so that's what we need to take care of.
One way is to manually add X**2, ..., X**n columns to our df as features.
However, sklearn provides a PolynomialFeatures class that does it for you.

PolynomialFeatures usage

PolynomialFeatures is a scikit-learn transformer, so you can transform X using its .fit_transform() method.

Note

As mentioned earlier, the LinearRegression class automatically adds the column of 1s.
By default, PolynomialFeatures does so too (controlled by the include_bias argument).
Therefore, we must set include_bias=False when using PolynomialFeatures before LinearRegression.

Since PolynomialFeatures is a class, we first initialize it (specifying the degree and include_bias parameters) and then call its .fit_transform() method.
With all that said, here is the code to apply PolynomialFeatures:
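A minimal sketch of this preprocessing; the array X here is a hypothetical single-feature array (in practice it would come from your DataFrame), and reshape(-1, 1) is needed because scikit-learn expects 2-D input:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

# Hypothetical single-feature data; reshape into a column (2-D array)
X = np.array([1.0, 2.0, 3.0, 4.0]).reshape(-1, 1)

# degree=2 adds an X^2 column; include_bias=False because
# LinearRegression adds the column of 1s on its own
poly = PolynomialFeatures(degree=2, include_bias=False)
X_poly = poly.fit_transform(X)

print(X_poly)  # each row is [x, x^2]
```

The transformed X_poly can then be passed directly to LinearRegression's .fit() method.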

The code shown above builds the features for a Polynomial Regression of degree 2. We will build such a regression in the next chapter.

PolynomialFeatures for Multiple Features

You can also use the PolynomialFeatures class with multiple features. Just pass an X with multiple columns to the .fit_transform() method.
But it works in a slightly unexpected way. For example, say we have 2 features, X₁ and X₂.
Applying PolynomialFeatures of degree 2 will add not only the X₁² and X₂² features but also the interaction term X₁·X₂.
And for 3 features, applying PolynomialFeatures with degree 3 yields many new features:

The number of generated features grows very quickly: for n input features and degree d, PolynomialFeatures produces C(n+d, d) − 1 columns (without the bias), which is polynomial in the number of features for a fixed degree but blows up combinatorially as either grows.
Thus, Polynomial Regression is rarely used when working with wide datasets (ones with many features).

That's a lot of information!
Let's sum it all up before actually building a Polynomial Regression!
