PolynomialFeatures
As discussed earlier, Polynomial Regression is a special case of Multiple Linear Regression.
This means we can reuse the LinearRegression class, since Polynomial Regression is also solved with the Normal Equation.
It just needs some preprocessing first.
Under the hood, the LinearRegression class computes the Normal Equation.
Let's look at it once again. More specifically, pay attention to the X̃ matrix.
The LinearRegression class adds the column of 1s by default.
We also have the X column, which is our feature column from the dataset.
Adding the X², ..., Xⁿ features, however, is not handled automatically, so that is what we need to take care of.
One way is to manually add X**2, ..., X**n to our df as new feature columns.
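As a sketch of this manual approach (assuming a hypothetical DataFrame df with a single feature column named 'X'; the values below are made up for illustration), it could look like this:

```python
import pandas as pd

# Hypothetical single-feature DataFrame (made-up values)
df = pd.DataFrame({'X': [1.0, 2.0, 3.0, 4.0]})

# Manually add polynomial feature columns up to degree 3
for n in range(2, 4):
    df[f'X^{n}'] = df['X'] ** n

print(df.columns.tolist())  # ['X', 'X^2', 'X^3']
```

This works, but it gets tedious for higher degrees and multiple features, which is exactly what PolynomialFeatures automates.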
However, sklearn provides a PolynomialFeatures class that does it for you.
PolynomialFeatures usage
PolynomialFeatures is a scikit-learn transformer, so you can transform X using its .fit_transform() method.
Note
As mentioned earlier, the LinearRegression class automatically adds the column of 1s.
By default, PolynomialFeatures does it too (this is controlled by the include_bias argument).
Therefore, we must set include_bias=False when using PolynomialFeatures before LinearRegression; otherwise, the 1s column would be added twice.
Since PolynomialFeatures is a class, we should first initialize it (specifying the degree and include_bias parameters) and then use the .fit_transform() method.
With all that said, here is the code to apply PolynomialFeatures:

from sklearn.preprocessing import PolynomialFeatures

X_poly = PolynomialFeatures(degree=2, include_bias=False).fit_transform(X)
The code above produces the features needed for a Polynomial Regression of degree 2. We will build such a regression in the next chapter.
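As a minimal end-to-end sketch (using made-up quadratic data, with no claim about the next chapter's dataset), the transformed X_poly can be fed straight into LinearRegression:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Made-up data for illustration: y = 1 + 2x + 3x^2 (no noise)
X = np.linspace(-3, 3, 50).reshape(-1, 1)
y = 1 + 2 * X.ravel() + 3 * X.ravel() ** 2

# Add the x^2 column; LinearRegression adds the 1s column itself
X_poly = PolynomialFeatures(degree=2, include_bias=False).fit_transform(X)

model = LinearRegression().fit(X_poly, y)
print(X_poly.shape)      # (50, 2): the x and x^2 columns
print(model.coef_)       # recovers [2, 3]
print(model.intercept_)  # recovers 1
```

Since the data is exactly quadratic, the Normal Equation recovers the true coefficients; on real, noisy data the fit would only approximate them.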
PolynomialFeatures for Multiple Features
You can also use the PolynomialFeatures class with multiple features: just pass an X with multiple columns to the .fit_transform() method.
It works in a slightly unexpected way, though. For example, say we have 2 features, X₁ and X₂.
Applying PolynomialFeatures of degree 2 will add not only the X₁² and X₂² features but also the interaction term X₁·X₂.
And for 3 features, applying PolynomialFeatures with degree 3 yields many new features: all powers of each feature up to degree 3, plus every interaction term of total degree at most 3.
The number of generated features grows very quickly with both the number of input features and the degree.
Thus, Polynomial Regression is rarely used when working with wide datasets (ones with many features).
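The growth can be made precise: with n input features and degree d, PolynomialFeatures with include_bias=False outputs C(n+d, d) − 1 columns (a standard combinatorial count). The sketch below checks this against the transformer for a few sizes:

```python
from math import comb

import numpy as np
from sklearn.preprocessing import PolynomialFeatures

def n_poly_features(n_features: int, degree: int) -> int:
    """Columns produced by PolynomialFeatures with include_bias=False."""
    return comb(n_features + degree, degree) - 1

for n, d in [(2, 2), (3, 3), (10, 3)]:
    X = np.ones((1, n))  # dummy data; only the number of columns matters
    width = PolynomialFeatures(degree=d, include_bias=False).fit_transform(X).shape[1]
    print(n, d, width)   # 5, 19, and 285 columns respectively
    assert width == n_poly_features(n, d)
```

Going from 3 features at degree 3 (19 columns) to 10 features at degree 3 (285 columns) shows why wide datasets quickly become impractical.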
That's a lot of information!
Let's sum it all up before actually building a Polynomial Regression!