Learn Polynomial Regression | Polynomial Regression
Linear Regression for ML

Polynomial Regression

In the previous chapter, we explored quadratic regression, whose graph is a parabola.
In the same way, we could add an x³ term to the equation to get Cubic Regression, which has a more complex graph.
We could also add x⁴ and so on...
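As an illustrative sketch (my own example, not from the course), fitting the same data with increasing degree shows what adding a term buys you. NumPy's `polyfit` fits a polynomial of a given degree by least squares:

```python
import numpy as np

# Synthetic data generated from a cubic curve plus a little noise
rng = np.random.default_rng(42)
x = np.linspace(-2, 2, 60)
y = 1 - x + 0.5 * x**3 + rng.normal(0, 0.1, x.size)

# Degree 2 (quadratic) vs degree 3 (cubic): the cubic adds the x³ term
residuals = {}
for degree in (2, 3):
    coeffs = np.polyfit(x, y, degree)
    residuals[degree] = np.sum((np.polyval(coeffs, x) - y) ** 2)
    print(degree, residuals[degree])
```

For data with a genuine cubic component, the degree-3 fit leaves a much smaller residual than the degree-2 one.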

The degree of a Polynomial Regression

This general form is called a polynomial equation, and it is the equation of Polynomial Regression.
The highest power of x in the equation defines the degree of the Polynomial Regression. Here is an example:

n-degree Polynomial Regression

Taking n to be a whole number greater than two, we can write down the equation of an n-degree Polynomial Regression:

ŷ = β₀ + β₁x + β₂x² + ... + βₙxⁿ
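As a quick sketch (my own example, not part of the course), NumPy's `vander` builds exactly this set of terms, one column per power of x:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
n = 3  # degree of the polynomial

# Columns are x⁰, x¹, x², x³ — the terms of an n-degree polynomial
X = np.vander(x, n + 1, increasing=True)
print(X)
# [[ 1.  1.  1.  1.]
#  [ 1.  2.  4.  8.]
#  [ 1.  3.  9. 27.]]
```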

Normal Equation

And as always, the parameters are found using the Normal Equation:

β = (XᵀX)⁻¹Xᵀy

Polynomial Regression is actually a special case of Multiple Linear Regression.
But instead of several entirely different features (x₁, x₂, ..., xₙ), we use features obtained by raising x to different powers (x, x², ..., xⁿ).
Therefore, in the Normal Equation, the columns of the design matrix X are x raised to different powers.
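A minimal NumPy sketch of this idea (variable names and data are my own, not from the course): build the design matrix from powers of x, then solve the Normal Equation directly.

```python
import numpy as np

# Data generated from y = 2 + 3x - 4x² plus small noise
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 40)
y = 2 + 3 * x - 4 * x**2 + rng.normal(0, 0.05, x.size)

# Design matrix whose columns are powers of x: [1, x, x²]
X = np.vander(x, 3, increasing=True)

# Normal Equation: β = (XᵀX)⁻¹ Xᵀ y
beta = np.linalg.inv(X.T @ X) @ X.T @ y
print(beta)  # close to [2, 3, -4]
```

In practice `np.linalg.lstsq` (or a pseudoinverse) is preferred over an explicit matrix inverse for numerical stability, but the formula above is the Normal Equation as stated.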

Polynomial Regression with multiple features

You can use Polynomial Regression with more than one feature to create even more complex shapes.
But even with two features, a 2-degree Polynomial Regression has quite a long equation:

ŷ = β₀ + β₁x₁ + β₂x₂ + β₃x₁² + β₄x₁x₂ + β₅x₂²

Using Polynomial Regression with many features dramatically increases the number of columns in the dataset.
This causes a number of problems, so for wide datasets (those with many features), Polynomial Regression is a poor choice; other ML models usually handle such datasets better.
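To make the growth concrete (a sketch of my own, assuming all terms up to the given degree including interaction terms and the bias), the number of terms in a degree-d polynomial over m features is the binomial coefficient C(m + d, d):

```python
from math import comb

def n_poly_terms(m: int, d: int) -> int:
    """Number of terms in a degree-d polynomial over m features,
    counting the bias term and all interaction terms."""
    return comb(m + d, d)

for m in (2, 10, 100):
    print(m, n_poly_terms(m, 2))  # 6, 66, 5151
```

With just degree 2, the two-feature case has 6 terms (matching the equation above), but 100 features already produce 5151 terms, which is why wide datasets blow up.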



Section 3. Chapter 2

