Supervised Learning Essentials

Polynomial Regression


In the previous chapter, we explored quadratic regression, whose graph is a parabola. In the same way, we could add an x³ term to the equation to get Cubic Regression, which has a more complex graph. We could also add x⁴, and so on.

The Degree of a Polynomial Regression

In general, an equation of this form is called a polynomial equation, and it is the equation of Polynomial Regression. The highest power of x in the equation defines the degree of the Polynomial Regression.

N-Degree Polynomial Regression

Considering n to be a whole number greater than two, we can write down the equation of an n-degree Polynomial Regression.

$$y_{\text{pred}} = \beta_0 + \beta_1 x + \beta_2 x^2 + \dots + \beta_n x^n$$

Where:

  • $\beta_0, \beta_1, \beta_2, \dots, \beta_n$ – are the model's parameters;
  • $y_{\text{pred}}$ – is the prediction of a target;
  • $x$ – is the feature value;
  • $n$ – is the Polynomial Regression's degree.
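As a small illustration (the coefficient values here are hypothetical, not from the text), the prediction for a single feature value can be computed by summing the powers of $x$ weighted by the parameters:

```python
import numpy as np

# Hypothetical parameters beta_0..beta_3 for a degree-3 polynomial
beta = np.array([1.0, 2.0, -0.5, 0.25])

def predict(x, beta):
    """Evaluate y_pred = beta_0 + beta_1*x + ... + beta_n*x^n."""
    powers = x ** np.arange(len(beta))  # [1, x, x^2, ..., x^n]
    return powers @ beta

print(predict(2.0, beta))  # 1 + 2*2 - 0.5*4 + 0.25*8 = 5.0
```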

Normal Equation

As always, the parameters are found using the Normal Equation:

$$\vec{\beta} = \begin{pmatrix} \beta_0 \\ \beta_1 \\ \vdots \\ \beta_n \end{pmatrix} = (\tilde{X}^T \tilde{X})^{-1} \tilde{X}^T y_{\text{true}}$$

Where:

  • $\beta_0, \beta_1, \dots, \beta_n$ – are the model's parameters;
  • $\tilde{X} = \begin{pmatrix} 1 & X & X^2 & \dots & X^n \end{pmatrix}$ – is the matrix whose columns are the element-wise powers of $X$;
  • $X$ – is an array of feature values from the training set;
  • $X^k$ – is the element-wise power of $k$ of the $X$ array;
  • $y_{\text{true}}$ – is an array of target values from the training set.
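A minimal NumPy sketch of this fit (the function name `fit_polynomial` is my own, not from the text): `np.vander` builds the $\tilde{X}$ matrix of element-wise powers, and solving the normal equations directly mirrors the formula above. In practice, `np.linalg.lstsq` is numerically safer than the explicit inverse.

```python
import numpy as np

def fit_polynomial(X, y_true, degree):
    """Solve beta = (X~^T X~)^-1 X~^T y_true for an n-degree polynomial."""
    # Columns of X_tilde are 1, X, X^2, ..., X^degree (element-wise powers)
    X_tilde = np.vander(X, degree + 1, increasing=True)
    # Equivalent to the Normal Equation, without forming the inverse explicitly
    return np.linalg.solve(X_tilde.T @ X_tilde, X_tilde.T @ y_true)

# Example: recover a known quadratic y = 1 + 2x + 3x^2 from exact data
X = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = 1 + 2 * X + 3 * X**2
print(fit_polynomial(X, y, degree=2))  # close to [1. 2. 3.]
```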

Polynomial Regression with Multiple Features

To create even more complex shapes, you can use Polynomial Regression with more than one feature. But even with two features, a 2-degree Polynomial Regression has quite a long equation.
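For illustration (written in the standard form, with coefficient indices of my own choosing), the 2-degree equation with two features $x_1$ and $x_2$ includes squared terms and an interaction term:

$$y_{\text{pred}} = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \beta_3 x_1^2 + \beta_4 x_1 x_2 + \beta_5 x_2^2$$

With more features or a higher degree, the number of terms grows quickly, which is why such models become hard to interpret.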

Most of the time, you won't need such a complex model. Simpler models (like Multiple Linear Regression) usually describe the data well enough, and they are much easier to interpret and visualize, as well as less computationally expensive.
