Linear Regression with N Features
N-Feature Linear Regression Equation
As we have seen, adding a new feature to the linear regression model is as easy as adding it, along with a new parameter, to the model's equation. We can add many more than two features this way.

Let n be a whole number greater than two. The model's equation with n features is:

ypred = β0 + β1x1 + β2x2 + … + βnxn

Where:
- β0, β1, β2, …, βn – are the model's parameters;
- ypred – is the prediction of a target;
- x1 – is the first feature value;
- x2 – is the second feature value;
- …
- xn – is the n-th feature value.
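The equation above is just a dot product of the parameters with the feature values, plus the intercept β0. A minimal sketch with made-up numbers (the parameter and feature values below are purely illustrative):

```python
import numpy as np

# Hypothetical parameter values for a model with n = 3 features.
beta = np.array([2.0, 0.5, -1.0, 3.0])   # beta_0, beta_1, beta_2, beta_3
x = np.array([4.0, 2.0, 1.0])            # x_1, x_2, x_3 for one sample

# y_pred = beta_0 + beta_1*x_1 + beta_2*x_2 + beta_3*x_3
y_pred = beta[0] + beta[1:] @ x
print(y_pred)  # 2.0 + 0.5*4.0 - 1.0*2.0 + 3.0*1.0 = 5.0
```

The same line of code works for any n: the dot product `beta[1:] @ x` sums over however many features the sample has.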
Normal Equation
The only problem is visualization. With two features, we need a 3D plot: two axes for the features and one for the target. With more than two features, the plot would need more than three dimensions, and since we live in a three-dimensional world, we cannot picture such plots. However, visualizing the result is not necessary. We only need to find the parameters for the model to work, and luckily that is relatively easy. The good old Normal Equation will help us:
β = (β0, β1, …, βn)ᵀ = (X̃ᵀX̃)⁻¹X̃ᵀytrue

Where:
- β0,β1,…,βn – are the model's parameters;
- X̃ – is a matrix containing 1s as its first column and X1 through Xn as the remaining columns, where:
- Xk – is an array of k-th feature values from the training set;
- ytrue – is an array of target values from the training set.
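The Normal Equation translates directly into a few lines of NumPy. A minimal sketch on a tiny made-up training set (the feature values and the "true" parameters used to generate ytrue are illustrative only):

```python
import numpy as np

# Made-up training set: 5 samples, 3 features.
X = np.array([[1.0, 2.0, 0.5],
              [2.0, 1.0, 1.5],
              [3.0, 0.0, 2.0],
              [4.0, 3.0, 0.0],
              [5.0, 1.0, 1.0]])

# Generate targets from known parameters so we can check the recovery:
# y = 1 + 2*x1 - 1*x2 + 0.5*x3
beta_true = np.array([1.0, 2.0, -1.0, 0.5])
y_true = beta_true[0] + X @ beta_true[1:]

# Build X-tilde by prepending a column of 1s.
X_tilde = np.column_stack([np.ones(len(X)), X])

# Normal Equation: beta = (X̃ᵀX̃)⁻¹ X̃ᵀ y_true
beta = np.linalg.inv(X_tilde.T @ X_tilde) @ X_tilde.T @ y_true
print(beta)  # recovers [1. 2. -1. 0.5]
```

In practice, `np.linalg.solve(X_tilde.T @ X_tilde, X_tilde.T @ y_true)` (or a least-squares routine like `np.linalg.lstsq`) is preferred over explicitly inverting the matrix, for numerical stability; the explicit inverse is used here only to mirror the formula.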
X̃ Matrix
Notice that only the X̃ matrix has changed. You can think of each column of this matrix as being responsible for its own β parameter.
The first column of 1s is needed to find the β₀ parameter.
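To see why the column of 1s matters, here is a small sketch with made-up one-feature data: fitting with the column of 1s recovers the intercept, while dropping it forces the fitted line through the origin and distorts the slope.

```python
import numpy as np

# Made-up data with a clear nonzero intercept: y = 3 + 2*x.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 3.0 + 2.0 * x

# With the column of 1s, the Normal Equation recovers both beta_0 and beta_1.
X_tilde = np.column_stack([np.ones_like(x), x])
beta = np.linalg.inv(X_tilde.T @ X_tilde) @ X_tilde.T @ y
print(beta)  # [3. 2.]

# Without the 1s column, the model has no intercept term: the fitted line
# must pass through the origin, so the slope absorbs part of the offset.
X_no_ones = x.reshape(-1, 1)
beta_no_intercept = np.linalg.inv(X_no_ones.T @ X_no_ones) @ X_no_ones.T @ y
print(beta_no_intercept)  # [3.] instead of the true slope 2
```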