Introduction to Matrix Decomposition
Solving systems like Ax=b can be computationally intensive, especially for large systems.
Matrix decomposition simplifies this process by breaking matrix A into simpler parts — which we can then solve in stages.
LU vs QR – When and Why
We decompose the matrix A into other structured matrices.
LU Decomposition
Break A into a lower triangular matrix L and an upper triangular matrix U:
- Built using Gaussian elimination
- Works best for square matrices
QR Decomposition
Break A into an orthogonal matrix Q and an upper triangular matrix R:
- Often used for non-square matrices
- Ideal for least squares problems or when LU fails
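In practice these factorizations come ready-made in numerical libraries. A minimal sketch using NumPy's built-in QR routine to solve a small system (the matrix and right-hand side here are illustrative, not from a particular application):

```python
import numpy as np

# Illustrative 2x2 system Ax = b
A = np.array([[4.0, 3.0],
              [6.0, 3.0]])
b = np.array([10.0, 12.0])

# Householder-based QR factorization from NumPy
Q, R = np.linalg.qr(A)
assert np.allclose(A, Q @ R)

# Since Q is orthogonal, Ax = b becomes Rx = Q^T b,
# which is a triangular system
x = np.linalg.solve(R, Q.T @ b)
print(x)  # ≈ [1. 2.]
```

SciPy's `scipy.linalg.lu` provides the analogous LU factorization (with row pivoting).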
LU Decomposition – Setup
Start with a square matrix:
$$A = \begin{bmatrix} 4 & 3 \\ 6 & 3 \end{bmatrix}$$
Our goal is to write this as:
$$A = LU$$
where:
$$L = \begin{bmatrix} 1 & 0 \\ l_{21} & 1 \end{bmatrix}, \quad U = \begin{bmatrix} u_{11} & u_{12} \\ 0 & u_{22} \end{bmatrix}$$
This decomposition exists without row swaps when A is square and all of its leading principal minors are nonzero; with row pivoting, any invertible A can be factored as PA = LU.
Important Points:
- Lower triangular matrices have all zero entries above the diagonal, simplifying forward substitution;
- Upper triangular matrices have zeros below the diagonal, making backward substitution straightforward;
- An orthogonal matrix has columns that are orthonormal vectors (vectors of length 1 that are perpendicular);
- This property preserves vector length and angles, which is useful in solving least squares and improving numerical stability.
Gaussian Elimination
Apply Gaussian elimination to eliminate the entry below the top-left pivot:
$$R_2 \to R_2 - \tfrac{6}{4} R_1$$
This gives us:
$$R_2' = \begin{bmatrix} 0 & -1.5 \end{bmatrix}$$
So the updated matrices become:
$$U = \begin{bmatrix} 4 & 3 \\ 0 & -1.5 \end{bmatrix}$$
And from our row operation, we know:
$$L = \begin{bmatrix} 1 & 0 \\ 1.5 & 1 \end{bmatrix}$$
Important Points:
- Gaussian elimination systematically eliminates entries below the pivot element in each column by subtracting scaled versions of the pivot row from the rows beneath;
- This process transforms A into an upper triangular matrix U.
- The multipliers used to eliminate these entries are stored in L, allowing us to represent A as the product LU.
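The elimination procedure above can be sketched as a small function (a sketch without pivoting; it assumes every pivot is nonzero):

```python
import numpy as np

def lu_no_pivot(A):
    """LU decomposition by Gaussian elimination, no pivoting.
    A sketch: assumes all pivots are nonzero."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float)
    for k in range(n - 1):
        for i in range(k + 1, n):
            m = U[i, k] / U[k, k]      # multiplier for this elimination
            L[i, k] = m                # store it in L
            U[i, k:] -= m * U[k, k:]   # subtract scaled pivot row
    return L, U

A = np.array([[4.0, 3.0],
              [6.0, 3.0]])
L, U = lu_no_pivot(A)
print(L)  # [[1.  0. ], [1.5 1. ]]
print(U)  # [[ 4.   3. ], [ 0.  -1.5]]
```

On the worked example this reproduces the multiplier 6/4 = 1.5 stored in L and the row [0, -1.5] in U.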
Final Result – LU Decomposition Complete
We verify:
$$A = LU = \begin{bmatrix} 1 & 0 \\ 1.5 & 1 \end{bmatrix} \begin{bmatrix} 4 & 3 \\ 0 & -1.5 \end{bmatrix} = \begin{bmatrix} 4 & 3 \\ 6 & 3 \end{bmatrix}$$
Now the system Ax = b can be solved in two steps:
- Solve Ly=b by forward substitution;
- Solve Ux=y by backward substitution.
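These two substitution steps can be sketched directly (the right-hand side b = [10, 12] is an illustrative choice, not from the source):

```python
import numpy as np

def forward_sub(L, b):
    """Solve Ly = b for lower-triangular L, top row first."""
    n = len(b)
    y = np.zeros(n)
    for i in range(n):
        y[i] = (b[i] - L[i, :i] @ y[:i]) / L[i, i]
    return y

def backward_sub(U, y):
    """Solve Ux = y for upper-triangular U, bottom row first."""
    n = len(y)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

# The factors computed above for A = [[4, 3], [6, 3]]
L = np.array([[1.0, 0.0], [1.5, 1.0]])
U = np.array([[4.0, 3.0], [0.0, -1.5]])
b = np.array([10.0, 12.0])

y = forward_sub(L, b)   # step 1: Ly = b
x = backward_sub(U, y)  # step 2: Ux = y
print(x)  # ≈ [1. 2.], the solution of Ax = b
```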
QR Decomposition – Concept Overview
We want to express a matrix A as a product of two matrices:
$$A = QR$$
where:
- A is your input matrix (e.g. data, coefficients, etc.);
- Q is an orthogonal matrix (its columns are orthonormal vectors);
- R is an upper triangular matrix.
An example shape breakdown:
$$A = \begin{bmatrix} a_1 & a_2 \\ a_3 & a_4 \end{bmatrix} = \begin{bmatrix} q_1 & q_2 \\ q_3 & q_4 \end{bmatrix} \begin{bmatrix} r_{11} & r_{12} \\ 0 & r_{22} \end{bmatrix}$$
This decomposition is often used when:
- Matrix A is not square;
- Solving least squares problems;
- LU decomposition isn't stable.
What Are Orthonormal Vectors?
Orthogonal vectors
Two vectors u,v are orthogonal if their dot product is zero: u⋅v=0
Normalized vector
A vector u is normalized when ∣u∣=1
Orthonormal set
A set of vectors {q1, q2, ..., qk} is orthonormal if each has unit length and they are mutually orthogonal:
$$q_i \cdot q_j = \begin{cases} 1 & \text{if } i = j, \\ 0 & \text{if } i \neq j. \end{cases}$$
Why it matters: orthonormal columns in Q preserve geometry, simplify projections, and improve numerical stability.
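These properties are easy to check numerically. A quick NumPy sanity check (the matrix and test vector are illustrative):

```python
import numpy as np

# Take Q from any QR factorization; its columns should be orthonormal
A = np.array([[4.0, 3.0],
              [6.0, 3.0]])
Q, _ = np.linalg.qr(A)

# Orthonormal columns mean Q^T Q is the identity
print(np.allclose(Q.T @ Q, np.eye(2)))  # True

# Multiplying by Q preserves vector length: |Q v| = |v|
v = np.array([3.0, 4.0])
print(np.isclose(np.linalg.norm(Q @ v), 5.0))  # True, since |v| = 5
```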
QR Decomposition - Setup
Define the Matrix A
Let's start with this example:
$$A = \begin{bmatrix} 4 & 3 \\ 6 & 3 \end{bmatrix}$$
We will use the Gram-Schmidt process to find matrices Q and R such that A = QR. The Gram-Schmidt process creates an orthonormal set of vectors from the columns of A.
This means the vectors in Q are all perpendicular (orthogonal) to each other and have unit length (normalized). This property simplifies many calculations and improves numerical stability when solving systems.
So, here the goal is to:
- Make the columns of Q orthonormal;
- Create the matrix R which will encode the projections.
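The whole procedure can be sketched up front as a short function (classical Gram-Schmidt; a sketch that assumes A has full column rank):

```python
import numpy as np

def gram_schmidt_qr(A):
    """QR via classical Gram-Schmidt.
    Sketch only: assumes A has linearly independent columns."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].astype(float)
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]  # projection coefficient onto q_i
            v -= R[i, j] * Q[:, i]       # remove that component
        R[j, j] = np.linalg.norm(v)      # length of what remains
        Q[:, j] = v / R[j, j]            # normalize to get q_j
    return Q, R

A = np.array([[4.0, 3.0],
              [6.0, 3.0]])
Q, R = gram_schmidt_qr(A)
print(np.allclose(A, Q @ R))  # True
```

The rest of this section walks through exactly these steps by hand for the 2x2 example.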
Compute First Basis Vector
We extract the first column of A:
$$a_1 = \begin{bmatrix} 4 \\ 6 \end{bmatrix}$$
To normalize this, we compute the norm:
$$\|a_1\| = \sqrt{4^2 + 6^2} = \sqrt{16 + 36} = \sqrt{52}$$
Then:
$$q_1 = \frac{1}{\sqrt{52}} \begin{bmatrix} 4 \\ 6 \end{bmatrix} = \begin{bmatrix} 4/\sqrt{52} \\ 6/\sqrt{52} \end{bmatrix}$$
This is the first orthonormal vector for Q.
How to Normalize a Vector
Given a vector:
$$v = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix}$$
We compute its norm:
$$\|v\| = \sqrt{v_1^2 + v_2^2 + \dots + v_n^2}$$
Then normalize:
$$\hat{v} = \frac{1}{\|v\|} v$$
Example:
$$v = \begin{bmatrix} 3 \\ 4 \end{bmatrix}, \quad \|v\| = \sqrt{3^2 + 4^2} = 5$$
So, our normalized vector is:
$$\hat{v} = \frac{1}{5} \begin{bmatrix} 3 \\ 4 \end{bmatrix} = \begin{bmatrix} 0.6 \\ 0.8 \end{bmatrix}$$
Once we know how to normalize and orthogonalize vectors, we can apply the Gram-Schmidt process to form the Q matrix and use it to compute R in the QR decomposition.
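The normalization step above is a one-liner in NumPy:

```python
import numpy as np

def normalize(v):
    """Return v scaled to unit length (assumes v is nonzero)."""
    return v / np.linalg.norm(v)

v = np.array([3.0, 4.0])
print(normalize(v))  # [0.6 0.8]
```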
Compute q₂ Using Gram-Schmidt
To compute q2, we start with the second column of A:
$$a_2 = \begin{bmatrix} 3 \\ 3 \end{bmatrix}$$
Next, project a2 onto q1:
$$r_{12} = q_1^T a_2 = \frac{1}{\sqrt{52}}(4 \cdot 3 + 6 \cdot 3) = \frac{30}{\sqrt{52}}$$
Remove the projection from a2:
$$u_2 = a_2 - r_{12} q_1$$
Then normalize (as shown above):
$$q_2 = \frac{u_2}{\|u_2\|}$$
Now q1 and q2 form an orthonormal basis for Q. Assemble the final result:
$$Q = \begin{bmatrix} q_1 & q_2 \end{bmatrix}, \quad R = \begin{bmatrix} r_{11} & r_{12} \\ 0 & r_{22} \end{bmatrix}$$
where r11 = ‖a1‖ = √52 and r22 = ‖u2‖. These satisfy:
$$A = QR$$
Quiz
1. What is the shape of matrix R in QR decomposition for a 3×2 matrix A?
2. Which of the following correctly defines an orthonormal column vector?
3. If u⋅v=0 and ∣u∣=1, what identity holds true?
4. What is the first step in the Gram-Schmidt process for QR decomposition?
5. In QR decomposition using Gram-Schmidt, what does the projection of a2 onto q1 represent?
6. Let A be a 3×2 matrix. Which of the following is correct after QR decomposition?