Fiveable

🧚🏽‍♀️Abstract Linear Algebra I Unit 9 Review


9.2 QR Decomposition


Written by the Fiveable Content Team • Last updated August 2025

QR decomposition is a powerful tool that breaks down matrices into simpler parts. It's like taking apart a complex machine to understand how it works. This technique is closely tied to the Gram-Schmidt process, which helps create a set of perpendicular vectors.

By using QR decomposition, we can solve tricky math problems more easily. It's especially handy for dealing with systems of equations and finding the best-fit solutions. Think of it as a Swiss Army knife for tackling various linear algebra challenges.

QR Decomposition

Concept and Relation to Gram-Schmidt Process

  • QR decomposition factorizes a matrix A into an orthogonal matrix Q and an upper triangular matrix R, such that A = QR
  • The Gram-Schmidt process constructs an orthonormal basis from a set of linearly independent vectors
  • QR decomposition can be computed using the Gram-Schmidt process by:
    1. Orthonormalizing the columns of the matrix A to form the matrix Q
    2. Calculating the upper triangular matrix R using the orthonormal basis
  • The columns of Q form an orthonormal basis for the column space of A
  • The matrix R represents the coordinates of the columns of A with respect to the orthonormal basis formed by the columns of Q
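The two facts above can be checked numerically. A minimal NumPy sketch (the matrix entries here are just an illustration, not taken from the text):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

Q, R = np.linalg.qr(A)  # reduced QR: Q is 3x2, R is 2x2 upper triangular

# The columns of Q are orthonormal: Q^T Q = I
print(np.allclose(Q.T @ Q, np.eye(2)))  # True

# R holds the coordinates of A's columns in the basis Q, so A = QR
print(np.allclose(A, Q @ R))            # True

# R is upper triangular
print(np.allclose(np.triu(R), R))       # True
```

NumPy's `qr` uses Householder reflections rather than Gram-Schmidt, but the factors satisfy the same defining properties.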

Applications

  • QR decomposition is useful in various applications:
    • Solving linear systems
    • Least-squares problems
    • Eigenvalue computations
  • QR decomposition provides a numerically stable method for solving these problems
  • It avoids the ill-conditioning that can arise in other approaches, such as forming the normal equations explicitly

QR Decomposition with Gram-Schmidt

Performing QR Decomposition

  • To perform QR decomposition using the Gram-Schmidt process:
    1. Initialize the first column of Q as the normalized first column of A
    2. For each subsequent column of A:
      • Subtract its projection onto the previous orthonormal vectors from itself
      • Normalize the resulting vector to obtain the corresponding column of Q
    3. Calculate the entries of the upper triangular matrix R as the dot products of the original columns of A with the corresponding orthonormal vectors in Q
  • The Gram-Schmidt process ensures that the columns of Q are orthonormal (orthogonal to each other and have unit length)
  • The resulting matrices Q and R satisfy the equation A = QR, where Q is an orthogonal matrix and R is an upper triangular matrix
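The steps above can be sketched as a short classical Gram-Schmidt routine. This is a minimal illustration (it assumes A has full column rank), not production code:

```python
import numpy as np

def gram_schmidt_qr(A):
    """Classical Gram-Schmidt QR; assumes A has full column rank."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].astype(float).copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # coordinate of column j along q_i
            v -= R[i, j] * Q[:, i]        # subtract the projection
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]             # normalize to get the next column of Q
    return Q, R

# The matrix from the example below: Gram-Schmidt gives Q = I and R = A
A = np.array([[1.0, 1.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
Q, R = gram_schmidt_qr(A)
print(np.allclose(Q @ R, A))  # True
```

In floating-point arithmetic, classical Gram-Schmidt can lose orthogonality; modified Gram-Schmidt or Householder reflections are preferred in practice.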

Example

  • Consider the matrix A = \begin{bmatrix} 1 & 1 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 1 \end{bmatrix}
  • Applying the Gram-Schmidt process:
    1. The first column of Q is \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}
    2. The second column of Q is \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}
    3. The third column of Q is \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}
  • The resulting orthogonal matrix Q is \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}
  • The upper triangular matrix R is \begin{bmatrix} 1 & 1 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 1 \end{bmatrix}
  • A = QR holds true for this decomposition

Properties of QR Decomposition


Uniqueness and Existence

  • If A has full column rank, the (reduced) QR decomposition of A exists, and it is unique once the diagonal entries of R are required to be positive
  • Without that sign convention, the factors are determined only up to signs: replacing Q with QD and R with DR, where D is any diagonal matrix with entries ±1, gives another valid decomposition with the same product QR
  • A QR decomposition exists for every matrix A (for example, via Householder reflections), even when A does not have full column rank, but uniqueness can then fail

Orthonormality and Rank

  • The columns of Q form an orthonormal basis for the column space of A (assuming A has full column rank)
  • Because the columns of Q are orthonormal, Q^T Q = I; when Q is square, this means Q^{-1} = Q^T
  • The rank of A is equal to the number of non-zero diagonal entries in R
    • By the rank-nullity theorem, this also determines the dimensions of the column space and the null space of A
  • The orthonormality of Q and the upper triangular structure of R provide useful properties for various applications
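The rank property can be illustrated numerically. A sketch with a deliberately rank-deficient matrix (in floating point, "non-zero" means "above a small tolerance", and a column-pivoted QR is the robust tool for rank detection):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])   # second column = 2 * first column, so rank 1

Q, R = np.linalg.qr(A)
rank = int(np.sum(np.abs(np.diag(R)) > 1e-10))  # count non-negligible diagonal entries
print(rank)                                     # 1
print(np.linalg.matrix_rank(A))                 # 1, NumPy's own rank agrees
```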

Applications of QR Decomposition

Solving Linear Systems

  • QR decomposition can be used to solve linear systems of the form Ax = b:
    1. Compute the QR decomposition of A
    2. Solve the equivalent system Rx = Q^T b using back-substitution
  • This approach is particularly useful for overdetermined systems (more equations than unknowns)
  • QR decomposition provides a numerically stable method for solving linear systems
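The two steps above can be sketched for a square system; the matrix and right-hand side here are made-up illustrations:

```python
import numpy as np

def solve_via_qr(A, b):
    """Solve the square system Ax = b via A = QR and back-substitution."""
    Q, R = np.linalg.qr(A)
    y = Q.T @ b                 # step 2: the equivalent system is Rx = Q^T b
    n = R.shape[0]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):                      # back-substitution
        x[i] = (y[i] - R[i, i + 1:] @ x[i + 1:]) / R[i, i]
    return x

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])
x = solve_via_qr(A, b)
print(np.allclose(A @ x, b))   # True
```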

Least-Squares Problems

  • For overdetermined systems, QR decomposition can be used to find the least-squares solution
    • The least-squares solution minimizes the Euclidean norm of the residual vector ||Ax - b||
  • The least-squares solution satisfies the normal equations (A^T A)x = A^T b
    • Substituting A = QR reduces the normal equations to the triangular system Rx = Q^T b, so A^T A never has to be formed explicitly
  • This is why QR decomposition avoids the potential ill-conditioning of the normal equations (the condition number of A^T A is the square of that of A), making it a preferred method for solving least-squares problems
  • When A has full column rank, the least-squares solution obtained using QR decomposition is unique
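A minimal least-squares sketch via reduced QR; the data below (a small line-fitting problem) is an arbitrary illustration:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])          # fit a line c0 + c1*t at t = 0, 1, 2, 3
b = np.array([1.0, 2.0, 2.0, 4.0])  # made-up observations

Q, R = np.linalg.qr(A)              # reduced QR: Q is 4x2, R is 2x2
x = np.linalg.solve(R, Q.T @ b)     # solve the triangular system Rx = Q^T b

# Agrees with NumPy's own least-squares solver
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))  # True
```

Note that A^T A is never formed; the triangular solve plays the role of back-substitution.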

Example

  • Consider the overdetermined system Ax = b, where A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{bmatrix} and b = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}
  • Computing the reduced QR decomposition of A (entries rounded to four decimals):
    • Q = \begin{bmatrix} 0.1690 & 0.8971 \\ 0.5071 & 0.2760 \\ 0.8452 & -0.3450 \end{bmatrix}
    • R = \begin{bmatrix} 5.9161 & 7.4374 \\ 0 & 0.8281 \end{bmatrix}
  • Solving Rx = Q^T b:
    • Q^T b = \begin{bmatrix} 3.7187 \\ 0.4140 \end{bmatrix}
    • Back-substitution yields x = \begin{bmatrix} 0 \\ 0.5 \end{bmatrix}
  • The least-squares solution is x = \begin{bmatrix} 0 \\ 0.5 \end{bmatrix}; in this particular example the residual is exactly zero, because b = A \begin{bmatrix} 0 \\ 0.5 \end{bmatrix} lies in the column space of A
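This system can be checked with NumPy; note that b = A(0, 0.5)^T exactly, so the least-squares solution is x = (0, 0.5) with zero residual:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])

Q, R = np.linalg.qr(A)             # reduced QR
x = np.linalg.solve(R, Q.T @ b)    # solve Rx = Q^T b

print(np.allclose(x, [0.0, 0.5]))  # True
print(np.allclose(A @ x, b))       # True: the residual is zero here
```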