
🧚🏽‍♀️ Abstract Linear Algebra I Unit 9 – Gram-Schmidt Orthogonalization

Gram-Schmidt Orthogonalization is a powerful method for creating orthonormal bases from linearly independent vectors. It's used in many fields, from quantum mechanics to computer graphics, to simplify calculations and improve numerical stability. The process involves projecting vectors onto previously orthogonalized vectors and subtracting these projections. This creates a set of mutually perpendicular unit vectors that span the same space as the original set, making complex problems more manageable.


Key Concepts and Definitions

  • Orthogonality: Two vectors $\mathbf{u}$ and $\mathbf{v}$ are orthogonal if their inner product $\langle \mathbf{u}, \mathbf{v} \rangle = 0$
  • Orthonormal set: A set of vectors $\{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n\}$ is orthonormal if each vector has unit length and any two distinct vectors are orthogonal
    • Mathematically, $\langle \mathbf{v}_i, \mathbf{v}_j \rangle = \delta_{ij}$, where $\delta_{ij}$ is the Kronecker delta
  • Orthonormal basis: An orthonormal set that spans the entire vector space
  • Inner product: A generalization of the dot product that measures the similarity between two vectors
    • Denoted $\langle \mathbf{u}, \mathbf{v} \rangle$ for vectors $\mathbf{u}$ and $\mathbf{v}$
  • Projection: The process of decomposing a vector into components parallel and perpendicular to a given direction
    • The projection of $\mathbf{u}$ onto $\mathbf{v}$ is given by $\text{proj}_{\mathbf{v}}\mathbf{u} = \frac{\langle \mathbf{u}, \mathbf{v} \rangle}{\langle \mathbf{v}, \mathbf{v} \rangle}\mathbf{v}$ (see the sketch after this list)
  • Linear independence: A set of vectors is linearly independent if no vector can be expressed as a linear combination of the others
  • Span: The set of all linear combinations of a given set of vectors
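
To make the inner product and projection formulas concrete, here is a minimal NumPy sketch; the specific vectors $\mathbf{u}$, $\mathbf{v}$, and $\mathbf{w}$ are made up for illustration:

```python
import numpy as np

# Vectors chosen so that u and v come out orthogonal.
u = np.array([1.0, 2.0, 0.0])
v = np.array([2.0, -1.0, 3.0])

# Inner product: here, the standard dot product on R^n.
print(np.dot(u, v))  # 0.0, so u and v are orthogonal

# Projection of u onto a nonzero vector w: (<u, w> / <w, w>) * w
w = np.array([1.0, 1.0, 0.0])
proj = (np.dot(u, w) / np.dot(w, w)) * w
print(proj)  # [1.5 1.5 0. ]

# The residual u - proj is perpendicular to w.
print(np.dot(u - proj, w))  # 0.0 (up to rounding)
```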

Historical Context and Applications

  • Gram-Schmidt process named after Jørgen Pedersen Gram and Erhard Schmidt; Gram used a related construction in the late 19th century, and Schmidt formalized the process in the early 20th century
  • Widely used in various fields of mathematics, physics, and engineering
  • Signal processing: Orthogonal signals minimize interference and simplify analysis
    • Orthogonal frequency-division multiplexing (OFDM) in wireless communications
  • Quantum mechanics: Orthonormal bases represent quantum states and simplify calculations
    • Eigenstates of observables form orthonormal bases in Hilbert spaces
  • Numerical linear algebra: Orthogonalization improves stability and accuracy of algorithms
    • Least squares problems and matrix decompositions (QR decomposition)
  • Computer graphics: Orthonormal bases simplify transformations and lighting calculations
  • Cryptography: Gram-Schmidt orthogonalization is a core step in lattice basis reduction algorithms (such as LLL) used to analyze and design lattice-based cryptosystems

The Gram-Schmidt Process: Step-by-Step

  1. Start with a linearly independent set of vectors $\{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n\}$
  2. Set $\mathbf{u}_1 = \mathbf{v}_1$ and normalize it: $\mathbf{e}_1 = \frac{\mathbf{u}_1}{\|\mathbf{u}_1\|}$
  3. For $i = 2, 3, \ldots, n$:
    • Compute the projection of $\mathbf{v}_i$ onto each previous orthonormal vector: $\text{proj}_{\mathbf{e}_j}\mathbf{v}_i = \frac{\langle \mathbf{v}_i, \mathbf{e}_j \rangle}{\langle \mathbf{e}_j, \mathbf{e}_j \rangle}\mathbf{e}_j$ for $j = 1, 2, \ldots, i-1$; since each $\mathbf{e}_j$ is a unit vector, this simplifies to $\langle \mathbf{v}_i, \mathbf{e}_j \rangle \mathbf{e}_j$
    • Subtract the projections from $\mathbf{v}_i$ to obtain $\mathbf{u}_i$: $\mathbf{u}_i = \mathbf{v}_i - \sum_{j=1}^{i-1}\text{proj}_{\mathbf{e}_j}\mathbf{v}_i$
    • Normalize $\mathbf{u}_i$ to get the next orthonormal vector: $\mathbf{e}_i = \frac{\mathbf{u}_i}{\|\mathbf{u}_i\|}$
  4. The resulting set $\{\mathbf{e}_1, \mathbf{e}_2, \ldots, \mathbf{e}_n\}$ is an orthonormal basis for the span of the original set (the code sketch below implements these steps)
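
The steps above translate almost directly into code. Here is a minimal sketch of the classical process in Python with NumPy; the function name gram_schmidt and the tolerance parameter tol are our own choices for illustration, not a standard library API:

```python
import numpy as np

def gram_schmidt(vectors, tol=1e-12):
    """Orthonormalize a linearly independent list of vectors (classical G-S)."""
    basis = []
    for v in vectors:
        u = np.array(v, dtype=float)
        # Subtract the projection onto each previously computed e_j;
        # since e_j is a unit vector, proj_{e_j}(v) = <v, e_j> e_j.
        for e in basis:
            u = u - np.dot(u, e) * e
        norm = np.linalg.norm(u)
        if norm < tol:
            raise ValueError("input vectors are linearly dependent")
        basis.append(u / norm)  # normalize to unit length
    return basis

# Tiny check in R^2: (1, 1) and (1, 0).
print(gram_schmidt([(1, 1), (1, 0)]))
# [array([0.707..., 0.707...]), array([0.707..., -0.707...])]
```

The early exit on a near-zero norm reflects the linear-independence assumption in step 1: a dependent input vector leaves nothing behind once the projections are subtracted.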

Mathematical Properties and Theorems

  • Gram-Schmidt process preserves the span of the original set of vectors
    • The orthonormal basis generated spans the same subspace as the input vectors
  • For a fixed ordering of the input vectors, the Gram-Schmidt process is deterministic and produces a unique orthonormal basis
    • Reordering the input vectors may change the resulting basis, but the subspace spanned remains the same
  • Orthogonal projection theorem: The projection of a vector onto a subspace is the closest point in the subspace to the vector
    • Minimizes the distance between the vector and its projection
  • Pythagorean theorem in inner product spaces: For orthogonal vectors $\mathbf{u}$ and $\mathbf{v}$, $\|\mathbf{u} + \mathbf{v}\|^2 = \|\mathbf{u}\|^2 + \|\mathbf{v}\|^2$
  • Parseval's identity: The sum of the squares of the inner products of a vector with an orthonormal basis equals the square of the vector's norm
    • $\sum_{i=1}^{n}|\langle \mathbf{v}, \mathbf{e}_i \rangle|^2 = \|\mathbf{v}\|^2$ for an orthonormal basis $\{\mathbf{e}_1, \mathbf{e}_2, \ldots, \mathbf{e}_n\}$ (checked numerically in the sketch below)
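
As a quick numerical check of Parseval's identity, the sketch below builds an orthonormal basis of $\mathbb{R}^3$ (via a QR factorization of a random matrix, which is just a convenient way to get one) and compares the sum of squared coefficients with $\|\mathbf{v}\|^2$:

```python
import numpy as np

# Any orthonormal basis would do; QR of a (seeded) random matrix gives one.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
basis = [Q[:, i] for i in range(3)]

v = np.array([2.0, 3.0, 4.0])

# Parseval: the squared coefficients sum to ||v||^2 = 4 + 9 + 16 = 29.
coeffs = [np.dot(v, e) for e in basis]
print(sum(c * c for c in coeffs))  # 29.0 (up to rounding)
print(np.dot(v, v))                # 29.0
```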

Practical Examples and Problem-Solving

  • Example 1: Orthonormalizing a set of vectors in $\mathbb{R}^3$
    • Given vectors $\mathbf{v}_1 = (1, 0, 1)$, $\mathbf{v}_2 = (1, 1, 0)$, and $\mathbf{v}_3 = (1, 1, 1)$, find an orthonormal basis using the Gram-Schmidt process
  • Example 2: Orthogonal decomposition of a vector
    • Given a vector $\mathbf{v} = (2, 3, 4)$ and an orthonormal basis $\{\mathbf{e}_1, \mathbf{e}_2, \mathbf{e}_3\}$, find the orthogonal decomposition of $\mathbf{v}$ with respect to the basis
  • Example 3: Least squares approximation using orthogonal projections
    • Given a set of data points $(x_i, y_i)$ and a set of basis functions $\{f_1(x), f_2(x), \ldots, f_n(x)\}$, find the best linear combination of the basis functions that approximates the data using orthogonal projections
  • Problem-solving strategy: Identify the input vectors, apply the Gram-Schmidt process step-by-step, and interpret the results in the context of the problem (Example 1 is worked in code below)
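
Example 1 can be worked numerically with the gram_schmidt sketch from the step-by-step section above (assumed to be in scope here); the exact values in the comments match the hand computation:

```python
import numpy as np

e1, e2, e3 = gram_schmidt([(1, 0, 1), (1, 1, 0), (1, 1, 1)])

print(e1)  # [ 0.7071  0.      0.7071]  = (1/sqrt(2), 0, 1/sqrt(2))
print(e2)  # [ 0.4082  0.8165 -0.4082]  = (1/sqrt(6), 2/sqrt(6), -1/sqrt(6))
print(e3)  # [-0.5774  0.5774  0.5774]  = (-1/sqrt(3), 1/sqrt(3), 1/sqrt(3))

# Sanity check: stacking the basis as columns gives E with E^T E = I.
E = np.column_stack([e1, e2, e3])
print(np.allclose(E.T @ E, np.eye(3)))  # True
```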

Common Mistakes and Pitfalls

  • Forgetting to normalize the orthogonal vectors at each step
    • Normalization ensures that the resulting vectors have unit length and form an orthonormal basis
  • Incorrectly calculating the projections or subtractions
    • Double-check the formulas for projections and ensure that the subtractions are performed correctly
  • Attempting to orthogonalize a linearly dependent set of vectors
    • The Gram-Schmidt process assumes that the input vectors are linearly independent
    • If the input vectors are linearly dependent, some $\mathbf{u}_i$ becomes the zero vector (or numerically close to it), so normalization divides by zero or amplifies rounding error into non-orthogonal output (see the sketch after this list)
  • Confusing orthogonality with linear independence
    • Orthogonality is a stronger condition than linear independence
    • An orthogonal set of nonzero vectors is always linearly independent, but a linearly independent set need not be orthogonal
  • Misinterpreting the order of the resulting orthonormal basis
    • The order of the input vectors affects the order of the output basis, but not the subspace spanned
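
The linear-dependence pitfall is easy to see numerically. In the sketch below, $\mathbf{v}_3 = \mathbf{v}_1 + \mathbf{v}_2$, so subtracting the projections leaves nothing to normalize:

```python
import numpy as np

# v3 = v1 + v2, so {v1, v2, v3} is linearly dependent.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = v1 + v2

e1, e2 = v1, v2  # already orthonormal, so they are their own G-S output

# Subtracting the projections of v3 onto e1 and e2 leaves the zero vector:
u3 = v3 - np.dot(v3, e1) * e1 - np.dot(v3, e2) * e2
print(np.linalg.norm(u3))  # 0.0 -- normalizing u3 would divide by zero
```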

Advanced Topics and Extensions

  • Gram-Schmidt process in infinite-dimensional Hilbert spaces
    • Extends the concept of orthonormalization to function spaces and sequences
  • Modified Gram-Schmidt process: A numerically stable variant that subtracts each projection immediately, updating the remaining vectors as each new basis vector is produced (see the sketch after this list)
    • Improves the accuracy and stability of the orthogonalization in the presence of rounding errors
  • Householder reflections: An alternative method for orthogonalization based on reflections across hyperplanes
    • Used in the QR decomposition and other numerical linear algebra algorithms
  • Orthogonal polynomials: Polynomial sequences that are orthogonal with respect to a given inner product
    • Examples include Legendre polynomials, Chebyshev polynomials, and Hermite polynomials
  • Orthogonal matrix factorizations: Decompositions of matrices into products of orthogonal matrices
    • Singular Value Decomposition (SVD), QR decomposition, and Schur decomposition
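
For comparison with the classical sketch earlier, here is a minimal version of the modified Gram-Schmidt process, written as a QR factorization; modified_gram_schmidt is our own name for this illustration, not a library function:

```python
import numpy as np

def modified_gram_schmidt(A):
    """Modified Gram-Schmidt QR: factor A (full column rank) as Q @ R."""
    A = np.array(A, dtype=float)
    n = A.shape[1]
    Q = A.copy()
    R = np.zeros((n, n))
    for j in range(n):
        R[j, j] = np.linalg.norm(Q[:, j])
        Q[:, j] /= R[j, j]
        for k in range(j + 1, n):
            # Project out e_j from the *already partially reduced* column k.
            R[j, k] = np.dot(Q[:, j], Q[:, k])
            Q[:, k] -= R[j, k] * Q[:, j]
    return Q, R

# Columns of A are the vectors from Example 1.
A = np.column_stack([(1.0, 0.0, 1.0), (1.0, 1.0, 0.0), (1.0, 1.0, 1.0)])
Q, R = modified_gram_schmidt(A)
print(np.allclose(Q @ R, A))            # True
print(np.allclose(Q.T @ Q, np.eye(3)))  # True
```

The only structural change from the classical variant is that each remaining column is reduced against $\mathbf{e}_j$ as soon as $\mathbf{e}_j$ is available, which is what keeps the computed $Q$ close to orthogonal in floating point.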

Review and Practice Problems

  1. Prove that the Gram-Schmidt process produces an orthonormal basis for a given linearly independent input set.
  2. Apply the Gram-Schmidt process to the vectors $\mathbf{v}_1 = (1, 1, 1)$, $\mathbf{v}_2 = (0, 1, 1)$, and $\mathbf{v}_3 = (0, 0, 1)$ to find an orthonormal basis for $\mathbb{R}^3$.
  3. Given an orthonormal basis $\{\mathbf{e}_1, \mathbf{e}_2, \mathbf{e}_3\}$ and a vector $\mathbf{v} = (2, -1, 3)$, find the orthogonal projection of $\mathbf{v}$ onto the subspace spanned by $\mathbf{e}_1$ and $\mathbf{e}_2$.
  4. Prove the Pythagorean theorem in inner product spaces for orthogonal vectors.
  5. Explain why the Gram-Schmidt process may fail when applied to a linearly dependent set of vectors.
  6. Compare and contrast the Gram-Schmidt process with the Householder reflections method for orthogonalization.
  7. Discuss the importance of orthogonality in quantum mechanics and provide an example of an orthonormal basis in a Hilbert space.
  8. Prove Parseval's identity for an orthonormal basis in an inner product space.

