๐ŸงFunctional Analysis Unit 5 Review

5.3 Orthogonality, projections, and Gram-Schmidt process

Written by the Fiveable Content Team • Last updated August 2025

Orthogonality in vector spaces is all about perpendicular relationships between vectors. It's a key concept in inner product spaces and Hilbert spaces, where we can measure angles and distances between vectors using inner products.

Orthogonal projections are super useful for finding the closest point in a subspace to a given vector. We can compute these projections using inner products and orthonormal bases, which helps solve problems in many fields like signal processing and data analysis.

Orthogonality and Inner Product Spaces

Orthogonality in vector spaces

  • Orthogonality is a geometric concept describing the perpendicular relationship between vectors in inner product spaces and Hilbert spaces
  • Two vectors $x$ and $y$ are orthogonal if their inner product equals zero: $\langle x, y \rangle = 0$
  • Inner product spaces are vector spaces with an inner product function assigning a scalar value to each pair of vectors
    • The inner product satisfies the following properties for all vectors $x, y, z$ and scalars $a$:
      • Conjugate symmetry: $\langle x, y \rangle = \overline{\langle y, x \rangle}$
      • Linearity in the second argument: $\langle x, ay + z \rangle = a\langle x, y \rangle + \langle x, z \rangle$
      • Positive definiteness: $\langle x, x \rangle \geq 0$, and $\langle x, x \rangle = 0$ if and only if $x = 0$
  • Hilbert spaces are complete inner product spaces where every Cauchy sequence converges to an element within the space
    • Completeness allows application of powerful tools and theorems in functional analysis ($L^2$ spaces, Fourier series)
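The orthogonality condition above is easy to check numerically. A minimal sketch, assuming the standard dot product on $\mathbb{R}^3$ and two hypothetical vectors chosen for illustration:

```python
import numpy as np

# Hypothetical vectors in R^3; the inner product here is the standard dot product.
x = np.array([1.0, 2.0, 0.0])
y = np.array([-2.0, 1.0, 5.0])

# x and y are orthogonal exactly when <x, y> = 0.
inner = np.dot(x, y)  # (1)(-2) + (2)(1) + (0)(5) = 0
print(inner)  # → 0.0
```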

Computation of orthogonal projections

  • Orthogonal projections find the closest point in a subspace to a given vector
  • The orthogonal projection of a vector $x$ onto a closed subspace $M$ is the unique vector $P_M(x)$ in $M$ minimizing the distance between $x$ and any vector in $M$
    • Mathematically: $\|x - P_M(x)\| = \min_{y \in M} \|x - y\|$
  • To compute the orthogonal projection of a vector $x$ onto a subspace $M$ spanned by an orthonormal basis $\{e_1, e_2, \ldots, e_n\}$:
    • $P_M(x) = \sum_{i=1}^n \langle x, e_i \rangle e_i$
  • The vector $x - P_M(x)$ is orthogonal to every vector in $M$, representing the component of $x$ perpendicular to the subspace (residual, error term)
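The projection formula above can be sketched directly. This assumes a hypothetical example where $M$ is the $xy$-plane in $\mathbb{R}^3$, spanned by two standard basis vectors (which are already orthonormal):

```python
import numpy as np

# Orthonormal basis for M = span{e1, e2} (the xy-plane in R^3).
e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])
x = np.array([3.0, 4.0, 5.0])

# P_M(x) = sum_i <x, e_i> e_i
p = np.dot(x, e1) * e1 + np.dot(x, e2) * e2  # → [3, 4, 0]

# The residual x - P_M(x) is orthogonal to every vector in M.
r = x - p
print(np.dot(r, e1), np.dot(r, e2))  # → 0.0 0.0
```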
Gram-Schmidt Process and Orthonormal Bases

Gram-Schmidt process for orthonormal bases

  • The Gram-Schmidt process constructs an orthonormal basis from a linearly independent set of vectors
  • Given a linearly independent set $\{v_1, v_2, \ldots, v_n\}$:
    1. Define $e_1 = \frac{v_1}{\|v_1\|}$
    2. For $i = 2, 3, \ldots, n$:
      • Compute $u_i = v_i - \sum_{j=1}^{i-1} \langle v_i, e_j \rangle e_j$
      • Define $e_i = \frac{u_i}{\|u_i\|}$
  • The resulting set $\{e_1, e_2, \ldots, e_n\}$ is an orthonormal basis for the space spanned by the original linearly independent set
  • Orthonormal bases simplify calculations and provide a convenient coordinate system (QR factorization, least squares)
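The two-step loop above translates almost line for line into code. A minimal sketch of classical Gram-Schmidt (the function name and the example vectors are illustrative, not from the source):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a linearly independent list of vectors (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        # u_i = v_i minus its projections onto the already-built orthonormal vectors
        u = v - sum(np.dot(v, e) * e for e in basis)
        # e_i = u_i / ||u_i||
        basis.append(u / np.linalg.norm(u))
    return basis

# Example: a linearly independent set in R^2.
e = gram_schmidt([np.array([3.0, 1.0]), np.array([2.0, 2.0])])
```

After the call, `e[0]` and `e[1]` are unit vectors with `np.dot(e[0], e[1])` equal to zero up to floating-point error. (In floating-point practice, the modified Gram-Schmidt variant is preferred for numerical stability.)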

Properties of orthogonal projections

  • In Hilbert spaces, orthogonal projections onto closed subspaces always exist and are unique
  • Existence proof:
    • Let $M$ be a closed subspace of a Hilbert space $H$ and let $x \in H$
    • Show that the set $\{x - y : y \in M\}$ contains a unique element of minimal norm, using the parallelogram law and the completeness of $M$
    • This element is $x - P_M(x)$, where $P_M(x)$ is the orthogonal projection of $x$ onto $M$
  • Uniqueness proof:
    • Suppose two orthogonal projections $P_M(x)$ and $Q_M(x)$ of $x$ onto $M$ exist
    • Show $\|P_M(x) - Q_M(x)\|^2 = 0$ using properties of orthogonal projections and the Pythagorean theorem
    • Conclude $P_M(x) = Q_M(x)$, proving uniqueness
  • Existence and uniqueness of orthogonal projections underlie many applications (least squares approximation, signal processing)
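The least squares application mentioned above is exactly this projection: solving $\min_c \|Ac - b\|$ projects $b$ onto the column space of $A$, and the residual is orthogonal to that subspace. A small sketch using a hypothetical $3 \times 2$ system:

```python
import numpy as np

# Hypothetical overdetermined system: fit b by a vector in the column space of A.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 2.0])

# Least squares solution; A @ coeffs is the orthogonal projection of b
# onto the column space of A.
coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
residual = b - A @ coeffs

# The residual is orthogonal to every column of A (up to floating-point error).
print(A.T @ residual)  # → approximately [0, 0]
```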