Inner product spaces let us measure angles and distances between vectors. This section dives into orthogonal complements - the subspace of all vectors perpendicular to a given subspace. We'll see how these complements relate to the structure of the original space.
We'll also explore orthogonal projections, which find the closest vector in a subspace to a given vector. This concept is crucial for solving least-squares problems and decomposing vectors into orthogonal components.
Orthogonal Complements and Properties
Definition and Basic Properties
Orthogonal complement of subspace W in inner product space V, denoted W⊥, consists of all vectors in V orthogonal to every vector in W
For finite-dimensional inner product space V and subspace W, (W⊥)⊥ = W
Dimension relationship dim(W) + dim(W⊥) = dim(V) holds for finite-dimensional inner product spaces
Orthogonal complement of a subspace forms a subspace of the inner product space
Zero subspace {0} has orthogonal complement V, and V has orthogonal complement {0}
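As a quick numerical illustration of the facts above, the sketch below (assuming NumPy; the basis matrix W_basis is made up for the example) computes an orthonormal basis of W⊥ in R⁴ as the null space of the transpose of a basis matrix for W, then checks the dimension formula and pairwise orthogonality.

```python
import numpy as np

# Hypothetical example: W = span of the columns of W_basis inside V = R^4.
W_basis = np.array([[1.0, 0.0],
                    [2.0, 1.0],
                    [0.0, 1.0],
                    [1.0, 1.0]])

# W_perp is the null space of W_basis^T: exactly the vectors orthogonal to
# every column of W_basis.  The rows of Vt beyond the rank give an
# orthonormal basis of that null space.
U, s, Vt = np.linalg.svd(W_basis.T)
rank = int(np.sum(s > 1e-10))
W_perp_basis = Vt[rank:].T          # columns form an orthonormal basis of W_perp

dim_V = W_basis.shape[0]
dim_W = rank
dim_W_perp = W_perp_basis.shape[1]
print(dim_W + dim_W_perp == dim_V)               # dim(W) + dim(W_perp) = dim(V)
print(np.allclose(W_basis.T @ W_perp_basis, 0))  # every pair of basis vectors is orthogonal
```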
Properties Involving Multiple Subspaces
For subspaces U and W in V, (U + W)⊥ = U⊥ ∩ W⊥
Intersection and sum relationship (U ∩ W)⊥ = U⊥ + W⊥ holds for subspaces U and W of a finite-dimensional V
Linear transformation T : V → W between inner product spaces yields ker(T*) = (range(T))⊥
Adjoint T* of linear transformation T satisfies range(T*) = (ker(T))⊥
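These adjoint identities can be checked numerically for a matrix acting as T, since the adjoint of a complex matrix is its conjugate transpose. A minimal sketch, assuming NumPy and a random matrix chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3)) + 1j * rng.standard_normal((5, 3))   # T : C^3 -> C^5

# range(T) = column space of A; ker(T*) = null space of the conjugate transpose A^H.
# In the full SVD A = U S V^H, the first r columns of U span range(T) and the
# remaining columns span null(A^H), where r = rank(A).
U, s, Vh = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))
range_T = U[:, :r]
ker_T_star = U[:, r:]

print(np.allclose(range_T.conj().T @ ker_T_star, 0))         # ker(T*) is orthogonal to range(T)
print(range_T.shape[1] + ker_T_star.shape[1] == A.shape[0])  # dimensions add up to dim(C^5)
```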
Orthogonal Projections onto Subspaces
Computation and Properties
Orthogonal projection of vector v onto subspace W represents closest vector in W to v
Formula for orthogonal projection onto W with orthonormal basis {u₁, ..., uₖ}: proj_W(v) = ∑ᵢ₌₁ᵏ ⟨v, uᵢ⟩uᵢ
Projection matrix P for orthogonal projection onto W: P = U(U*U)⁻¹U*, where the columns of U form a basis for W
Orthogonal projection P onto W satisfies P² = P (idempotent) and P* = P (self-adjoint)
Orthogonal projection onto W⊥ calculated as v - proj_W(v)
Error vector e = v - proj_W(v) orthogonal to all vectors in W
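A minimal NumPy sketch of these properties (the basis matrix A and the vector v are made-up examples): it builds an orthonormal basis of W with a QR factorization, forms the projection matrix, and checks idempotence, self-adjointness, and orthogonality of the error vector.

```python
import numpy as np

# Hypothetical subspace W of R^4 spanned by the two columns of A.
A = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [1.0, 0.0],
              [1.0, 1.0]])
Q, _ = np.linalg.qr(A)            # columns of Q: orthonormal basis {u_1, ..., u_k} of W

v = np.array([1.0, 2.0, 3.0, 4.0])

# proj_W(v) = sum_i <v, u_i> u_i, i.e. Q (Q^T v).
proj_v = Q @ (Q.T @ v)

# Projection matrix P = A (A^T A)^{-1} A^T; for the orthonormal Q it equals Q Q^T.
P = A @ np.linalg.solve(A.T @ A, A.T)
print(np.allclose(P, Q @ Q.T))    # same projection
print(np.allclose(P @ P, P))      # idempotent: P^2 = P
print(np.allclose(P.T, P))        # self-adjoint: P* = P (symmetric in the real case)

# The error vector is orthogonal to every vector in W (to each column of A).
e = v - proj_v
print(np.allclose(A.T @ e, 0))
```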
Orthogonal Decomposition
Vector v in V decomposes as v = proj_W(v) + proj_W⊥(v)
Decomposition represents v as sum of components in W and W⊥
Orthogonal decomposition unique for each vector v in V
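To see the decomposition concretely, the short sketch below (same illustrative W and v as above, assuming NumPy) splits v into its projection onto W plus its projection onto W⊥, computed with the complementary projection I - P.

```python
import numpy as np

# Same illustrative subspace W of R^4; P projects onto W, I - P projects onto W_perp.
A = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [1.0, 0.0],
              [1.0, 1.0]])
P = A @ np.linalg.solve(A.T @ A, A.T)
P_perp = np.eye(4) - P

v = np.array([1.0, 2.0, 3.0, 4.0])
w, w_perp = P @ v, P_perp @ v           # components in W and W_perp

print(np.allclose(v, w + w_perp))       # v = proj_W(v) + proj_{W_perp}(v)
print(np.allclose(A.T @ w_perp, 0))     # the W_perp component is orthogonal to W
```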
Orthogonal Decomposition Theorem
Statement and Proof
Theorem states every vector v in V uniquely expressed as v = w + w⊥, w in W and w⊥ in W⊥
Existence proof constructs w = proj_W(v) and w⊥ = v - proj_W(v)
Uniqueness proof assumes two decompositions v = w₁ + w₁⊥ = w₂ + w₂⊥ and shows w₁ = w₂ and w₁⊥ = w₂⊥
Proof utilizes inner product properties and orthogonal complement definition
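The heart of the uniqueness argument is that W ∩ W⊥ = {0}; a one-line sketch of that step in LaTeX:

```latex
% Sketch: suppose v = w_1 + w_1^\perp = w_2 + w_2^\perp with w_1, w_2 \in W and
% w_1^\perp, w_2^\perp \in W^\perp.  Any x \in W \cap W^\perp satisfies
% \langle x, x \rangle = 0, hence x = 0, which forces the two decompositions to agree.
\[
  w_1 - w_2 \;=\; w_2^{\perp} - w_1^{\perp} \;\in\; W \cap W^{\perp} = \{0\}
  \quad\Longrightarrow\quad w_1 = w_2 \ \text{ and } \ w_1^{\perp} = w_2^{\perp}.
\]
```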
Implications and Applications
Theorem implies V = W ⊕ W⊥ (direct sum) for any subspace W of V
Unique decomposition of vectors into projections onto W and W⊥
Fundamental theorem in understanding inner product space structure
Applications in linear algebra, functional analysis, and quantum mechanics (state vector decomposition)
Least-Squares Problems with Projections
Least-squares problem minimizes ||Ax - b||² for matrix A and vector b
Normal equations A*Ax̂ = A*b characterize the least-squares solution
Solution x̂ given by x̂ = (A*A)⁻¹A*b when A*A is invertible
Geometrically, solution finds closest vector Ax̂ in column space of A to b
Orthogonal projection of b onto column space of A: proj_col(A)(b) = A(A*A)⁻¹A*b
Residual vector r = b - Ax̂ orthogonal to column space of A
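The following sketch (assuming NumPy; A and b are fabricated data) solves the normal equations directly, compares the result with np.linalg.lstsq, and verifies that the residual is orthogonal to the column space of A.

```python
import numpy as np

# Made-up overdetermined system: more equations than unknowns.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([1.1, 1.9, 3.2, 3.9])

# Normal equations A^T A x_hat = A^T b (for real A, the adjoint A* is A^T).
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_hat, x_lstsq))     # both give the least-squares solution

# A x_hat is the orthogonal projection of b onto col(A), so the residual
# r = b - A x_hat is orthogonal to every column of A.
r = b - A @ x_hat
print(np.allclose(A.T @ r, 0))
```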
Applications and Significance
Data fitting applications (linear regression, polynomial fitting)
Model matrix A represents predictor variables, b represents observed data
Signal processing uses (noise reduction, signal approximation)
Parameter estimation in scientific and engineering fields (system identification)
Statistical analysis applications (ANOVA, multiple regression)
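As an example of the data-fitting use mentioned above, a short sketch (assuming NumPy; the data points are fabricated) fits a straight line y ≈ c₀ + c₁x by assembling the model matrix from the predictor and solving the least-squares problem:

```python
import numpy as np

# Fabricated observations: predictor x and measured response y.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.9, 3.1, 5.0, 7.2, 8.9])

# Model matrix for a line y = c0 + c1*x: a column of ones and a column of x.
A = np.column_stack([np.ones_like(x), x])

# The least-squares coefficients minimize ||A c - y||^2.
c, *_ = np.linalg.lstsq(A, y, rcond=None)
print(c)   # roughly [1.0, 2.0]: intercept and slope of the fitted line
```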