8.2 Orthogonality and Orthonormal Bases


Written by the Fiveable Content Team • Last updated August 2025

Orthogonality and orthonormal bases are key concepts in inner product spaces. They help us understand how vectors relate to each other and provide powerful tools for simplifying calculations and solving problems.

These ideas are crucial for many applications in math and science. Orthonormal bases make it easy to express vectors as linear combinations, while orthogonal projections help us find the closest point in a subspace to a given vector.

Orthogonality and Orthonormality

Defining Orthogonality and Orthonormality

  • Two vectors are orthogonal if their inner product (or dot product) is equal to zero
    • For example, the vectors $v = (1, 0)$ and $w = (0, 1)$ are orthogonal because $v \cdot w = 1 \times 0 + 0 \times 1 = 0$
  • A set of vectors is orthogonal if every pair of distinct vectors in the set is orthogonal
    • An example of an orthogonal set is $\{(1, 0, 0), (0, 1, 0), (0, 0, 1)\}$ in $\mathbb{R}^3$
  • A set of vectors is orthonormal if it is orthogonal and each vector has a norm (length) of 1
    • The set $\{(\frac{1}{\sqrt{2}}, \frac{1}{\sqrt{2}}), (\frac{1}{\sqrt{2}}, -\frac{1}{\sqrt{2}})\}$ is orthonormal because the vectors are orthogonal and have unit length
  • The standard basis vectors (e.g., $i, j, k$ in $\mathbb{R}^3$) form an orthonormal set; both conditions are easy to check numerically, as in the sketch below
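
A quick numerical check of both conditions, written as a minimal Python sketch (assuming NumPy is available; the helper names `is_orthogonal` and `is_orthonormal_set` are illustrative, not standard API):

```python
import numpy as np

def is_orthogonal(v, w, tol=1e-10):
    """Two vectors are orthogonal when their inner product is (numerically) zero."""
    return abs(np.dot(v, w)) < tol

def is_orthonormal_set(vectors, tol=1e-10):
    """Orthonormal: every vector has norm 1 and every distinct pair is orthogonal."""
    for i, v in enumerate(vectors):
        if abs(np.linalg.norm(v) - 1.0) > tol:
            return False
        for w in vectors[i + 1:]:
            if not is_orthogonal(v, w, tol):
                return False
    return True

print(is_orthogonal(np.array([1.0, 0.0]), np.array([0.0, 1.0])))      # True
s = 1 / np.sqrt(2)
print(is_orthonormal_set([np.array([s, s]), np.array([s, -s])]))      # True
```

The tolerance accounts for floating-point roundoff, since inner products that are zero in exact arithmetic rarely come out exactly zero numerically.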

Properties of Orthogonal and Orthonormal Sets

  • Orthogonal and orthonormal sets of vectors are linearly independent
    • This means that no vector in the set can be expressed as a linear combination of the other vectors
  • Orthonormal sets have the property that the inner product of any vector with itself is 1, and the inner product of any two distinct vectors is 0
    • For an orthonormal set $\{e_1, e_2, \ldots, e_n\}$, we have $e_i \cdot e_j = \begin{cases} 1, & i = j \\ 0, & i \neq j \end{cases}$
  • Orthonormal bases are particularly useful because they simplify many calculations and have desirable geometric properties
    • For example, the Euclidean distance between two vectors can be computed directly from their coefficients with respect to an orthonormal basis, as the sketch below illustrates
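
As an illustration of the distance property, the sketch below (same NumPy assumption) stacks an orthonormal basis as the rows of a matrix `Q`, so the coefficient vector of `x` is `Q @ x`, and confirms that distances computed from coefficients match distances in standard coordinates; the vector `y` is an arbitrary choice for the demonstration:

```python
import numpy as np

# Rows of Q form an orthonormal basis of R^3 (the one used later in this guide).
s2, s3, s6 = 1/np.sqrt(2), 1/np.sqrt(3), 1/np.sqrt(6)
Q = np.array([
    [s3,  s3,  s3],
    [s2, -s2,  0.0],
    [s6,  s6, -2*s6],
])

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, -1.0, 0.5])        # arbitrary second vector
print(np.linalg.norm(x - y))          # distance in standard coordinates
print(np.linalg.norm(Q @ x - Q @ y))  # the same distance, from coefficients
```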

Gram-Schmidt Process


Algorithm for Constructing Orthonormal Bases

  • The Gram-Schmidt process is an algorithm for constructing an orthonormal basis from a linearly independent set of vectors
  • The process involves iteratively projecting each vector onto the orthogonal complement of the subspace spanned by the previous orthonormal vectors and normalizing the result
    1. The first vector in the orthonormal basis is obtained by normalizing the first vector in the original set: $u_1 = \frac{v_1}{\|v_1\|}$
    2. Each subsequent vector in the orthonormal basis is obtained by subtracting its projection onto the subspace spanned by the previous orthonormal vectors and then normalizing the result: $u_k = \frac{v_k - \sum_{i=1}^{k-1} (v_k \cdot u_i) u_i}{\left\| v_k - \sum_{i=1}^{k-1} (v_k \cdot u_i) u_i \right\|}$
  • The resulting set of vectors $\{u_1, u_2, \ldots, u_n\}$ forms an orthonormal basis for the same subspace as the original linearly independent set $\{v_1, v_2, \ldots, v_n\}$; a sketch of the algorithm follows
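
A minimal sketch of the algorithm in Python, assuming NumPy; the function name and the dependence tolerance are illustrative choices, and the update implements the classical formula above (a "modified" variant that projects against the running remainder is preferred for floating-point stability):

```python
import numpy as np

def gram_schmidt(vectors, tol=1e-12):
    """Build an orthonormal basis from a linearly independent list of vectors."""
    basis = []
    for v in vectors:
        v = np.asarray(v, dtype=float)
        w = v.copy()
        for u in basis:
            w = w - np.dot(v, u) * u  # subtract the projection of v onto u
        norm = np.linalg.norm(w)
        if norm < tol:
            raise ValueError("input vectors are linearly dependent")
        basis.append(w / norm)        # normalize the orthogonal remainder
    return basis
```

Calling `gram_schmidt([(1, 1, 1), (1, 0, 2), (0, 0, 1)])` reproduces the worked example that follows.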

Example of Gram-Schmidt Process

  • Consider the linearly independent set $\{(1, 1, 1), (1, 0, 2), (0, 0, 1)\}$ in $\mathbb{R}^3$
  • Applying the Gram-Schmidt process:
    1. $u_1 = \frac{(1, 1, 1)}{\|(1, 1, 1)\|} = (\frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}})$
    2. $(1, 0, 2) \cdot u_1 = \sqrt{3}$, so $(1, 0, 2) - \sqrt{3}\,u_1 = (0, -1, 1)$, which normalizes to $u_2 = (0, -\frac{1}{\sqrt{2}}, \frac{1}{\sqrt{2}})$
    3. $(0, 0, 1) \cdot u_1 = \frac{1}{\sqrt{3}}$ and $(0, 0, 1) \cdot u_2 = \frac{1}{\sqrt{2}}$, so subtracting both projections leaves $(-\frac{1}{3}, \frac{1}{6}, \frac{1}{6})$, which normalizes to $u_3 = (-\frac{2}{\sqrt{6}}, \frac{1}{\sqrt{6}}, \frac{1}{\sqrt{6}})$
  • The resulting orthonormal basis is $\{(\frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}), (0, -\frac{1}{\sqrt{2}}, \frac{1}{\sqrt{2}}), (-\frac{2}{\sqrt{6}}, \frac{1}{\sqrt{6}}, \frac{1}{\sqrt{6}})\}$; the check below confirms the arithmetic
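
The arithmetic is easy to confirm numerically: stacking $u_1, u_2, u_3$ as the rows of a matrix $U$, orthonormality is equivalent to $U U^T = I$. A quick check, again assuming NumPy:

```python
import numpy as np

# Rows are u1, u2, u3 from the worked example above.
U = np.array([
    [ 1/np.sqrt(3), 1/np.sqrt(3), 1/np.sqrt(3)],
    [ 0.0,         -1/np.sqrt(2), 1/np.sqrt(2)],
    [-2/np.sqrt(6), 1/np.sqrt(6), 1/np.sqrt(6)],
])

# For an orthonormal set, U @ U.T is the identity: ones on the diagonal
# (unit norms) and zeros off the diagonal (pairwise orthogonality).
print(np.round(U @ U.T, 10))
```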

Uniqueness of Orthogonal Projections


Definition and Formula for Orthogonal Projections

  • The orthogonal projection of a vector onto a subspace is the closest point in the subspace to the vector, where closeness is measured by the Euclidean distance
  • The orthogonal projection of a vector $v$ onto a subspace $W$ can be expressed as the sum of its projections onto the vectors of an orthogonal basis of $W$
  • The projection of $v$ onto a single vector $w$ is given by the formula $\mathrm{proj}_w(v) = \frac{v \cdot w}{\|w\|^2} w$, where $\cdot$ denotes the inner product and $\|w\|$ is the norm of $w$
    • For example, if $v = (1, 2, 3)$ and $w = (1, 0, 1)$, then $\mathrm{proj}_w(v) = \frac{(1, 2, 3) \cdot (1, 0, 1)}{\|(1, 0, 1)\|^2} (1, 0, 1) = \frac{4}{2} (1, 0, 1) = (2, 0, 2)$
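
The formula translates directly into code; a minimal sketch assuming NumPy, which reproduces the worked example and verifies that the residual $v - \mathrm{proj}_w(v)$ is orthogonal to $w$:

```python
import numpy as np

def proj(v, w):
    """Orthogonal projection of v onto the line spanned by w."""
    return (np.dot(v, w) / np.dot(w, w)) * w

v = np.array([1.0, 2.0, 3.0])
w = np.array([1.0, 0.0, 1.0])
print(proj(v, w))                 # [2. 0. 2.], matching the example above
print(np.dot(v - proj(v, w), w))  # residual is orthogonal to w: 0.0
```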

Proof of Uniqueness

  • To prove uniqueness, suppose there are two orthogonal projections of $v$ onto $W$, denoted by $p_1$ and $p_2$
  • Since $p_1$ and $p_2$ are both in $W$, their difference $d = p_1 - p_2$ is also in $W$
  • Since $v - p_1$ and $v - p_2$ are both orthogonal to every vector in $W$, their difference $d = (v - p_2) - (v - p_1)$ is also orthogonal to every vector in $W$, including $d$ itself
  • Since $d$ is orthogonal to itself, we have $d \cdot d = 0$, which implies that $d = 0$ and $p_1 = p_2$
  • Therefore, the orthogonal projection of a vector onto a subspace is unique

Linear Combinations of Orthonormal Bases

Expressing Vectors as Linear Combinations

  • Any vector in a vector space can be expressed as a linear combination of the basis vectors for that space
  • When the basis vectors form an orthonormal set, the coefficients in the linear combination are given by the inner products of the vector with each basis vector
    • For a vector $v$ and an orthonormal basis $\{e_1, e_2, \ldots, e_n\}$, $v$ can be expressed as $v = (v \cdot e_1)e_1 + (v \cdot e_2)e_2 + \ldots + (v \cdot e_n)e_n$
    • For example, if $v = (1, 2, 3)$ and the orthonormal basis is $\{(\frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}), (\frac{1}{\sqrt{2}}, -\frac{1}{\sqrt{2}}, 0), (\frac{1}{\sqrt{6}}, \frac{1}{\sqrt{6}}, -\frac{2}{\sqrt{6}})\}$, then $v \cdot e_1 = \frac{6}{\sqrt{3}} = 2\sqrt{3}$, $v \cdot e_2 = -\frac{1}{\sqrt{2}}$, and $v \cdot e_3 = -\frac{3}{\sqrt{6}}$, so $v = 2\sqrt{3}\,e_1 - \frac{1}{\sqrt{2}}\,e_2 - \frac{3}{\sqrt{6}}\,e_3$ (computed numerically in the sketch below)
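
With the basis vectors stacked as the rows of a matrix, all of the coefficients fall out of a single matrix-vector product; a short sketch of the example above, assuming NumPy:

```python
import numpy as np

# Rows of E are the orthonormal basis vectors e1, e2, e3 from the example.
s2, s3, s6 = 1/np.sqrt(2), 1/np.sqrt(3), 1/np.sqrt(6)
E = np.array([
    [s3,  s3,  s3],
    [s2, -s2,  0.0],
    [s6,  s6, -2*s6],
])
v = np.array([1.0, 2.0, 3.0])

coeffs = E @ v                     # Fourier coefficients c_i = v . e_i
print(coeffs)                      # [2*sqrt(3), -1/sqrt(2), -3/sqrt(6)] numerically
print(np.allclose(coeffs @ E, v))  # reconstructing sum_i c_i e_i gives v back
```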

Fourier Coefficients and Applications

  • The coefficients $(v \cdot e_1), (v \cdot e_2), \ldots, (v \cdot e_n)$ are called the Fourier coefficients of $v$ with respect to the orthonormal basis
  • Expressing vectors as linear combinations of orthonormal basis vectors simplifies many computations and is fundamental to various applications
    • In signal processing, the Fourier coefficients represent the amplitudes of different frequency components of a signal
    • In quantum mechanics, the state of a quantum system can be described as a linear combination of orthonormal basis states, with the coefficients representing the probability amplitudes of each state
  • The Fourier coefficients provide a way to analyze and manipulate vectors in terms of their projections onto orthogonal directions, which can reveal important properties and relationships (one such identity is checked numerically below)
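
One such identity is Parseval's: with respect to an orthonormal basis, the squared norm of $v$ equals the sum of the squares of its Fourier coefficients. A quick numerical check with the example vector above (assuming NumPy):

```python
import numpy as np

s2, s3, s6 = 1/np.sqrt(2), 1/np.sqrt(3), 1/np.sqrt(6)
E = np.array([[s3, s3, s3], [s2, -s2, 0.0], [s6, s6, -2*s6]])
v = np.array([1.0, 2.0, 3.0])

coeffs = E @ v
# Parseval's identity: ||v||^2 equals the sum of squared Fourier coefficients.
print(np.dot(v, v))       # 14.0
print(np.sum(coeffs**2))  # 14.0
```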