Orthogonality and orthonormal bases are key concepts in inner product spaces. They help us understand how vectors relate to each other and provide powerful tools for simplifying calculations and solving problems.

These ideas are crucial for many applications in math and science. Orthonormal bases make it easy to express vectors as linear combinations, while orthogonal projections help us find the closest point in a subspace to a given vector.

Orthogonality and Orthonormality

Defining Orthogonality and Orthonormality

  • Two vectors are orthogonal if their inner product (or dot product) is equal to zero
    • For example, the vectors $v = (1, 0)$ and $w = (0, 1)$ are orthogonal because $v \cdot w = 1 \times 0 + 0 \times 1 = 0$
  • A set of vectors is orthogonal if every pair of distinct vectors in the set is orthogonal
    • An example of an orthogonal set is $\{(1, 0, 0), (0, 1, 0), (0, 0, 1)\}$ in $\mathbb{R}^3$
  • A set of vectors is orthonormal if it is orthogonal and each vector has a norm (length) of 1
    • The set $\{(\frac{1}{\sqrt{2}}, \frac{1}{\sqrt{2}}), (\frac{1}{\sqrt{2}}, -\frac{1}{\sqrt{2}})\}$ is orthonormal because the vectors are orthogonal and have unit length
  • The standard basis vectors (e.g., $i, j, k$ in $\mathbb{R}^3$) form an orthonormal set
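These definitions are easy to check numerically. Below is a minimal NumPy sketch (NumPy itself and the variable names are illustrative choices, not part of the original notes) that verifies the vectors above are orthogonal and that the second pair is orthonormal.

```python
import numpy as np

v = np.array([1.0, 0.0])
w = np.array([0.0, 1.0])
print(np.dot(v, w))                          # 0.0 -> v and w are orthogonal

# The set {(1/sqrt(2), 1/sqrt(2)), (1/sqrt(2), -1/sqrt(2))} from above
a = np.array([1.0, 1.0]) / np.sqrt(2)
b = np.array([1.0, -1.0]) / np.sqrt(2)
print(np.dot(a, b))                          # 0.0 -> orthogonal
print(np.linalg.norm(a), np.linalg.norm(b))  # 1.0, 1.0 -> orthonormal
```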

Properties of Orthogonal and Orthonormal Sets

  • Orthogonal sets of nonzero vectors (and hence orthonormal sets) are linearly independent
    • This means that no vector in the set can be expressed as a linear combination of the other vectors
  • Orthonormal sets have the property that the inner product of any vector with itself is 1, and the inner product of any two distinct vectors is 0
    • For an orthonormal set $\{e_1, e_2, \ldots, e_n\}$, we have $e_i \cdot e_j = \begin{cases} 1, & i = j \\ 0, & i \neq j \end{cases}$
  • Orthonormal bases are particularly useful because they simplify many calculations and have desirable geometric properties
    • For example, the Euclidean distance between two vectors can be easily computed using their coefficients with respect to an orthonormal basis
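As a concrete illustration of these properties, the following NumPy sketch (the particular test vectors are illustrative assumptions) stacks the orthonormal set from the previous section as the rows of a matrix, confirms that the Gram matrix is the identity, and shows that distances computed from coefficients match distances computed directly.

```python
import numpy as np

# Rows of E are the orthonormal set {(1/sqrt(2), 1/sqrt(2)), (1/sqrt(2), -1/sqrt(2))}
E = np.vstack([
    np.array([1.0, 1.0]) / np.sqrt(2),
    np.array([1.0, -1.0]) / np.sqrt(2),
])
# e_i . e_j is 1 when i == j and 0 otherwise, so the Gram matrix E E^T is the identity
print(np.round(E @ E.T, 10))

# Distances are preserved: ||v - w|| equals the distance between coefficient vectors E v and E w
v, w = np.array([3.0, 1.0]), np.array([0.0, 2.0])
print(np.linalg.norm(v - w), np.linalg.norm(E @ v - E @ w))   # both equal sqrt(10)
```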

Gram-Schmidt Process

Algorithm for Constructing Orthonormal Bases

  • The Gram-Schmidt process is an algorithm for constructing an orthonormal basis from a linearly independent set of vectors
  • The process involves iteratively projecting each vector onto the orthogonal complement of the subspace spanned by the previous orthonormal vectors and normalizing the result
    1. The first vector in the orthonormal basis is obtained by normalizing the first vector in the original set: $u_1 = \frac{v_1}{||v_1||}$
    2. Each subsequent vector in the orthonormal basis is obtained by subtracting its projection onto the subspace spanned by the previous orthonormal vectors and then normalizing the result: $u_k = \frac{v_k - \sum_{i=1}^{k-1} (v_k \cdot u_i) u_i}{||v_k - \sum_{i=1}^{k-1} (v_k \cdot u_i) u_i||}$
  • The resulting set of vectors $\{u_1, u_2, \ldots, u_n\}$ forms an orthonormal basis for the same subspace as the original linearly independent set $\{v_1, v_2, \ldots, v_n\}$
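One plausible NumPy implementation of this procedure is sketched below; the function name gram_schmidt and the tolerance used to detect linear dependence are illustrative choices, not a standard library API.

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a linearly independent list of vectors (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        # Subtract the projection of v onto each previously built orthonormal vector
        w = v - sum(np.dot(v, u) * u for u in basis)
        norm = np.linalg.norm(w)
        if norm < 1e-12:
            raise ValueError("input vectors are not linearly independent")
        basis.append(w / norm)
    return basis

# Example: an orthonormal basis for the span of two vectors in R^3
u1, u2 = gram_schmidt([np.array([1.0, 1.0, 1.0]), np.array([1.0, 0.0, 2.0])])
print(np.dot(u1, u2), np.linalg.norm(u1), np.linalg.norm(u2))   # ~0.0, 1.0, 1.0
```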

Example of Gram-Schmidt Process

  • Consider the linearly independent set $\{(1, 1, 1), (1, 0, 2), (1, 2, 3)\}$ in $\mathbb{R}^3$
  • Applying the Gram-Schmidt process:
    1. $u_1 = \frac{(1, 1, 1)}{||(1, 1, 1)||} = (\frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}})$
    2. $(1, 0, 2) \cdot u_1 = \sqrt{3}$, so $(1, 0, 2) - \sqrt{3}\, u_1 = (0, -1, 1)$ and $u_2 = \frac{(0, -1, 1)}{||(0, -1, 1)||} = (0, -\frac{1}{\sqrt{2}}, \frac{1}{\sqrt{2}})$
    3. $(1, 2, 3) \cdot u_1 = 2\sqrt{3}$ and $(1, 2, 3) \cdot u_2 = \frac{1}{\sqrt{2}}$, so $(1, 2, 3) - 2\sqrt{3}\, u_1 - \frac{1}{\sqrt{2}}\, u_2 = (-1, \frac{1}{2}, \frac{1}{2})$ and $u_3 = (-\frac{2}{\sqrt{6}}, \frac{1}{\sqrt{6}}, \frac{1}{\sqrt{6}})$
  • The resulting orthonormal basis is $\{(\frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}), (0, -\frac{1}{\sqrt{2}}, \frac{1}{\sqrt{2}}), (-\frac{2}{\sqrt{6}}, \frac{1}{\sqrt{6}}, \frac{1}{\sqrt{6}})\}$
  • Note that the process requires linear independence: for a dependent set such as $\{(1, 1, 1), (1, 0, 2), (1, 2, 0)\}$, where $(1, 0, 2) + (1, 2, 0) = 2(1, 1, 1)$, the third step would produce the zero vector, which cannot be normalized
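The arithmetic above can be double-checked numerically; the following NumPy sketch (an illustrative verification, not part of the original notes) confirms that the three vectors form an orthonormal set and that each original vector is recovered from its coefficients in the new basis.

```python
import numpy as np

u1 = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)
u2 = np.array([0.0, -1.0, 1.0]) / np.sqrt(2)
u3 = np.array([-2.0, 1.0, 1.0]) / np.sqrt(6)
U = np.vstack([u1, u2, u3])
print(np.round(U @ U.T, 10))     # identity matrix -> the basis is orthonormal

# Reconstruct each original vector from its coefficients in the new basis
for v in ([1, 1, 1], [1, 0, 2], [1, 2, 3]):
    v = np.array(v, dtype=float)
    print(np.allclose(U.T @ (U @ v), v))   # True
```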

Uniqueness of Orthogonal Projections

Definition and Formula for Orthogonal Projections

  • The orthogonal projection of a vector onto a subspace is the closest point in the subspace to the vector, where closeness is measured by the Euclidean distance
  • The orthogonal projection of a vector $v$ onto a subspace $W$ can be expressed as the sum of its projections onto the vectors of an orthogonal (or orthonormal) basis of $W$
  • The projection of $v$ onto a basis vector $w$ is given by the formula: $\mathrm{proj}_w(v) = \frac{v \cdot w}{||w||^2} w$, where $\cdot$ denotes the inner product and $||w||$ is the norm of $w$
    • For example, if $v = (1, 2, 3)$ and $w = (1, 0, 1)$, then $\mathrm{proj}_w(v) = \frac{(1, 2, 3) \cdot (1, 0, 1)}{||(1, 0, 1)||^2} (1, 0, 1) = (2, 0, 2)$
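A short NumPy sketch of this formula follows (the helper name project_onto is an illustrative choice); it reproduces the worked example and checks that the residual is orthogonal to w.

```python
import numpy as np

def project_onto(v, w):
    """Orthogonal projection of v onto the line spanned by w: (v.w / ||w||^2) w."""
    return (np.dot(v, w) / np.dot(w, w)) * w

v = np.array([1.0, 2.0, 3.0])
w = np.array([1.0, 0.0, 1.0])
print(project_onto(v, w))                   # [2. 0. 2.], matching the example above
print(np.dot(v - project_onto(v, w), w))    # 0.0 -> the residual is orthogonal to w
```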

Proof of Uniqueness

  • To prove uniqueness, suppose there are two orthogonal projections of $v$ onto $W$, denoted by $p_1$ and $p_2$, so that $v - p_1$ and $v - p_2$ are each orthogonal to every vector in $W$
  • The difference vector $d = p_1 - p_2 = (v - p_2) - (v - p_1)$ is then also orthogonal to every vector in $W$
    • At the same time, $p_1$ and $p_2$ both lie in $W$, so their difference $d$ lies in $W$; in particular, $d$ must be orthogonal to itself
  • Since $d$ is orthogonal to itself, we have $d \cdot d = 0$, which implies that $d = 0$ and $p_1 = p_2$
  • Therefore, the orthogonal projection of a vector onto a subspace is unique

Linear Combinations of Orthonormal Bases

Expressing Vectors as Linear Combinations

  • Any vector in a vector space can be expressed as a linear combination of the basis vectors for that space
  • When the basis vectors form an orthonormal set, the coefficients in the linear combination are given by the inner products of the vector with each basis vector
    • For a vector $v$ and an orthonormal basis $\{e_1, e_2, \ldots, e_n\}$, $v$ can be expressed as: $v = (v \cdot e_1)e_1 + (v \cdot e_2)e_2 + \ldots + (v \cdot e_n)e_n$
    • For example, if $v = (1, 2, 3)$ and the orthonormal basis is $\{(\frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}), (\frac{1}{\sqrt{2}}, -\frac{1}{\sqrt{2}}, 0), (\frac{1}{\sqrt{6}}, \frac{1}{\sqrt{6}}, -\frac{2}{\sqrt{6}})\}$, then $v \cdot e_1 = 2\sqrt{3}$, $v \cdot e_2 = -\frac{1}{\sqrt{2}}$, and $v \cdot e_3 = -\frac{3}{\sqrt{6}}$, so $v = 2\sqrt{3}\,(\frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}) - \frac{1}{\sqrt{2}}\,(\frac{1}{\sqrt{2}}, -\frac{1}{\sqrt{2}}, 0) - \frac{3}{\sqrt{6}}\,(\frac{1}{\sqrt{6}}, \frac{1}{\sqrt{6}}, -\frac{2}{\sqrt{6}})$
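The same expansion can be checked numerically. In the sketch below (an illustrative verification, not part of the original notes), the basis vectors are stacked as the rows of a matrix E, so the coefficients are simply E @ v and the expansion is the corresponding weighted sum of the rows.

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])
E = np.vstack([
    np.array([1.0, 1.0, 1.0]) / np.sqrt(3),
    np.array([1.0, -1.0, 0.0]) / np.sqrt(2),
    np.array([1.0, 1.0, -2.0]) / np.sqrt(6),
])
coeffs = E @ v                 # the inner products (v . e_i)
print(np.round(coeffs, 4))     # [ 3.4641 -0.7071 -1.2247] = [2*sqrt(3), -1/sqrt(2), -3/sqrt(6)]
print(coeffs @ E)              # sum_i (v . e_i) e_i reconstructs [1. 2. 3.]
```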

Fourier Coefficients and Applications

  • The coefficients $(v \cdot e_1), (v \cdot e_2), \ldots, (v \cdot e_n)$ are called the Fourier coefficients of $v$ with respect to the orthonormal basis
  • Expressing vectors as linear combinations of orthonormal basis vectors simplifies many computations and is fundamental to various applications
    • In signal processing, the Fourier coefficients represent the amplitudes of different frequency components of a signal
    • In quantum mechanics, the state of a quantum system can be described as a linear combination of orthonormal basis states, with the coefficients representing the probability amplitudes of each state
  • The Fourier coefficients provide a way to analyze and manipulate vectors in terms of their projections onto orthogonal directions, which can reveal important properties and relationships

Key Terms to Review (17)

⟨u, v⟩: The notation ⟨u, v⟩ represents the inner product or dot product of two vectors u and v in a vector space. This operation measures the degree of similarity and orthogonality between the vectors, producing a scalar value that reflects their relationship. A key aspect of this concept is that when two vectors are orthogonal, their inner product is zero, indicating that they are at right angles to each other in the space.
||v||: The notation ||v|| represents the norm or length of a vector v in a vector space. This concept measures the distance from the origin to the point represented by the vector, providing a way to quantify the size or magnitude of the vector. Understanding this term is essential for grasping concepts related to orthogonality and the properties of orthonormal bases, as norms are fundamental in defining angles and distances between vectors.
Bessel's Inequality: Bessel's Inequality is a fundamental result in the theory of inner product spaces that provides an important bound on the coefficients when expressing a vector in terms of an orthonormal basis. Specifically, it states that for any vector in an inner product space, the sum of the squares of the coefficients corresponding to its projections onto an orthonormal basis does not exceed the square of the norm of the vector itself. This inequality emphasizes the significance of orthonormal bases and helps establish their utility in representing vectors within these spaces.
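As a quick illustration (the particular vectors are arbitrary assumptions, not from the original notes), the inequality can be checked numerically for two orthonormal vectors in R^3 that do not form a complete basis:

```python
import numpy as np

# Two orthonormal vectors in R^3 (not a full basis) and an arbitrary vector v
e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])
v = np.array([1.0, 2.0, 3.0])
lhs = np.dot(v, e1)**2 + np.dot(v, e2)**2   # sum of squared coefficients = 5
rhs = np.dot(v, v)                           # ||v||^2 = 14
print(lhs <= rhs)                            # True, as Bessel's inequality guarantees
```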
Dot product: The dot product is a mathematical operation that takes two equal-length sequences of numbers (usually coordinate vectors) and returns a single number. It is calculated as the sum of the products of the corresponding entries of the two sequences. This operation is fundamental in defining the concept of inner products, which possess several important properties such as linearity, symmetry, and positive definiteness, and also plays a key role in determining orthogonality and the formation of orthonormal bases.
Euclidean Space: Euclidean space is a mathematical construct that provides a framework for understanding geometric relationships in two or more dimensions, characterized by the familiar concepts of points, lines, and planes. It serves as the foundation for vector spaces, allowing us to perform operations such as addition and scalar multiplication while maintaining the essential geometric properties. This space is integral to understanding linear combinations, independence, finite dimensions, and concepts of orthogonality, forming a cornerstone in many areas of mathematics.
Gram-Schmidt Process: The Gram-Schmidt Process is a method for orthonormalizing a set of vectors in an inner product space, creating an orthogonal or orthonormal basis from a linearly independent set of vectors. This process is essential for simplifying problems in linear algebra, especially when dealing with orthogonality, orthogonal matrices, and decompositions like QR decomposition.
Henri Poincaré: Henri Poincaré was a French mathematician, theoretical physicist, and philosopher of science known for his foundational work in topology and dynamical systems. His contributions significantly advanced the understanding of geometric concepts, especially regarding orthogonality and orthonormal bases, which are essential for working in vector spaces and linear algebra.
Inner Product Space: An inner product space is a vector space equipped with an inner product, which is a mathematical operation that takes two vectors and returns a scalar, satisfying specific properties like positivity, linearity, and symmetry. This concept connects to various essential aspects such as the measurement of angles and lengths in the space, which leads to discussions on orthogonality, bases, and projections that are critical in advanced linear algebra.
John von Neumann: John von Neumann was a Hungarian-American mathematician, physicist, and computer scientist, renowned for his foundational contributions to various fields including game theory, quantum mechanics, and the development of digital computers. His work laid the groundwork for modern computing and has strong implications in areas like optimization and linear programming.
Least Squares Approximation: Least squares approximation is a mathematical method used to find the best-fitting line or curve for a set of data points by minimizing the sum of the squares of the differences between the observed values and the values predicted by the model. This technique relies on inner product spaces to determine distances, utilizes orthogonal projections to compute the closest approximation in a linear sense, and can be enhanced using processes like Gram-Schmidt for orthonormal bases, ultimately facilitating efficient QR decomposition for solving systems of equations.
Linear Independence: Linear independence refers to a set of vectors in a vector space that cannot be expressed as a linear combination of each other. This concept is crucial for understanding the structure of vector spaces, as it indicates how vectors can span a space without redundancy, leading to an understanding of dimensions, bases, and orthogonality.
Orthonormal Sets: Orthonormal sets are collections of vectors that are both orthogonal and normalized, meaning each vector is perpendicular to every other vector in the set and each vector has a length of one. This property simplifies many operations in linear algebra, such as projections and transformations, allowing for easier computations and clearer geometric interpretations.
Projections: Projections are linear transformations that map a vector onto a subspace, resulting in a new vector that represents the closest point in that subspace. This concept is essential in understanding how vectors can be represented in terms of an orthonormal basis, allowing for easier calculations and simplifications in various mathematical contexts. Projections are often used in applications like least squares fitting and optimization problems, where minimizing distance to a subspace is crucial.
Pythagorean Theorem: The Pythagorean Theorem is a fundamental principle in geometry that establishes a relationship between the lengths of the sides of a right triangle. It states that in a right triangle, the square of the length of the hypotenuse (the side opposite the right angle) is equal to the sum of the squares of the lengths of the other two sides. This theorem is crucial for understanding concepts like orthogonality, as it provides a way to measure distances in Euclidean space and forms the basis for orthonormal bases.
QR Decomposition: QR decomposition is a method in linear algebra where a matrix is factored into a product of an orthogonal matrix and an upper triangular matrix. This decomposition allows for efficient solutions to linear systems and least squares problems, and connects closely to concepts of orthogonality, orthonormal bases, and the Gram-Schmidt process.
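For illustration, NumPy's built-in np.linalg.qr computes this factorization; in the sketch below (an illustrative choice of input), the columns of A are the vectors from the Gram-Schmidt example above, and the columns of Q agree with the orthonormal basis found there up to sign.

```python
import numpy as np

A = np.column_stack([[1, 1, 1], [1, 0, 2], [1, 2, 3]])   # columns are the example vectors
Q, R = np.linalg.qr(A)
print(np.round(Q, 4))                 # columns form an orthonormal basis (up to sign)
print(np.round(R, 4))                 # upper triangular factor
print(np.allclose(Q @ R, A))          # True: A = QR
```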
Span: Span is the set of all possible linear combinations of a given set of vectors in a vector space. It helps define the extent to which a set of vectors can cover or represent other vectors within that space, playing a crucial role in understanding subspaces and dimensionality.
Zero Vector: The zero vector is a unique vector in a vector space that has all of its components equal to zero. It serves as the additive identity in vector spaces, meaning that when it is added to any other vector, the result is the other vector itself. The zero vector is crucial in understanding properties like closure and existence of additive inverses within a vector space, as well as in concepts involving orthogonality and the formation of orthonormal bases.