Abstract Linear Algebra I: Unit 8 Review

8.4 Orthogonal Matrices and Their Properties

Written by the Fiveable Content Team • Last updated August 2025
Orthogonal matrices are special square matrices with columns that form an orthonormal set. They're key players in linear algebra, popping up in rotations, reflections, and other transformations that keep distances and angles intact.

These matrices have cool properties like their transpose being their inverse. They're super useful in simplifying calculations, finding orthonormal bases, and breaking down matrices in important ways like QR decomposition and SVD.

Orthogonal Matrices

Definition and Properties

  • An orthogonal matrix is a square matrix $Q$ whose transpose equals its inverse ($Q^T = Q^{-1}$)
  • The columns and rows of an orthogonal matrix form orthonormal sets
    • They are unit vectors (length 1) that are orthogonal (perpendicular) to each other
  • The determinant of an orthogonal matrix is either 1 or -1
  • The product of two orthogonal matrices is also an orthogonal matrix
  • Orthogonal matrices preserve the length of vectors and the angle between vectors under transformation
    • Rotations and reflections are examples of transformations that can be represented by orthogonal matrices
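The length- and angle-preservation property is easy to verify numerically. Here is a minimal NumPy sketch (the angle and test vectors are arbitrary choices, not from the text) showing that an orthogonal rotation matrix leaves norms and dot products unchanged:

```python
import numpy as np

# A 2x2 rotation matrix (orthogonal) for an arbitrary angle
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

u = np.array([3.0, 4.0])
v = np.array([-1.0, 2.0])

# Lengths are preserved: ||Qu|| = ||u||
print(np.linalg.norm(Q @ u), np.linalg.norm(u))  # both ≈ 5.0

# Dot products (hence angles) are preserved: (Qu)·(Qv) = u·v
print(np.dot(Q @ u, Q @ v), np.dot(u, v))        # both ≈ 5.0
```

Since the dot product determines both lengths and angles, preserving it is exactly what makes the transformation an isometry.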

Examples

  • The 2x2 rotation matrix $\begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$ is orthogonal for any angle $\theta$
  • The 2x2 reflection matrix $\begin{bmatrix} \cos\theta & \sin\theta \\ \sin\theta & -\cos\theta \end{bmatrix}$ is orthogonal for any angle $\theta$
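A short NumPy sketch (the helper names `rotation` and `reflection` are ours, not standard API) confirming both families are orthogonal for several angles, and that the determinant distinguishes them: +1 for rotations, -1 for reflections:

```python
import numpy as np

def rotation(theta):
    """2x2 rotation matrix for angle theta (radians)."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

def reflection(theta):
    """2x2 reflection matrix for angle theta (radians)."""
    return np.array([[np.cos(theta),  np.sin(theta)],
                     [np.sin(theta), -np.cos(theta)]])

for theta in (0.0, np.pi / 6, 1.2):
    R, F = rotation(theta), reflection(theta)
    # Both satisfy Q^T Q = I ...
    assert np.allclose(R.T @ R, np.eye(2))
    assert np.allclose(F.T @ F, np.eye(2))
    # ... but rotations have det +1 and reflections det -1
    assert np.isclose(np.linalg.det(R), 1.0)
    assert np.isclose(np.linalg.det(F), -1.0)
```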

Identifying Orthogonal Matrices


Checking Orthogonality

  • To check if a matrix $Q$ is orthogonal, multiply it by its transpose ($Q^T$) and see if the result is the identity matrix
    • If $QQ^T = Q^TQ = I$, then $Q$ is orthogonal
  • Verify that the columns and rows of the matrix form orthonormal sets
    • Check if they are unit vectors (length 1) and if they are orthogonal (dot product of distinct vectors is 0)
  • Calculate the determinant of the matrix and confirm that it is either 1 or -1
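The first check above translates directly into a few lines of NumPy. This is a minimal sketch (the function name `is_orthogonal` is our own) that tests $Q^TQ = I$ up to floating-point tolerance:

```python
import numpy as np

def is_orthogonal(Q, tol=1e-10):
    """Check whether a square matrix Q satisfies Q^T Q = I."""
    Q = np.asarray(Q, dtype=float)
    if Q.ndim != 2 or Q.shape[0] != Q.shape[1]:
        return False  # orthogonal matrices must be square
    return np.allclose(Q.T @ Q, np.eye(Q.shape[0]), atol=tol)

s = 1 / np.sqrt(2)
print(is_orthogonal([[s, -s], [s, s]]))   # True  (rotation by 45 degrees)
print(is_orthogonal([[1, 0], [0, -1]]))   # True  (reflection)
print(is_orthogonal([[1, 1], [0, 1]]))    # False (a shear is not orthogonal)
```

Note that $Q^TQ = I$ already implies $\det Q = \pm 1$ and orthonormal columns, so in code one check suffices.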

Examples

  • The matrix $\begin{bmatrix} \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \end{bmatrix}$ is orthogonal because $QQ^T = Q^TQ = I$, its columns and rows form orthonormal sets, and its determinant is 1
  • The matrix $\begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}$ is orthogonal because $QQ^T = Q^TQ = I$, its columns and rows form orthonormal sets, and its determinant is -1

Orthonormal Basis of Orthogonal Matrices


Proving Orthonormality

  • Let $Q$ be an orthogonal matrix with columns $q_1, q_2, \ldots, q_n$
  • Show that the columns are unit vectors by proving that the dot product of each column with itself equals 1 ($q_i \cdot q_i = 1$ for all $i$)
  • Prove that the columns are orthogonal to each other by showing that the dot product of any two distinct columns equals 0 ($q_i \cdot q_j = 0$ for all $i \neq j$)
  • Conclude that the columns of $Q$ form an orthonormal set, which is a basis for the vector space since $Q$ is a square matrix
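The column-by-column argument above can be mirrored numerically: compute $q_i \cdot q_j$ for every pair and compare against 1 (when $i = j$) or 0 (when $i \neq j$). A sketch (the helper name `columns_orthonormal` is ours):

```python
import numpy as np

def columns_orthonormal(Q, tol=1e-10):
    """Check q_i . q_i = 1 and q_i . q_j = 0 for i != j, column by column."""
    Q = np.asarray(Q, dtype=float)
    n = Q.shape[1]
    for i in range(n):
        for j in range(n):
            expected = 1.0 if i == j else 0.0
            if abs(np.dot(Q[:, i], Q[:, j]) - expected) > tol:
                return False
    return True

s = 1 / np.sqrt(2)
print(columns_orthonormal([[s, -s], [s, s]]))  # True
```

Checking all these dot products at once is exactly computing $Q^TQ$ entrywise, which is why the single matrix test $Q^TQ = I$ is equivalent.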

Examples

  • For the orthogonal matrix $\begin{bmatrix} \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \end{bmatrix}$, its columns $q_1 = \begin{bmatrix} \frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} \end{bmatrix}$ and $q_2 = \begin{bmatrix} -\frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} \end{bmatrix}$ form an orthonormal basis for $\mathbb{R}^2$
  • The standard basis vectors $e_1 = \begin{bmatrix} 1 \\ 0 \end{bmatrix}$ and $e_2 = \begin{bmatrix} 0 \\ 1 \end{bmatrix}$ form an orthonormal basis for $\mathbb{R}^2$ and can be seen as the columns of the orthogonal identity matrix $I_2$

Applications of Orthogonal Matrices

Transformations and Decompositions

  • Use orthogonal matrices to perform rotations, reflections, and other isometric transformations in Euclidean spaces
    • Isometric transformations preserve distances and angles between vectors
  • Apply orthogonal matrices to solve systems of linear equations by simplifying the problem through orthogonal transformations
  • Utilize orthogonal matrices in the QR decomposition (factorization) of a matrix
    • $A = QR$, where $Q$ is an orthogonal matrix and $R$ is an upper triangular matrix
  • Employ orthogonal matrices in the singular value decomposition (SVD) of a matrix
    • $A = U\Sigma V^T$, where $U$ and $V$ are orthogonal matrices and $\Sigma$ is a diagonal matrix of singular values
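Both decompositions are available in NumPy, so the orthogonality of the factors can be checked directly. A quick sketch (the matrix $A$ is an arbitrary example of ours; `np.linalg.qr` returns the reduced factorization by default, so for a tall $A$ the factor $Q$ has orthonormal columns rather than being square):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# QR decomposition: Q has orthonormal columns, R is upper triangular
Q, R = np.linalg.qr(A)
assert np.allclose(Q.T @ Q, np.eye(2))      # orthonormal columns
assert np.allclose(Q @ R, A)                # A = QR

# SVD: U and V have orthonormal columns, Sigma holds the singular values
U, s, Vt = np.linalg.svd(A, full_matrices=False)
assert np.allclose(U.T @ U, np.eye(2))
assert np.allclose(Vt @ Vt.T, np.eye(2))
assert np.allclose(U @ np.diag(s) @ Vt, A)  # A = U Sigma V^T
```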

Orthonormal Bases for Subspaces

  • Use orthogonal matrices to find orthonormal bases for subspaces, such as the column space and null space of a matrix
    • The columns of an orthogonal matrix can form an orthonormal basis for a subspace
  • Orthonormal bases simplify calculations and provide a convenient representation for vectors in a subspace
  • The Gram-Schmidt process can be used to construct an orthonormal basis from a given set of linearly independent vectors
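The Gram-Schmidt process can be sketched in a few lines of NumPy: subtract from each vector its projections onto the basis vectors found so far, then normalize. This is the classical (not the numerically preferred modified) variant, and the function name and input vectors are our own illustration:

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: orthonormalize a list of linearly
    independent vectors, returned as the columns of a matrix."""
    basis = []
    for v in vectors:
        w = np.asarray(v, dtype=float)
        # Subtract the projection onto each basis vector found so far
        for q in basis:
            w = w - np.dot(q, w) * q
        basis.append(w / np.linalg.norm(w))
    return np.column_stack(basis)

Q = gram_schmidt([[1, 1, 0], [1, 0, 1], [0, 1, 1]])
# The columns of Q form an orthonormal set: Q^T Q = I
assert np.allclose(Q.T @ Q, np.eye(3))
```

Stacking the resulting vectors as columns produces exactly an orthogonal matrix, which ties this construction back to the QR decomposition above.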