Singular value decomposition (SVD) is a powerful matrix factorization technique that extends eigendecomposition to any matrix. It breaks down matrices into simpler components, revealing crucial structural information and enabling various applications in data analysis and engineering.

SVD connects to spectral theory by generalizing the spectral theorem to non-square and non-normal matrices. This versatility makes SVD a fundamental tool for understanding matrix properties, solving complex problems, and uncovering hidden patterns in data across diverse fields.

Singular Value Decomposition of Matrices

Definition and Structure

  • Singular value decomposition (SVD) factorizes real or complex matrices, generalizing eigendecomposition to any m × n matrix
  • For an m × n matrix A, there exist orthogonal matrices U and V (unitary, in the complex case) and a diagonal matrix Σ such that A = UΣV^T
    • U dimensions m × m
    • Σ dimensions m × n
    • V^T dimensions n × n
  • Diagonal entries σ_i of Σ called singular values, arranged in descending order
  • Columns of U termed left singular vectors
  • Columns of V termed right singular vectors
  • SVD exists for all matrices (square, rectangular, invertible, non-invertible)
  • Number of non-zero singular values equals matrix A's rank
  • SVD decomposes a matrix into simpler factors, revealing structural information (low-rank approximations, matrix norms); a short numerical check follows this list
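
A minimal sketch of the factorization in NumPy (the library choice and the matrix entries are assumptions for illustration); it verifies the reconstruction A = UΣV^T and counts the non-zero singular values to recover the rank:

```python
import numpy as np

A = np.array([[3.0, 2.0, 2.0],
              [2.0, 3.0, -2.0]])          # a 2 x 3 rectangular matrix

U, s, Vt = np.linalg.svd(A)               # s holds the singular values, descending
Sigma = np.zeros(A.shape)                 # assemble the 2 x 3 diagonal factor
Sigma[:len(s), :len(s)] = np.diag(s)

print(np.allclose(A, U @ Sigma @ Vt))     # True: the factorization reconstructs A
print(np.sum(s > 1e-10))                  # 2: rank = number of non-zero singular values
```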

Mathematical Properties

  • SVD always exists, providing a universal matrix decomposition method
  • Uniqueness of SVD components
    • Singular values are unique
    • Singular vectors for distinct non-zero singular values are unique up to sign (or a unit phase factor, in the complex case)
    • Singular vectors for a repeated singular value are not unique: any orthonormal basis of the associated subspace may be chosen
  • Relationship to other matrix properties
    • Frobenius norm: $\|A\|_F = \sqrt{\sum_{i=1}^{\min(m,n)} \sigma_i^2}$
    • Nuclear norm: $\|A\|_* = \sum_{i=1}^{\min(m,n)} \sigma_i$
    • Spectral norm: $\|A\|_2 = \sigma_1$ (the largest singular value)
  • Connection to matrix rank
    • Rank(A) = number of non-zero singular values
    • Low-rank approximation obtained by truncating the smaller singular values (see the norm checks sketched below)
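
A short sketch (NumPy assumed; the matrix is arbitrary test data) checking the three norm identities above against NumPy's built-in matrix norms:

```python
import numpy as np

A = np.random.default_rng(0).normal(size=(4, 3))   # arbitrary test matrix
s = np.linalg.svd(A, compute_uv=False)             # singular values only

print(np.isclose(np.sqrt(np.sum(s**2)), np.linalg.norm(A, 'fro')))  # Frobenius norm
print(np.isclose(np.sum(s), np.linalg.norm(A, 'nuc')))              # nuclear norm
print(np.isclose(s[0], np.linalg.norm(A, 2)))                       # spectral norm
```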

Components of the SVD

Computation and Interpretation

  • Calculate SVD by finding eigenvectors and eigenvalues of A^T A and AA^T
  • Right singular vectors (V columns) derived from A^T A eigenvectors
  • Left singular vectors (U columns) derived from AA^T eigenvectors
  • Singular values σ_i calculated as square roots of the eigenvalues of A^T A (the same as the non-zero eigenvalues of AA^T)
  • Construct Σ by placing singular values on diagonal in descending order
  • Non-zero singular values determine matrix's effective rank
  • Interpret V^T as a rotation or reflection of the domain space (input space)
  • Interpret U as a rotation or reflection in the range space (output space)
  • Singular values in Σ act as scaling factors between the two bases (a worked construction follows this list)
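
A sketch of the textbook construction (NumPy assumed; the matrix reuses the earlier illustration data): right singular vectors from the eigenvectors of A^T A, singular values from the square roots of its eigenvalues, and left singular vectors via u_i = A v_i / σ_i:

```python
import numpy as np

A = np.array([[3.0, 2.0, 2.0],
              [2.0, 3.0, -2.0]])

# Symmetric eigendecomposition of A^T A gives the right singular vectors
eigvals, V = np.linalg.eigh(A.T @ A)
order = np.argsort(eigvals)[::-1]              # sort eigenpairs in descending order
eigvals, V = eigvals[order], V[:, order]

sigma = np.sqrt(np.clip(eigvals, 0.0, None))   # clip guards against tiny negative round-off
r = int(np.sum(sigma > 1e-10))                 # effective rank
U = (A @ V[:, :r]) / sigma[:r]                 # left singular vectors: u_i = A v_i / sigma_i

print(sigma[:r])                                           # [5. 3.]
print(np.allclose(A, U @ np.diag(sigma[:r]) @ V[:, :r].T)) # True (thin SVD)
```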

Geometric Interpretation

  • SVD transforms the unit sphere in the input space into an ellipsoid in the output space
  • Principal axes of the ellipsoid align with the left singular vectors (columns of U)
  • Lengths of the semi-axes equal the singular values
  • Right singular vectors give the input directions that map onto those axes
  • Visualize SVD transformation
    • Start with unit circle (2D) or sphere (3D)
    • Apply V^T rotation
    • Scale by singular values
    • Apply U rotation
  • Null space and range relationships
    • Right singular vectors with zero singular values span null space of A
    • Left singular vectors with non-zero singular values span the range of A (the sketch below traces the rotate-scale-rotate steps numerically)
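
A sketch of the rotate-scale-rotate picture (NumPy assumed; the 2 × 2 matrix is arbitrary illustration data): points on the unit circle pass through V^T, then Σ, then U, landing on the same ellipse as applying A directly:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
U, s, Vt = np.linalg.svd(A)

theta = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
circle = np.vstack([np.cos(theta), np.sin(theta)])   # points on the unit circle

step1 = Vt @ circle          # rotate/reflect in the input space
step2 = np.diag(s) @ step1   # scale axis i by sigma_i
step3 = U @ step2            # rotate/reflect into the output space

print(np.allclose(step3, A @ circle))   # True: identical to applying A directly
```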

Applications of the SVD

Data Analysis and Dimensionality Reduction

  • Solve least-squares problems for non-invertible or overdetermined systems (Ax = b)
  • Compute pseudoinverse A^+ using SVD: A^+ = VΣ^+ U^T
    • Σ^+ obtained by transposing Σ and reciprocating its non-zero singular values
  • Perform dimensionality reduction by truncating decomposition
    • Retain most significant singular values and vectors
    • Approximate original matrix with lower-rank representation
  • Conduct Principal Component Analysis (PCA) using right singular vectors and singular values
    • Identify principal directions of variation in data
    • Project data onto lower-dimensional subspace
  • Apply to image compression
    • Discard less significant singular values and corresponding vectors
    • Reduce data size while preserving important features
  • Utilize in information retrieval and natural language processing
    • Perform latent semantic analysis
    • Uncover hidden relationships in data (topic modeling, document clustering)
  • Calculate matrix condition number
    • Ratio of largest to smallest singular value: $\kappa(A) = \sigma_1 / \sigma_{\min}$
    • Measures how sensitive solutions are to perturbations in the data (a combined sketch follows this list)
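
A combined sketch (NumPy assumed; all matrices are illustration data) of three items from this list: least squares via the pseudoinverse A^+ = VΣ^+U^T, a rank-k truncation as a low-rank approximation, and the condition number as the ratio of extreme singular values:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(6, 3))                        # overdetermined system
b = rng.normal(size=6)

U, s, Vt = np.linalg.svd(A, full_matrices=False)   # thin SVD

# (1) Pseudoinverse: reciprocate the non-zero singular values, transpose the shape
s_inv = np.where(s > 1e-10, 1.0 / s, 0.0)
A_pinv = Vt.T @ np.diag(s_inv) @ U.T
x = A_pinv @ b                                     # least-squares solution of Ax = b
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))   # True

# (2) Low-rank approximation: keep only the k largest singular values
k = 1
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
print(np.isclose(np.linalg.norm(A - A_k, 2), s[k]))   # spectral error = next singular value

# (3) Condition number: ratio of largest to smallest singular value
print(np.isclose(s[0] / s[-1], np.linalg.cond(A)))    # True
```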

Engineering and Scientific Applications

  • Signal processing
    • Filter noise from signals using truncated SVD
    • Separate mixed signals (blind source separation)
  • Control systems
    • Analyze system stability and controllability
    • Design optimal controllers using SVD-based methods
  • Computer vision
    • Perform facial recognition using eigenfaces (SVD of image data)
    • Estimate camera motion and 3D scene structure (structure from motion)
  • Bioinformatics
    • Analyze gene expression data
    • Identify patterns in protein structures
  • Financial modeling
    • Perform portfolio optimization
    • Analyze risk factors in financial data

SVD vs Spectral Theorem

Similarities and Differences

  • Spectral theorem applies to normal matrices ($A^*A = AA^*$)
  • SVD generalizes spectral theorem concept to any matrix
  • For normal matrices, SVD and eigendecomposition essentially coincide
    • Singular values equal the absolute values of the eigenvalues
    • Singular vectors can be chosen as eigenvectors, up to sign or phase factors that absorb negative or complex eigenvalues
  • Hermitian (self-adjoint) matrices
    • Singular values equal absolute values of eigenvalues
    • Singular vectors are eigenvectors
  • SVD generalizes spectral theorem to rectangular and non-normal square matrices
  • Spectral theorem describes a matrix through its invariant directions (eigenvectors)
  • SVD describes how a matrix maps one orthonormal basis onto another, capturing its action on the whole space (a numerical comparison follows this list)
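
A sketch comparing the two decompositions for a symmetric (hence normal) matrix (NumPy assumed; the matrix is arbitrary illustration data with one negative eigenvalue, to show the absolute-value relationship):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])                # symmetric, eigenvalues 3 and -1

eigvals = np.linalg.eigvalsh(A)           # eigenvalues of the symmetric matrix
s = np.linalg.svd(A, compute_uv=False)    # singular values, descending

print(np.allclose(np.sort(np.abs(eigvals))[::-1], s))   # True: s = |eigenvalues|
```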

Practical Implications

  • SVD provides more general matrix analysis tool
    • Applicable to wider range of matrices
    • Useful for non-square and non-normal matrices
  • SVD reveals information about matrix range and null space
    • Not directly apparent from eigendecomposition
  • Computational considerations
    • SVD generally more computationally expensive than eigendecomposition
    • Efficient algorithms exist for large, sparse matrices
  • Numerical stability
    • SVD typically more numerically stable than eigendecomposition
    • Useful for ill-conditioned matrices
  • Applications in different fields
    • Quantum mechanics: Spectral theorem for Hermitian operators
    • Data science: SVD for dimensionality reduction and feature extraction
© 2025 Fiveable Inc. All rights reserved.
AP® and SAT® are trademarks registered by the College Board, which is not affiliated with, and does not endorse this website.