Self-adjoint operators are linear operators equal to their adjoint, with real eigenvalues and orthogonal eigenspaces. They're crucial in quantum mechanics and data analysis. Their properties make them ideal for representing physical observables and analyzing complex datasets.
Hermitian matrices are the matrix representation of self-adjoint operators in finite-dimensional spaces. They share similar properties, including real eigenvalues and orthogonal eigenvectors. The spectral theorem allows for diagonalization, enabling efficient computation of matrix functions and applications in various fields.
Self-adjoint operators in inner product spaces
Definition and properties
- A self-adjoint operator is a linear operator equal to its adjoint operator
- For a linear operator T on an inner product space V, T is self-adjoint if ⟨Tv, w⟩ = ⟨v, Tw⟩ for all v, w ∈ V
- A self-adjoint operator defined on the entire space is bounded, and its eigenvalues are always real
- The eigenspaces corresponding to distinct eigenvalues are orthogonal
- If T is a self-adjoint operator on a finite-dimensional inner product space V, there exists an orthonormal basis for V consisting of eigenvectors of T
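The defining identity and the real-eigenvalue property can be checked numerically. A minimal sketch with NumPy, where the matrix A is an arbitrary 2×2 Hermitian example and the inner product convention ⟨x, y⟩ = xᴴy is assumed:

```python
import numpy as np

# Arbitrary Hermitian (self-adjoint) example on C^2
A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])

rng = np.random.default_rng(0)
v = rng.standard_normal(2) + 1j * rng.standard_normal(2)
w = rng.standard_normal(2) + 1j * rng.standard_normal(2)

# Self-adjointness: <Av, w> = <v, Aw>; np.vdot conjugates its first argument
lhs = np.vdot(A @ v, w)
rhs = np.vdot(v, A @ w)
assert np.isclose(lhs, rhs)

# Eigenvalues of a self-adjoint operator are real: eigvalsh returns a real array
eigvals = np.linalg.eigvalsh(A)
print(eigvals)  # → [1. 4.]
```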
Algebraic properties
- The set of self-adjoint operators on an inner product space forms a real vector space under the usual addition and scalar multiplication of operators
- The composition of two self-adjoint operators is self-adjoint if and only if the operators commute
- For self-adjoint operators S and T, ST = TS is a necessary and sufficient condition for ST to be self-adjoint
- The sum of two self-adjoint operators is always self-adjoint
- If S and T are self-adjoint, then S − T is also self-adjoint
- Real scalar multiples of self-adjoint operators are self-adjoint
- If T is self-adjoint and invertible, then T⁻¹ is also self-adjoint
Eigenvalues and eigenvectors of self-adjoint operators
Eigenvalue properties
- Eigenvalues of a self-adjoint operator are always real
- If λ is an eigenvalue of a self-adjoint operator T, then λ ∈ ℝ
- Eigenvectors corresponding to distinct eigenvalues of a self-adjoint operator are orthogonal
- If u and v are eigenvectors of a self-adjoint operator corresponding to distinct eigenvalues λ and μ, then ⟨u, v⟩ = 0
- The algebraic and geometric multiplicities of each eigenvalue of a self-adjoint operator are equal
- For any eigenvalue λ of a self-adjoint operator T, the dimension of the eigenspace corresponding to λ equals the multiplicity of λ as a root of the characteristic polynomial of T
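The orthogonality of eigenvectors for distinct eigenvalues can be verified with NumPy's `eigh`, which is specialized for Hermitian/self-adjoint matrices (the example matrix is arbitrary):

```python
import numpy as np

# Arbitrary Hermitian example; its eigenvalues (1 and 4) are distinct
A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])

# eigh returns real eigenvalues (ascending) and orthonormal eigenvector columns
eigvals, eigvecs = np.linalg.eigh(A)

# Eigenvectors for distinct eigenvalues are orthogonal: <u, v> = 0
u, v = eigvecs[:, 0], eigvecs[:, 1]
assert abs(np.vdot(u, v)) < 1e-12
```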
Spectral properties
- A self-adjoint operator on a finite-dimensional inner product space has a complete set of orthonormal eigenvectors that form a basis for the space
- This set of eigenvectors is called an orthonormal eigenbasis
- Any vector in the inner product space can be expressed as a linear combination of the orthonormal eigenvectors
- For a vector v in an inner product space with orthonormal eigenbasis {e₁, …, eₙ}, v = ⟨v, e₁⟩e₁ + ⋯ + ⟨v, eₙ⟩eₙ
- The eigenvalues of a self-adjoint operator can be used to calculate the operator's trace and determinant
- For a self-adjoint operator T with eigenvalues λ₁, …, λₙ, tr(T) = λ₁ + λ₂ + ⋯ + λₙ and det(T) = λ₁λ₂⋯λₙ
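A quick numerical check of the trace and determinant identities, using an arbitrary Hermitian example matrix:

```python
import numpy as np

A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])  # arbitrary Hermitian example

eigvals = np.linalg.eigvalsh(A)  # [1., 4.] for this matrix

# tr(A) = sum of eigenvalues, det(A) = product of eigenvalues
assert np.isclose(eigvals.sum(), np.trace(A).real)
assert np.isclose(eigvals.prod(), np.linalg.det(A).real)
```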

Self-adjoint operators vs Hermitian matrices
Hermitian matrices
- A matrix A is Hermitian if A = Aᴴ, where Aᴴ denotes the conjugate transpose of A
- Hermitian matrices are the matrix representation of self-adjoint operators on finite-dimensional inner product spaces, with respect to an orthonormal basis
- The eigenvalues of a Hermitian matrix are always real, and the eigenvectors corresponding to distinct eigenvalues are orthogonal
- Every Hermitian matrix is unitarily diagonalizable
- There exists a unitary matrix U such that UᴴAU is a diagonal matrix with the eigenvalues of A on the diagonal
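A sketch of unitary diagonalization with NumPy, where the Hermitian matrix A is an arbitrary example:

```python
import numpy as np

A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])
assert np.allclose(A, A.conj().T)  # A is Hermitian

eigvals, U = np.linalg.eigh(A)  # columns of U are orthonormal eigenvectors

# U is unitary: U^H U = I
assert np.allclose(U.conj().T @ U, np.eye(2))

# U^H A U is diagonal, with the eigenvalues of A on the diagonal
D = U.conj().T @ A @ U
assert np.allclose(D, np.diag(eigvals))
```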
Algebraic properties
- The set of Hermitian matrices forms a real vector space under the usual matrix addition and scalar multiplication
- The product of two Hermitian matrices is Hermitian if and only if the matrices commute
- For Hermitian matrices A and B, AB = BA is a necessary and sufficient condition for AB to be Hermitian
- The sum of two Hermitian matrices is always Hermitian
- If A and B are Hermitian, then A − B is also Hermitian
- Real scalar multiples of Hermitian matrices are Hermitian
- If A is Hermitian and invertible, then A⁻¹ is also Hermitian
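The algebraic properties above can be illustrated numerically; a sketch assuming two arbitrary example matrices A and B:

```python
import numpy as np

A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])
B = np.array([[1.0, 2j],
              [-2j, 0.0]])

def is_hermitian(M):
    return np.allclose(M, M.conj().T)

assert is_hermitian(A) and is_hermitian(B)

# The sum is always Hermitian; the product only if A and B commute
assert is_hermitian(A + B)
assert is_hermitian(A @ B) == np.allclose(A @ B, B @ A)

# Real scalar multiples and the inverse (when it exists) are Hermitian
assert is_hermitian(3.0 * A)
assert is_hermitian(np.linalg.inv(A))
```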
Spectral theorem for self-adjoint operators
Diagonalization of Hermitian matrices
- The spectral theorem states that if T is a self-adjoint operator on a finite-dimensional inner product space V, then there exists an orthonormal basis for V consisting of eigenvectors of T, and T can be represented as a diagonal matrix with respect to this basis
- To diagonalize a Hermitian matrix A, find an orthonormal basis of eigenvectors and form a unitary matrix U with these eigenvectors as columns
- Then A = UDUᴴ, where D is a diagonal matrix with the eigenvalues of A on the diagonal
- This factorization A = UDUᴴ, with U unitary and the columns of U eigenvectors of A, is called the spectral decomposition of A
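The spectral decomposition can equivalently be written as a sum of rank-one projections, A = λ₁u₁u₁ᴴ + ⋯ + λₙuₙuₙᴴ; a minimal NumPy sketch with an arbitrary Hermitian example:

```python
import numpy as np

A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])  # arbitrary Hermitian example

eigvals, U = np.linalg.eigh(A)

# Spectral decomposition: A = U D U^H
D = np.diag(eigvals)
assert np.allclose(A, U @ D @ U.conj().T)

# Equivalent rank-one form: A = sum_i lambda_i * u_i u_i^H
A_rebuilt = sum(lam * np.outer(u, u.conj()) for lam, u in zip(eigvals, U.T))
assert np.allclose(A, A_rebuilt)
```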
Applications of the spectral theorem
- The spectral theorem allows for the computation of matrix functions of Hermitian matrices
- If f is a function defined on the eigenvalues of a Hermitian matrix A = UDUᴴ, then f(A) = U f(D) Uᴴ, where f(D) is the diagonal matrix obtained by applying f to each diagonal entry of D
- The spectral theorem is used in quantum mechanics to represent observables as self-adjoint operators and to calculate their expectation values and probabilities
- The eigenvalues of the observable correspond to the possible measurement outcomes, and the eigenvectors represent the states in which the system is found after the measurement
- The spectral theorem is also applied in signal processing and data analysis to perform principal component analysis (PCA) and singular value decomposition (SVD)
- These techniques help in dimensionality reduction, feature extraction, and noise reduction by identifying the most significant eigenvectors and eigenvalues of the data covariance matrix
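The matrix-function identity f(A) = U f(D) Uᴴ can be sketched with NumPy, here computing a matrix square root and a matrix exponential of an arbitrary positive-definite Hermitian example:

```python
import numpy as np

# Arbitrary Hermitian, positive-definite example (eigenvalues 1 and 4)
A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])

eigvals, U = np.linalg.eigh(A)

def matrix_function(f):
    # f(A) = U f(D) U^H: apply f to each eigenvalue on the diagonal
    return U @ np.diag(f(eigvals)) @ U.conj().T

# Matrix square root: sqrt(A) @ sqrt(A) recovers A
sqrt_A = matrix_function(np.sqrt)
assert np.allclose(sqrt_A @ sqrt_A, A)

# Matrix exponential: the eigenvalues of exp(A) are exp of the eigenvalues of A
exp_A = matrix_function(np.exp)
assert np.allclose(np.linalg.eigvalsh(exp_A), np.exp(eigvals))
```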
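The PCA application can be sketched by eigendecomposing a sample covariance matrix; this is a minimal illustration on synthetic data, not a full PCA implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 200 samples in R^3, most variance along the first axis
X = rng.standard_normal((200, 3)) * np.array([5.0, 1.0, 0.2])
X -= X.mean(axis=0)  # center the data

# The sample covariance matrix is symmetric (real Hermitian), so eigh applies
cov = (X.T @ X) / (len(X) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)  # ascending eigenvalue order

# Principal components: eigenvectors with the largest eigenvalues
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order[:2]]  # keep the top 2 directions

# Project the centered data onto the principal components
X_reduced = X @ components
assert X_reduced.shape == (200, 2)
```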