Matrices and linear transformations are key concepts in linear algebra, crucial for understanding coding theory. They provide a powerful framework for representing and manipulating data, allowing us to solve complex problems efficiently.
Eigenvalues and eigenvectors are essential tools for analyzing matrices and their properties. They help us understand how matrices transform space and are widely used in various applications, including data compression and error correction in coding theory.
Matrix Fundamentals
Matrix Basics
- A matrix is a rectangular array of numbers arranged in rows and columns
- Denoted by a capital letter such as $A$ or $B$
- Elements of a matrix are denoted by lowercase letters with subscripts indicating their position ($a_{ij}$ represents the element in the $i$-th row and $j$-th column)
- The size of a matrix is described by the number of rows and columns it contains (an $m \times n$ matrix has $m$ rows and $n$ columns)
- The main diagonal of a square matrix consists of elements where the row and column indices are equal ($a_{ii}$)
- A matrix is symmetric if it is equal to its transpose ($A = A^T$)
- The transpose of a matrix is obtained by interchanging its rows and columns
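These basics can be checked numerically. A minimal sketch using NumPy (assuming it is installed), with an illustrative $2 \times 3$ matrix and a symmetric $2 \times 2$ matrix:

```python
import numpy as np

# A 2x3 matrix: 2 rows, 3 columns
A = np.array([[1, 2, 3],
              [4, 5, 6]])
print(A.shape)    # (2, 3)

# The transpose interchanges rows and columns, giving a 3x2 matrix
print(A.T.shape)  # (3, 2)

# A symmetric matrix equals its own transpose
S = np.array([[1, 7],
              [7, 4]])
print(np.array_equal(S, S.T))  # True
```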
Matrix Operations
- Matrix multiplication is a binary operation that produces a matrix from two matrices
- For matrices $A$ and $B$ to be multiplied, the number of columns in $A$ must equal the number of rows in $B$
- The resulting matrix $AB$ has the same number of rows as $A$ and the same number of columns as $B$
- Each element $c_{ij}$ of the product matrix is calculated by multiplying the elements of the $i$-th row of $A$ with the corresponding elements of the $j$-th column of $B$ and summing the results
- The inverse of a square matrix $A$ is denoted as $A^{-1}$ and satisfies the property $AA^{-1} = A^{-1}A = I$, where $I$ is the identity matrix
- Not all matrices have an inverse; matrices without an inverse are called singular or degenerate
- The inverse of a 2x2 matrix $A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$ is given by $A^{-1} = \frac{1}{ad - bc} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}$, provided that $ad - bc \neq 0$
- The determinant of a square matrix is a scalar value that provides information about the matrix's properties
- Denoted as $\det(A)$ or $|A|$
- For a 2x2 matrix $\begin{pmatrix} a & b \\ c & d \end{pmatrix}$, the determinant is calculated as $ad - bc$
- A matrix is invertible if and only if its determinant is non-zero
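The operations above can be verified with a short NumPy sketch (the specific matrices are illustrative, not taken from the notes):

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])
B = np.array([[5., 6.],
              [7., 8.]])

# Matrix product: element (i, j) is the dot product of row i of A
# with column j of B
C = A @ B

# Determinant of a 2x2 matrix: ad - bc = 1*4 - 2*3 = -2
d = np.linalg.det(A)

# Since det(A) != 0, A is invertible, and A @ A^{-1} gives the identity
A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(2)))  # True
```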

Linear Transformations
Basics of Linear Transformations
- A linear transformation is a function $T: V \to W$ between two vector spaces $V$ and $W$ that satisfies the following properties:
- Additivity: $T(u + v) = T(u) + T(v)$ for all $u, v \in V$
- Homogeneity: $T(cv) = cT(v)$ for all $v \in V$ and scalars $c$
- Linear transformations preserve the vector space structure and can be represented by matrices
- For a linear transformation $T: \mathbb{R}^n \to \mathbb{R}^m$, there exists an $m \times n$ matrix $A$ such that $T(v) = Av$ for all $v \in \mathbb{R}^n$
- The kernel (or null space) of a linear transformation $T$ is the set of all vectors $v$ in the domain such that $T(v) = 0$
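As a concrete illustration of a matrix representing a linear transformation, the sketch below uses a 90-degree plane rotation (an example chosen here, not one from the notes) and checks additivity and homogeneity numerically with NumPy:

```python
import numpy as np

# A 90-degree counterclockwise rotation of the plane is linear;
# its standard matrix sends e1 -> (0, 1) and e2 -> (-1, 0)
R = np.array([[0., -1.],
              [1.,  0.]])

u = np.array([1., 2.])
v = np.array([3., -1.])

# Additivity: T(u + v) == T(u) + T(v)
print(np.allclose(R @ (u + v), R @ u + R @ v))  # True

# Homogeneity: T(c v) == c T(v)
print(np.allclose(R @ (2.5 * v), 2.5 * (R @ v)))  # True
```

Because the rotation matrix is invertible, its kernel contains only the zero vector.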

Rank and Nullity
- The rank of a matrix is the dimension of the vector space spanned by its columns (column space)
- Denoted as $\text{rank}(A)$
- The rank is equal to the maximum number of linearly independent columns or rows in the matrix
- The nullity of a matrix is the dimension of its null space (kernel)
- Denoted as $\text{nullity}(A)$
- The nullity is equal to the number of free variables in the solution of the homogeneous system $Ax = 0$
- The rank-nullity theorem states that for a linear transformation $T: V \to W$, the dimension of the domain $V$ is equal to the sum of the rank and nullity of $T$: $\dim(V) = \text{rank}(T) + \text{nullity}(T)$
Eigenvalues and Eigenvectors
Eigenvalue Basics
- An eigenvalue of a square matrix $A$ is a scalar $\lambda$ such that there exists a non-zero vector $v$ satisfying $Av = \lambda v$
- The vector $v$ is called an eigenvector corresponding to the eigenvalue $\lambda$
- Eigenvalues are the roots of the characteristic polynomial $\det(A - \lambda I)$
- To find the eigenvalues of a matrix $A$, solve the characteristic equation $\det(A - \lambda I) = 0$
- For a 2x2 matrix $\begin{pmatrix} a & b \\ c & d \end{pmatrix}$, the characteristic equation is $\lambda^2 - (a + d)\lambda + (ad - bc) = 0$
- The set of all eigenvalues of a matrix is called its spectrum
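The characteristic-equation recipe can be tried on a small example (the matrix below is illustrative). A NumPy sketch:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])

# Characteristic equation for a 2x2 matrix:
#   lambda^2 - (a + d) lambda + (ad - bc) = 0
#   = lambda^2 - 4 lambda + 3 = (lambda - 1)(lambda - 3)
# so the spectrum is {1, 3}
eigenvalues = np.linalg.eigvals(A)
print(sorted(eigenvalues.round(6).tolist()))  # [1.0, 3.0]
```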
Eigenvector Basics
- An eigenvector of a square matrix $A$ corresponding to an eigenvalue $\lambda$ is a non-zero vector $v$ that satisfies $Av = \lambda v$
- Eigenvectors are only defined up to a scalar multiple; if $v$ is an eigenvector, then so is $cv$ for any non-zero scalar $c$
- To find the eigenvectors corresponding to an eigenvalue $\lambda$, solve the equation $(A - \lambda I)v = 0$
- The solution set of this equation is the eigenspace corresponding to $\lambda$
- The dimension of the eigenspace is called the geometric multiplicity of the eigenvalue
- Eigenvectors corresponding to distinct eigenvalues are linearly independent
- An $n \times n$ matrix is diagonalizable if it has a full set of $n$ linearly independent eigenvectors
- A diagonalizable matrix $A$ can be written as $A = PDP^{-1}$, where $D$ is a diagonal matrix with the eigenvalues of $A$ on its diagonal, and $P$ is a matrix whose columns are the corresponding eigenvectors
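The diagonalization above can be demonstrated numerically; a NumPy sketch with an illustrative matrix whose eigenvalues happen to be real and distinct:

```python
import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])

# Columns of P are eigenvectors; D holds the eigenvalues on its diagonal
eigenvalues, P = np.linalg.eig(A)
D = np.diag(eigenvalues)

# Each eigenpair satisfies A v = lambda v
for lam, v in zip(eigenvalues, P.T):
    print(np.allclose(A @ v, lam * v))  # True

# Since the eigenvectors are linearly independent, A = P D P^{-1}
print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True
```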