11.5 Matrices and Matrix Operations

3 min read • June 24, 2024

Matrices are powerful tools for organizing and manipulating data in linear algebra. They allow us to represent and solve complex systems of equations efficiently. Understanding matrix operations is key to unlocking their potential in various fields.

Addition, subtraction, and multiplication follow specific rules based on matrix dimensions. These operations enable us to combine and transform data in meaningful ways, making matrices essential in fields like computer graphics, economics, and engineering.

Matrix Operations

Matrix addition and subtraction

  • Matrices must have the same dimensions to be added or subtracted, meaning they need an equal number of rows and columns
  • Add or subtract corresponding elements in the matrices by aligning the rows and columns and performing the operation on each pair of elements
  • Resulting matrix has the same dimensions as the input matrices since each element is added or subtracted individually
  • Example: $\begin{bmatrix}1 & 2\\3 & 4\end{bmatrix} + \begin{bmatrix}5 & 6\\7 & 8\end{bmatrix} = \begin{bmatrix}6 & 8\\10 & 12\end{bmatrix}$
  • Scalar multiplication of matrices involves multiplying each element in the matrix by a scalar (constant) value
    • Resulting matrix has the same dimensions as the input matrix because each element is multiplied by the same scalar
    • Example: $3 \times \begin{bmatrix}1 & 2\\3 & 4\end{bmatrix} = \begin{bmatrix}3 & 6\\9 & 12\end{bmatrix}$
  • Properties of matrix addition and scalar multiplication include:
    • Commutative property: $A + B = B + A$ (order of addition does not matter)
    • Associative property: $(A + B) + C = A + (B + C)$ (grouping of additions does not matter)
    • Distributive property of scalar multiplication over addition: $k(A + B) = kA + kB$ (scalar multiplies each matrix separately)
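The element-wise rules above can be sketched in plain Python, representing a matrix as a list of rows (the helper names `mat_add` and `scalar_mul` are illustrative, not from the text):

```python
def mat_add(A, B):
    """Add two matrices of the same dimensions element by element."""
    assert len(A) == len(B) and len(A[0]) == len(B[0]), "dimensions must match"
    return [[a + b for a, b in zip(row_a, row_b)] for row_a, row_b in zip(A, B)]

def scalar_mul(k, A):
    """Multiply every element of A by the scalar k."""
    return [[k * a for a in row] for row in A]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(mat_add(A, B))     # [[6, 8], [10, 12]]
print(scalar_mul(3, A))  # [[3, 6], [9, 12]]
```

Subtraction follows the same pattern with `a - b` in place of `a + b`, which is why the result always keeps the input dimensions.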

Conditions for matrix multiplication

  • The number of columns in the first matrix must equal the number of rows in the second matrix for multiplication to be possible
  • Multiply each element in a row of the first matrix by the corresponding element in a column of the second matrix and sum the products to find each element in the resulting matrix
  • Resulting matrix has the same number of rows as the first matrix and the same number of columns as the second matrix
  • Example: $\begin{bmatrix}1 & 2\\3 & 4\end{bmatrix} \times \begin{bmatrix}5 & 6\\7 & 8\end{bmatrix} = \begin{bmatrix}19 & 22\\43 & 50\end{bmatrix}$
  • Properties of matrix multiplication:
    • Not commutative: $AB \neq BA$ in general (order of multiplication matters)
    • Associative: $(AB)C = A(BC)$ (grouping of multiplications does not matter)
    • Distributive over addition: $A(B + C) = AB + AC$ (multiplication distributes over addition)
  • The identity matrix is a square matrix with 1s on the main diagonal and 0s elsewhere
    • Multiplying a matrix by the identity matrix results in the original matrix
    • Example: $\begin{bmatrix}1 & 0\\0 & 1\end{bmatrix}$ is the 2×2 identity matrix
  • The trace of a square matrix is the sum of the elements on its main diagonal
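The row-by-column rule can be written out directly in plain Python (the name `mat_mul` is illustrative; the second print simply demonstrates that reversing the order changes the result):

```python
def mat_mul(A, B):
    """Multiply A (m x n) by B (n x p): each entry is a row-by-column dot product."""
    n = len(B)
    assert len(A[0]) == n, "columns of A must equal rows of B"
    p = len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(mat_mul(A, B))  # [[19, 22], [43, 50]]
# Not commutative: multiplying in the other order gives a different matrix
print(mat_mul(B, A))  # [[23, 34], [31, 46]]
```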

Matrices in systems of equations

  • Representing systems of linear equations using matrices:
    • Coefficient matrix: matrix containing the coefficients of the variables
    • Variable matrix: matrix containing the variables
    • Constant matrix: matrix containing the constants
  • Solving systems of linear equations using matrix operations:
    1. Express the system of equations in matrix form: $AX = B$, where $A$ is the coefficient matrix, $X$ is the variable matrix, and $B$ is the constant matrix
    2. Multiply both sides by the inverse of the coefficient matrix: $A^{-1}AX = A^{-1}B$
    3. Simplify: $X = A^{-1}B$
  • Finding the inverse of a matrix:
    • A square matrix $A$ has an inverse $A^{-1}$ if $AA^{-1} = A^{-1}A = I$, where $I$ is the identity matrix
    • Methods for finding the inverse include the adjugate method (transpose of the cofactor matrix divided by the determinant) and Gaussian elimination (row reduction to the identity matrix)
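As a rough sketch of the three solution steps, here is the 2×2 case in plain Python, using the adjugate-over-determinant formula for the inverse (the system x + 2y = 5, 3x + 4y = 11 and the name `inverse_2x2` are made-up for illustration):

```python
def inverse_2x2(A):
    """Inverse of a nonsingular 2x2 matrix via adjugate divided by determinant."""
    (a, b), (c, d) = A
    det = a * d - b * c
    assert det != 0, "singular matrix has no inverse"
    return [[d / det, -b / det], [-c / det, a / det]]

# Solve  x + 2y = 5,  3x + 4y = 11  written as AX = B, so X = A^{-1} B
A = [[1, 2], [3, 4]]
B = [[5], [11]]

Ainv = inverse_2x2(A)
# Multiply A^{-1} (2x2) by B (2x1) to get the variable matrix X
X = [[sum(Ainv[i][k] * B[k][0] for k in range(2))] for i in range(2)]
print(X)  # [[1.0], [2.0]]  ->  x = 1, y = 2
```

Substituting back confirms the answer: 1 + 2·2 = 5 and 3·1 + 4·2 = 11.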

Matrix Properties and Transformations

  • The dimension of a matrix is the number of rows and columns it contains, often written as m × n
  • The rank of a matrix is the number of linearly independent rows or columns, which determines the dimension of its column or row space
  • The transpose of a matrix is obtained by interchanging its rows and columns, denoted as $A^T$
  • Eigenvalues and eigenvectors are special scalars and vectors associated with square matrices:
    • An eigenvalue λ and its corresponding eigenvector v satisfy the equation Av = λv
    • Eigenvalues and eigenvectors are crucial in understanding matrix transformations and solving systems of differential equations
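These properties are easy to check numerically. The sketch below (plain Python; the matrices and the diagonal-matrix eigenvalue check are made-up examples) computes a transpose and a trace, then verifies Av = λv for one eigenpair:

```python
def transpose(A):
    """Swap rows and columns: the (i, j) entry becomes the (j, i) entry."""
    return [list(col) for col in zip(*A)]

def trace(A):
    """Sum of the main-diagonal elements of a square matrix."""
    return sum(A[i][i] for i in range(len(A)))

M = [[1, 2, 3], [4, 5, 6]]     # a 2 x 3 matrix
print(transpose(M))            # [[1, 4], [2, 5], [3, 6]] -- now 3 x 2

S = [[2, 7], [0, 5]]
print(trace(S))                # 7 (that is, 2 + 5)

# For the diagonal matrix A, v = [0, 1] is an eigenvector with eigenvalue 3:
A = [[2, 0], [0, 3]]
v = [0, 1]
Av = [sum(A[i][j] * v[j] for j in range(2)) for i in range(2)]
print(Av == [3 * x for x in v])  # True, since Av = 3v
```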

Key Terms to Review

Associative Property: The associative property is a fundamental mathematical concept that describes the behavior of certain operations, such as addition and multiplication, where the grouping of the operands does not affect the final result. It allows for the rearrangement of the order of operations without changing the outcome.
Associative property of addition: The associative property of addition states that the way in which numbers are grouped when adding does not change their sum. Mathematically, this is expressed as $(a + b) + c = a + (b + c)$.
Augmented matrix: An augmented matrix is a matrix that represents a system of linear equations, including both the coefficients and the constants from the equations. It combines the coefficient matrix and the constant vector into one larger matrix for easier manipulation and solution.
Coefficient matrix: A coefficient matrix is a rectangular array that contains only the coefficients of the variables in a system of linear equations. It is used to facilitate methods such as Gaussian Elimination and finding matrix inverses.
Cofactor Matrix: The cofactor matrix, also known as the adjoint matrix, is a matrix derived from a given square matrix by replacing each element with its cofactor. The cofactor matrix is closely related to the inverse of a matrix and is a fundamental concept in linear algebra.
Commutative Property: The commutative property is a fundamental mathematical principle that states the order of the operands in an addition or multiplication operation does not affect the result. It allows for the rearrangement of terms without changing the overall value of the expression.
Commutative property of addition: The commutative property of addition states that changing the order of addends does not change the sum. Mathematically, if $a$ and $b$ are real numbers, then $a + b = b + a$.
Constant Matrix: A constant matrix is a matrix in which all the elements are constant, meaning they do not change or vary. It is a special type of matrix that is commonly used in various mathematical and scientific applications, particularly in the context of matrix operations.
Cramer's Rule: Cramer's rule is a method used to solve systems of linear equations by expressing the solution as a ratio of determinants. It provides a systematic way to find the unique solution to a system of linear equations, if it exists, by using the coefficients and constants of the equations.
Determinant: A determinant is a scalar value that can be computed from the elements of a square matrix. It provides important properties about the matrix such as whether it is invertible.
Dimension: Dimension refers to the number of elements or coordinates required to uniquely specify a point or object within a given space or mathematical structure. It is a fundamental concept in various fields, including linear algebra, geometry, and matrix theory.
Distributive property: The distributive property states that multiplying a sum by a number gives the same result as multiplying each addend by the number and then adding the products. It is expressed as $a(b + c) = ab + ac$.
Eigenvalue: An eigenvalue is a scalar value that, when multiplied by a vector, results in a scalar multiple of that same vector. Eigenvalues are a fundamental concept in linear algebra and matrix theory, and they have important applications in various fields, including physics, engineering, and computer science.
Eigenvector: An eigenvector of a matrix is a non-zero vector that, when multiplied by the matrix, results in a scalar multiple of itself. Eigenvectors are an important concept in linear algebra and have applications in various fields, including physics, computer science, and data analysis.
Gaussian elimination: Gaussian elimination is a method for solving systems of linear equations. It transforms the system's augmented matrix into row-echelon form using row operations.
Identity matrix: An identity matrix is a square matrix with ones on the diagonal and zeros elsewhere. It acts as the multiplicative identity in matrix multiplication, meaning any matrix multiplied by an identity matrix remains unchanged.
Inverse matrix: An inverse matrix is a matrix that, when multiplied by its original matrix, yields the identity matrix. It is denoted as $A^{-1}$ for a given matrix $A$.
Linear Transformation: A linear transformation is a function that maps vectors in one vector space to vectors in another vector space, while preserving the linear structure of the original space. This means that the transformation must satisfy the properties of linearity, such as preserving vector addition and scalar multiplication.
Matrix: A matrix is a rectangular array of numbers, symbols, or expressions, arranged in rows and columns, that can be used to represent and manipulate mathematical relationships and data. Matrices are fundamental tools in various areas of mathematics, including linear algebra, applied mathematics, and computer science.
Matrix Addition: Matrix addition is a fundamental operation in linear algebra that allows for the combination of two or more matrices of the same size by adding the corresponding elements. This operation is crucial in the context of 11.5 Matrices and Matrix Operations, as it forms the basis for many other matrix manipulations and computations.
Matrix multiplication: Matrix multiplication is an operation that takes two matrices and produces another matrix. It involves multiplying rows of the first matrix by columns of the second matrix and summing the products.
Matrix Subtraction: Matrix subtraction is an arithmetic operation in linear algebra where two matrices of the same size are subtracted element-wise. This process involves subtracting the corresponding elements of the two matrices to create a new matrix with the same dimensions.
Nonsingular: A nonsingular matrix is a square matrix that has an inverse. In other words, it is a matrix that can be inverted, meaning there exists another matrix that, when multiplied with the original matrix, results in the identity matrix.
Rank: The rank of a matrix is the dimension of the vector space generated by its columns or rows. It represents the number of linearly independent columns or rows in the matrix, and is a measure of the matrix's complexity and information content.
Scalar multiplication: Scalar multiplication involves multiplying each entry of a matrix by a constant value, known as the scalar. This operation results in a new matrix where each element is the product of the original element and the scalar.
Singular: In the context of matrices and systems of linear equations, a singular matrix is a square matrix that does not have an inverse. This means that the determinant of the matrix is zero, and the matrix cannot be used to solve a system of linear equations uniquely.
Symmetric: Symmetric refers to a property where an object or function exhibits a balance or regularity in its form or arrangement. In the context of mathematics, symmetry is a fundamental concept that describes the invariance of an object or function under certain transformations, such as reflection, rotation, or translation.
Trace: In the context of matrices and matrix operations, the trace of a square matrix is the sum of the elements along the main diagonal of the matrix. It represents a fundamental property that provides insights into the characteristics and behavior of a matrix.
Transpose: The transpose of a matrix is a new matrix obtained by interchanging the rows and columns of the original matrix. It is a fundamental operation in linear algebra that allows for the manipulation and analysis of matrices in various mathematical and scientific contexts.
Variable Matrix: A variable matrix is a mathematical construct that represents a collection of variables arranged in rows and columns, where the values within the matrix can change or vary. It is a fundamental concept in the study of matrices and their operations.
© 2024 Fiveable Inc. All rights reserved.
AP® and SAT® are trademarks registered by the College Board, which is not affiliated with, and does not endorse this website.