Linear transformations are functions between vector spaces that preserve vector addition and scalar multiplication. They're crucial in linear algebra, allowing us to understand how vectors change under different operations and providing a link between abstract vector spaces and concrete matrices.

Matrices can represent linear transformations, making complex operations easier to visualize and compute. This connection between transformations and matrices is key to solving many problems in linear algebra, from finding eigenvalues to understanding subspaces like kernels and ranges.

Linear transformations and their properties

Definition and key properties

  • A linear transformation is a function $T$ from a vector space $V$ to a vector space $W$ that preserves vector addition and scalar multiplication
  • For any vectors $u$ and $v$ in $V$ and any scalar $c$, a linear transformation $T$ satisfies:
    • $T(u + v) = T(u) + T(v)$ (preserves vector addition)
    • $T(cu) = cT(u)$ (preserves scalar multiplication)
  • The zero vector in $V$ is always mapped to the zero vector in $W$ under a linear transformation
    • This follows from the properties of linear transformations: $T(0_V) = T(0 \cdot u) = 0 \cdot T(u) = 0_W$
  • Linear transformations preserve linear combinations of vectors
    • If $v_1, v_2, \ldots, v_n$ are vectors in $V$ and $c_1, c_2, \ldots, c_n$ are scalars, then $T(c_1v_1 + c_2v_2 + \ldots + c_nv_n) = c_1T(v_1) + c_2T(v_2) + \ldots + c_nT(v_n)$; the sketch below checks these properties numerically
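
A quick sanity check of the two axioms (and the zero-vector property), as a minimal NumPy sketch; the matrix $A$ and the test vectors are made-up examples, not taken from the text:

```python
import numpy as np

# T(x) = A x for an arbitrary example matrix A, representing T: R^3 -> R^2.
rng = np.random.default_rng(0)
A = np.array([[2.0, -1.0, 0.0],
              [1.0,  3.0, 4.0]])

def T(x):
    return A @ x

u, v = rng.standard_normal(3), rng.standard_normal(3)
c = 2.5

assert np.allclose(T(u + v), T(u) + T(v))        # preserves vector addition
assert np.allclose(T(c * u), c * T(u))           # preserves scalar multiplication
assert np.allclose(T(np.zeros(3)), np.zeros(2))  # zero vector maps to zero vector
```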

Composition of linear transformations

  • The composition of two linear transformations is also a linear transformation
    • If $T: U \to V$ and $S: V \to W$ are linear transformations, then their composition $S \circ T: U \to W$ defined by $(S \circ T)(u) = S(T(u))$ is also a linear transformation (see the sketch below)
  • Composition of linear transformations is associative: $(T \circ S) \circ R = T \circ (S \circ R)$
  • The identity transformation $I_V: V \to V$ defined by $I_V(v) = v$ for all $v$ in $V$ serves as the identity element for composition
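
A short sketch of these composition facts, using two hypothetical maps $T: \mathbb{R}^3 \to \mathbb{R}^2$ and $S: \mathbb{R}^2 \to \mathbb{R}^2$ (the matrices are illustrative choices):

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, -1.0]])   # T: R^3 -> R^2, T(x) = A x
B = np.array([[0.0, -1.0],
              [1.0,  0.0]])        # S: R^2 -> R^2, S(y) = B y (a rotation)

def T(x): return A @ x
def S(y): return B @ y
def ST(x): return S(T(x))          # (S o T)(x) = S(T(x))

u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 0.0, 4.0])
assert np.allclose(ST(u + v), ST(u) + ST(v))   # S o T preserves addition
assert np.allclose(ST(3.0 * u), 3.0 * ST(u))   # ... and scalar multiples

# The identity transformation leaves every vector unchanged under composition.
I = np.eye(2)
assert np.allclose(I @ T(u), T(u))
```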

Matrix representation of linear transformations

Representing linear transformations using matrices

  • A linear transformation $T$ from an $n$-dimensional vector space $V$ to an $m$-dimensional vector space $W$ can be represented by an $m \times n$ matrix $A$
  • The columns of the matrix $A$ are the images of the basis vectors of $V$ under the linear transformation $T$
    • If $\{v_1, v_2, \ldots, v_n\}$ is a basis for $V$, then the $j$-th column of $A$ is the coordinate vector of $T(v_j)$ with respect to the basis of $W$
  • To find the image of a vector $v$ under $T$, multiply the matrix $A$ by the coordinate vector of $v$ with respect to the basis of $V$ (see the sketch below)
    • If $v = c_1v_1 + c_2v_2 + \ldots + c_nv_n$, then $[T(v)]_C = A[v]_B$ and $T(v) = c_1T(v_1) + c_2T(v_2) + \ldots + c_nT(v_n)$, where $[v]_B$ is the coordinate vector of $v$ with respect to the basis $B$ of $V$ and $C$ is the basis of $W$
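
A minimal sketch of this construction using the standard bases of $\mathbb{R}^3$ and $\mathbb{R}^2$; the map $T$ is a hypothetical example defined directly as a Python function:

```python
import numpy as np

def T(x):
    x1, x2, x3 = x
    return np.array([x1 + 2 * x3, x2 - x3])   # T: R^3 -> R^2 (made-up example)

n = 3
# Build A column by column: the j-th column is the image T(e_j) of the
# j-th standard basis vector.
A = np.column_stack([T(e) for e in np.eye(n)])

v = np.array([1.0, -2.0, 0.5])
assert np.allclose(A @ v, T(v))   # T(v) = A [v]_B for the standard basis
```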

Dependence on choice of bases

  • The matrix representation of a linear transformation depends on the choice of bases for the domain and codomain
    • Different bases for $V$ and $W$ will result in different matrix representations for the same linear transformation $T$
  • Changing bases for the domain and codomain transforms the matrix representation by change of basis matrices (a similarity transformation in the special case $V = W$ with $B = C$)
    • If $A$ is the matrix of $T$ with respect to bases $B$ and $C$, and $P$ and $Q$ are the change of basis matrices whose columns express the vectors of $B'$ in terms of $B$ and the vectors of $C'$ in terms of $C$, respectively, then the matrix of $T$ with respect to bases $B'$ and $C'$ is $Q^{-1}AP$ (see the sketch below)
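
A small numerical check of the change of basis formula; all matrices below are made-up examples:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0]])        # matrix of T in the original bases B and C
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])        # columns: B' vectors written in B coordinates
Q = np.array([[2.0, 0.0],
              [1.0, 1.0]])        # columns: C' vectors written in C coordinates

A_new = np.linalg.inv(Q) @ A @ P  # matrix of T with respect to B' and C'

# Consistency check: coordinates transform compatibly for any vector.
v_Bp = np.array([3.0, -1.0])      # coordinates of some v in basis B'
v_B = P @ v_Bp                    # the same vector in basis B
assert np.allclose(Q @ (A_new @ v_Bp), A @ v_B)
```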

Matrices of linear transformations with respect to bases

Computing the matrix of a linear transformation

  • To find the matrix of a linear transformation $T$ with respect to given bases $B$ and $C$, apply $T$ to each basis vector in $B$ and express the result as a linear combination of the basis vectors in $C$
  • The coefficients of these linear combinations form the columns of the matrix representation of $T$ with respect to the bases $B$ and $C$, as illustrated in the sketch below
    • If $B = \{v_1, v_2, \ldots, v_n\}$ is a basis for $V$ and $C = \{w_1, w_2, \ldots, w_m\}$ is a basis for $W$, and $T(v_j) = a_{1j}w_1 + a_{2j}w_2 + \ldots + a_{mj}w_m$, then the matrix of $T$ with respect to $B$ and $C$ is $A = (a_{ij})$
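
Here is a sketch of this column-by-column computation for a hypothetical $T: \mathbb{R}^2 \to \mathbb{R}^2$ and made-up bases $B$ and $C$; each column solves a small linear system to express $T(v_j)$ in $C$ coordinates:

```python
import numpy as np

def T(x):
    x1, x2 = x
    return np.array([2 * x1 + x2, x1 - x2])   # hypothetical linear map

B_mat = np.array([[1.0, 1.0],
                  [0.0, 1.0]])   # columns are the domain basis vectors v_1, v_2
C_mat = np.array([[1.0, 0.0],
                  [1.0, 1.0]])   # columns are the codomain basis vectors w_1, w_2

# Column j of A solves C_mat @ a_j = T(v_j), i.e. the C-coordinates of T(v_j).
A = np.column_stack([np.linalg.solve(C_mat, T(vj)) for vj in B_mat.T])
print(A)   # [[ 2.  3.]
           #  [-1. -3.]]
```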

Composition of linear transformations and matrix multiplication

  • The matrix of a composition of linear transformations is the product of their individual matrices, with the order of multiplication determined by the order of composition
    • If $T: U \to V$ and $S: V \to W$ are linear transformations with matrices $A$ and $B$ with respect to bases of $U$, $V$, and $W$, respectively, then the matrix of $S \circ T$ with respect to the same bases is $BA$ (see the sketch after this list)
  • Matrix multiplication is associative and distributive over matrix addition, mirroring the properties of composition of linear transformations
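
A quick check that the matrix of $S \circ T$ is the product $BA$, reusing illustrative matrices for $T$ and $S$:

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, -1.0]])   # matrix of T: R^3 -> R^2 (made-up)
B = np.array([[0.0, -1.0],
              [1.0,  0.0]])        # matrix of S: R^2 -> R^2 (made-up)

u = np.array([1.0, 2.0, 3.0])
# (BA)u equals B(Au): applying T first and then S matches the product matrix.
assert np.allclose((B @ A) @ u, B @ (A @ u))
```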

Kernel and range of linear transformations

Kernel (null space) of a linear transformation

  • The kernel (or null space) of a linear transformation $T$ is the set of all vectors $v$ in the domain such that $T(v) = 0$
    • $\ker(T) = \{v \in V \mid T(v) = 0\}$
  • The kernel is a subspace of the domain, and its dimension is called the nullity of the transformation
    • The kernel is closed under vector addition and scalar multiplication, making it a subspace
    • The nullity of $T$ is denoted by $\text{nullity}(T) = \dim(\ker(T))$
  • Finding the kernel of a linear transformation represented by a matrix $A$ is equivalent to solving the homogeneous system of linear equations $Ax = 0$ (see the sketch below)
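
For a concrete computation, SymPy's `nullspace()` returns an exact basis for the solution set of $Ax = 0$; the matrix $A$ below is an arbitrary example:

```python
import sympy as sp

A = sp.Matrix([[1, 2, 1],
               [2, 4, 2]])           # rank 1, so the kernel is 2-dimensional

kernel_basis = A.nullspace()         # basis vectors of ker(T)
for b in kernel_basis:
    assert A * b == sp.zeros(2, 1)   # every basis vector maps to the zero vector
print(len(kernel_basis))             # nullity(T) = 2
```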

Range (image) of a linear transformation

  • The range (or image) of a linear transformation $T$ is the set of all vectors $w$ in the codomain such that $w = T(v)$ for some vector $v$ in the domain
    • $\text{range}(T) = \{w \in W \mid w = T(v) \text{ for some } v \in V\}$
  • The range is a subspace of the codomain, and its dimension is called the rank of the transformation
    • The range is closed under vector addition and scalar multiplication, making it a subspace
    • The rank of $T$ is denoted by $\text{rank}(T) = \dim(\text{range}(T))$
  • Finding the range of a linear transformation represented by a matrix $A$ is equivalent to finding the column space of $A$ (see the sketch below)
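
Similarly, SymPy's `columnspace()` and `rank()` compute a basis for the range and its dimension; the same example matrix is reused:

```python
import sympy as sp

A = sp.Matrix([[1, 2, 1],
               [2, 4, 2]])

range_basis = A.columnspace()   # basis vectors of range(T)
print(range_basis)              # one vector: the columns are all parallel
print(A.rank())                 # rank(T) = 1
```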

Rank-nullity theorem

  • The rank-nullity theorem states that for a linear transformation $T$ from an $n$-dimensional vector space to an $m$-dimensional vector space, the sum of the rank and nullity of $T$ is equal to $n$
    • $\text{rank}(T) + \text{nullity}(T) = n$, where $n = \dim(V)$ (verified numerically in the sketch after this list)
  • This theorem relates the dimensions of the kernel and range of a linear transformation to the dimension of its domain
  • The rank-nullity theorem is a powerful tool for understanding the properties of linear transformations and their matrix representations
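
A short SymPy check of the theorem on an example matrix, chosen so the third row is the sum of the first two (forcing rank 2):

```python
import sympy as sp

# A represents a hypothetical T: R^4 -> R^3; third row = first + second.
A = sp.Matrix([[1, 0, 2, -1],
               [0, 1, 1,  1],
               [1, 1, 3,  0]])

n = A.cols                     # dimension of the domain, here 4
rank = A.rank()                # dim(range(T)) = 2
nullity = len(A.nullspace())   # dim(ker(T)) = 2
assert rank + nullity == n     # rank-nullity theorem: 2 + 2 == 4
```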

Key Terms to Review (25)

Basis: A basis is a set of vectors in a vector space that is linearly independent and spans the entire space. This means that any vector in that space can be expressed as a unique linear combination of the basis vectors. The concept of a basis is crucial for understanding how different vector spaces relate to each other, especially in terms of transformations and dimensions.
Basis vectors: Basis vectors are a set of vectors in a vector space that are linearly independent and span the entire space. They serve as the building blocks for representing any vector within that space, allowing for unique representation in terms of their linear combinations. The concept of basis vectors is crucial in understanding linear transformations and matrices, as they help define how these transformations operate on vector spaces.
Column Space: The column space of a matrix is the set of all possible linear combinations of its column vectors. This space is crucial because it reveals the dimensions and properties of linear transformations represented by the matrix, showing how inputs from one vector space can be mapped to outputs in another. Understanding the column space helps in determining the rank of a matrix and its ability to span a given vector space.
Composition of linear transformations: The composition of linear transformations refers to the process of applying one linear transformation after another. When you have two linear transformations, say T and S, the composition, denoted as T ∘ S, takes an input vector, applies S to it, and then applies T to the result. This concept highlights how different transformations can work together, and it connects directly to matrix multiplication, where the matrix representation of these transformations can be multiplied in the same order.
Computer graphics: Computer graphics is the field of visual computing that involves generating, manipulating, and representing images and visual information through computer technology. It plays a crucial role in various applications such as video games, simulations, and visual effects, relying heavily on mathematical concepts including linear transformations and matrices to create realistic visuals and animations.
Coordinate vector: A coordinate vector is a representation of a vector in terms of its components relative to a specific basis in a vector space. This representation allows us to express the vector as a linear combination of the basis vectors, providing a clear understanding of its position and direction in that space. It serves as a crucial link between abstract vectors and their numerical representations, especially when discussing linear transformations and matrices.
Data Science: Data science is an interdisciplinary field that uses scientific methods, processes, algorithms, and systems to extract knowledge and insights from structured and unstructured data. It combines techniques from statistics, mathematics, computer science, and domain expertise to analyze and interpret complex data sets, enabling informed decision-making.
Determinant: The determinant is a scalar value that can be computed from the elements of a square matrix and provides important information about the matrix, such as whether it is invertible and its scaling factor in linear transformations. In the context of linear transformations, the determinant indicates how much a transformation scales area or volume, while in eigenvalues and eigenvectors, it helps determine the nature of solutions to linear equations represented by matrices.
Identity matrix: An identity matrix is a square matrix in which all the elements of the principal diagonal are ones, and all other elements are zeros. This special type of matrix acts as the multiplicative identity in matrix multiplication, meaning that when any matrix is multiplied by an identity matrix of compatible size, the original matrix remains unchanged. Identity matrices are essential in linear transformations, as they represent the transformation that leaves vectors unchanged.
Identity transformation: The identity transformation is a special type of linear transformation that maps every vector in a vector space to itself, effectively leaving the vector unchanged. This transformation serves as the foundational example of linear transformations, illustrating the concept of a mapping that maintains the structure of the vector space without any alteration. It is represented by the identity matrix, which plays a crucial role in understanding matrix operations and properties in linear algebra.
Image: In mathematics, the image refers to the set of all output values that a function or mapping can produce from a given set of input values. It represents how elements from one space are transformed into another, emphasizing the relationship between the input and output. This concept is essential in understanding how different structures interact through mappings, showcasing their inherent properties and behaviors.
Invertibility: Invertibility refers to the property of a matrix or a linear transformation that allows for an inverse to exist. If a matrix is invertible, it means there is another matrix that, when multiplied by the original, results in the identity matrix. This concept is crucial because it indicates that a linear transformation can be undone, providing a means to recover original input from output.
Kernel: The kernel is a fundamental concept that refers to the set of elements in a mathematical structure that map to the identity element under a given operation. In the context of algebraic structures, it serves as a critical tool for understanding homomorphisms and linear transformations, revealing how these mappings behave and what properties are preserved. This makes it essential for studying structures like groups, rings, and vector spaces, where kernels provide insight into the underlying relationships and dimensionality involved in these systems.
Linear transformation: A linear transformation is a function between two vector spaces that preserves the operations of vector addition and scalar multiplication. This means that if you take any two vectors and apply the transformation, the result will be the same as if you transformed each vector separately and then added them together. Linear transformations can be represented using matrices, which helps in understanding their properties and effects on vectors.
Matrix addition: Matrix addition is the operation of adding two matrices by combining their corresponding elements. This operation is fundamental in linear algebra and is essential for understanding linear transformations, as it allows for the manipulation of vectors and transformations in multi-dimensional space.
Matrix multiplication: Matrix multiplication is a binary operation that produces a new matrix from two given matrices by combining their elements according to specific rules. This process involves taking the rows of the first matrix and the columns of the second matrix, performing element-wise multiplication, and summing the results to form the entries of the resulting matrix. This operation is fundamental in understanding linear transformations, as matrices can represent these transformations and their compositions.
Matrix representation: Matrix representation refers to the way in which a linear transformation can be expressed using a matrix. This connection is crucial because it allows us to perform operations on linear transformations using matrix algebra, making it easier to analyze and compute results. By representing transformations in this way, we can simplify calculations and understand the relationships between different spaces and dimensions.
Null space: The null space of a matrix is the set of all vectors that, when multiplied by the matrix, result in the zero vector. This concept is crucial for understanding linear independence and bases, as well as analyzing linear transformations represented by matrices. The null space reveals important properties about the solutions to linear equations, showing which vectors can be mapped to zero and thus indicating dependencies among vectors in a vector space.
Range: The range of a function is the set of all possible output values that result from applying the function to its entire domain. This concept is fundamental because it helps to understand what values can actually be produced by a function and how those outputs relate to the inputs, linking the notion of functions to various types, properties, and transformations.
Rank-nullity theorem: The rank-nullity theorem is a fundamental result in linear algebra that relates the dimensions of the kernel and image of a linear transformation to the dimension of its domain. It states that for any linear transformation from a vector space V to a vector space W, the sum of the rank (the dimension of the image) and the nullity (the dimension of the kernel) equals the dimension of V. This theorem highlights the intrinsic balance between how much information is retained and lost in linear mappings.
Row echelon form: Row echelon form is a specific arrangement of a matrix where all non-zero rows are above any rows of all zeros, and the leading coefficient (the first non-zero number from the left, also called the pivot) of each non-zero row is to the right of the leading coefficient of the previous row. This form is crucial for solving systems of linear equations and is connected to concepts such as linear transformations and matrices since it helps identify the solutions and structure of these systems.
Similarity Transformations: Similarity transformations are geometric operations that preserve the shape of figures while allowing for changes in size. This means that the angles of the figures remain the same, while the sides may be proportionally scaled, thus maintaining their relative dimensions. In the context of linear transformations and matrices, similarity transformations can be represented through matrix operations that demonstrate how one geometric figure can be transformed into another similar figure.
Square matrix: A square matrix is a matrix with the same number of rows and columns, which means its dimensions are n x n for some integer n. This unique property allows square matrices to have special characteristics and operations, such as determinants, eigenvalues, and invertibility, that do not apply to rectangular matrices. Square matrices play a crucial role in linear transformations, providing a way to represent and manipulate geometric transformations in a consistent manner.
Zero matrix: A zero matrix is a matrix in which all of its elements are zero. It plays a significant role in linear algebra, especially in the context of linear transformations and matrices, where it acts as the additive identity. This means that when you add a zero matrix to any other matrix of the same dimensions, the result is the original matrix, making it crucial for understanding operations involving matrices.
Zero Vector: The zero vector is a special vector in a vector space that has all its components equal to zero. It acts as the additive identity, meaning that when it is added to any other vector in that space, it does not change the value of that vector. This concept is essential for understanding linear transformations and matrices, as it helps define the properties of vector spaces and their behavior under various operations.