Unit 2 Review: Linear Transformations
Linear transformations are fundamental objects in linear algebra: maps between vector spaces that preserve vector addition and scalar multiplication. They describe how vectors behave under operations such as rotation, reflection, and projection, and they appear throughout mathematics and its applications.
These transformations can be represented by matrices, which makes computation efficient. Key concepts include linearity, the kernel, the range, and the rank-nullity theorem; common types include rotations, reflections, and projections, with applications in computer graphics, quantum mechanics, and machine learning.
- Linear transformations map vectors from one vector space to another while preserving vector addition and scalar multiplication
- Denoted as $T: V \rightarrow W$, where $V$ and $W$ are vector spaces and $T$ is the linear transformation
- For any vectors $\vec{u}, \vec{v} \in V$ and scalar $c$, a linear transformation satisfies:
- Additivity: $T(\vec{u} + \vec{v}) = T(\vec{u}) + T(\vec{v})$
- Homogeneity: $T(c\vec{u}) = cT(\vec{u})$
- Linear transformations can be represented using matrices, with the matrix acting on the input vector to produce the output vector
- Examples of linear transformations include rotations, reflections, and projections in 2D or 3D space
- Linear transformations preserve the origin, meaning $T(\vec{0}) = \vec{0}$
- Geometrically, linear transformations keep grid lines parallel and evenly spaced: straight lines map to straight lines (or collapse to a point)
- Linearity is the defining property of linear transformations, consisting of additivity and homogeneity
- Linear transformations are uniquely determined by their action on basis vectors
- If $\{\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_n\}$ is a basis for $V$, then $T$ is completely determined by $T(\vec{v}_1), T(\vec{v}_2), \ldots, T(\vec{v}_n)$
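A minimal NumPy sketch of checking the additivity and homogeneity conditions above; the matrix $A$ and the test vectors are arbitrary illustrative choices:

```python
import numpy as np

# Illustrative linear map T(v) = A v on R^2; A is an arbitrary choice
A = np.array([[2.0, -1.0],
              [1.0,  1.0]])
T = lambda v: A @ v

u = np.array([1.0, 2.0])
v = np.array([-3.0, 0.5])
c = 4.0

print(np.allclose(T(u + v), T(u) + T(v)))  # additivity  -> True
print(np.allclose(T(c * u), c * T(u)))     # homogeneity -> True
```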
- The kernel (or null space) of a linear transformation $T$ is the set of all vectors $\vec{v} \in V$ such that $T(\vec{v}) = \vec{0}$
- Denoted as $\ker(T)$ or $\text{null}(T)$
- The range (or image) of a linear transformation $T$ is the set of all vectors $\vec{w} \in W$ such that $\vec{w} = T(\vec{v})$ for some $\vec{v} \in V$
- Denoted as $\text{range}(T)$ or $\text{im}(T)$
- The rank of a linear transformation is the dimension of its range
- The nullity of a linear transformation is the dimension of its kernel
- The rank-nullity theorem states that for a linear transformation $T: V \rightarrow W$, $\dim(V) = \text{rank}(T) + \text{nullity}(T)$
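A quick numerical check of the rank-nullity theorem, sketched with NumPy and SciPy; the matrix is an illustrative choice with a dependent row so that the kernel is nontrivial:

```python
import numpy as np
from scipy.linalg import null_space

# Illustrative matrix for T: R^3 -> R^3; the third row is the sum of the
# first two, so the kernel is nontrivial
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 2.0, 1.0]])

rank = np.linalg.matrix_rank(A)        # dim(range(T))
nullity = null_space(A).shape[1]       # dim(ker(T)), computed independently

print(rank, nullity, rank + nullity)   # 2 1 3 = dim(V)
```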
- If $T: \mathbb{R}^n \rightarrow \mathbb{R}^m$ is a linear transformation and $\{\vec{e}_1, \vec{e}_2, \ldots, \vec{e}_n\}$ is the standard basis for $\mathbb{R}^n$, then the matrix representation of $T$ is:
- $A = [T(\vec{e}_1) \quad T(\vec{e}_2) \quad \ldots \quad T(\vec{e}_n)]$
- The columns of the matrix $A$ are the images of the basis vectors under the linear transformation $T$
- For any vector $\vec{v} \in \mathbb{R}^n$, $T(\vec{v}) = A\vec{v}$, where $A\vec{v}$ represents matrix-vector multiplication
- The matrix representation depends on the choice of bases for the domain and codomain vector spaces
- Changing the bases results in a different matrix representation for the same linear transformation
- The matrix representation allows for efficient computation and analysis of linear transformations using matrix algebra
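A sketch of building the matrix representation column by column from the images of the standard basis vectors; the map used here happens to match the first practice problem below, purely as an illustration:

```python
import numpy as np

# T(x, y) = (2x - y, x + y), written as a plain function first
def T(v):
    x, y = v
    return np.array([2*x - y, x + y])

# Columns of A are the images of the standard basis vectors
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
A = np.column_stack([T(e1), T(e2)])
print(A)  # [[ 2. -1.]
          #  [ 1.  1.]]

# For any v, T(v) = A v
v = np.array([3.0, -2.0])
print(np.allclose(T(v), A @ v))  # True
```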
- Rotation: A linear transformation that rotates vectors by a specified angle around the origin
- In 2D, a rotation by angle $\theta$ is represented by the matrix $\begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$
- Reflection: A linear transformation that reflects vectors across a line or plane passing through the origin
- In 2D, a reflection across the x-axis is represented by the matrix $\begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}$
- Projection: A linear transformation that projects vectors onto a specified subspace
- In 2D, a projection onto the x-axis is represented by the matrix $\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}$
- Scaling: A linear transformation that stretches or compresses vectors by a specified factor along each coordinate axis
- In 2D, scaling by factors $a$ and $b$ is represented by the matrix $\begin{bmatrix} a & 0 \\ 0 & b \end{bmatrix}$
- Shear: A linear transformation that shifts vectors parallel to a coordinate axis by an amount proportional to their coordinate along another axis
- In 2D, a horizontal shear by factor $k$ is represented by the matrix $\begin{bmatrix} 1 & k \\ 0 & 1 \end{bmatrix}$
- Identity transformation: A linear transformation that maps every vector to itself, represented by the identity matrix
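The standard 2D matrices above, assembled in NumPy and applied to a sample vector; the angle and the scaling/shear factors are illustrative values:

```python
import numpy as np

theta = np.pi / 4        # rotation angle (illustrative)
a, b, k = 2.0, 3.0, 0.5  # scaling factors and shear factor (illustrative)

rotation  = np.array([[np.cos(theta), -np.sin(theta)],
                      [np.sin(theta),  np.cos(theta)]])
reflect_x = np.array([[1.0, 0.0], [0.0, -1.0]])  # across the x-axis
project_x = np.array([[1.0, 0.0], [0.0,  0.0]])  # onto the x-axis
scaling   = np.array([[a,   0.0], [0.0,  b]])
shear_h   = np.array([[1.0, k],   [0.0,  1.0]])  # horizontal shear

v = np.array([1.0, 1.0])
for name, M in [("rotation", rotation), ("reflection", reflect_x),
                ("projection", project_x), ("scaling", scaling),
                ("shear", shear_h)]:
    print(name, M @ v)
```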
- Composition of linear transformations is the process of applying one linear transformation followed by another
- If $T: V \rightarrow W$ and $S: W \rightarrow U$ are linear transformations, their composition $S \circ T: V \rightarrow U$ is defined as $(S \circ T)(\vec{v}) = S(T(\vec{v}))$ for all $\vec{v} \in V$
- The composition of linear transformations is associative: $(T \circ S) \circ R = T \circ (S \circ R)$
- The matrix representation of the composition of linear transformations is the product of their individual matrix representations
- If $A$ and $B$ are the matrix representations of $T$ and $S$, respectively, then the matrix representation of $S \circ T$ is $BA$
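A sketch of the order-of-multiplication rule: with $A$ the matrix of $T$ and $B$ the matrix of $S$, applying $T$ first means the matrix of $S \circ T$ is $BA$. The matrices here are illustrative choices:

```python
import numpy as np

A = np.array([[1.0, -1.0], [1.0,  1.0]])  # matrix of T (illustrative)
B = np.array([[3.0,  0.0], [0.0, -2.0]])  # matrix of S (illustrative)
v = np.array([2.0, 1.0])

# (S o T)(v) = S(T(v)) corresponds to the product B A
print(np.allclose(B @ (A @ v), (B @ A) @ v))  # True
print(np.allclose(B @ A, A @ B))              # False: order matters
```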
- The inverse of a linear transformation $T: V \rightarrow V$ is a linear transformation $T^{-1}: V \rightarrow V$ such that $T \circ T^{-1} = T^{-1} \circ T = I$, where $I$ is the identity transformation
- A linear transformation is invertible (or nonsingular) if and only if it is bijective (one-to-one and onto)
- The matrix representation of the inverse of a linear transformation is the inverse of its matrix representation
- If $A$ is the matrix representation of $T$, then $A^{-1}$ is the matrix representation of $T^{-1}$, provided $A$ is invertible
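Verifying the inverse relationship numerically; the matrix is an illustrative invertible choice:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])   # det(A) = 1, so A is invertible (illustrative)

A_inv = np.linalg.inv(A)

# T o T^{-1} = T^{-1} o T = I translates to A A^{-1} = A^{-1} A = I
print(np.allclose(A @ A_inv, np.eye(2)))  # True
print(np.allclose(A_inv @ A, np.eye(2)))  # True
```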
- An eigenvector of a linear transformation $T: V \rightarrow V$ is a nonzero vector $\vec{v} \in V$ such that $T(\vec{v}) = \lambda\vec{v}$ for some scalar $\lambda$
- The scalar $\lambda$ is called the eigenvalue corresponding to the eigenvector $\vec{v}$
- Eigenvectors are vectors that, when acted upon by a linear transformation, only change by a scalar factor (the eigenvalue)
- To find the eigenvalues of a linear transformation, solve the characteristic equation $\det(A - \lambda I) = 0$, where $A$ is the matrix representation of $T$ and $I$ is the identity matrix
- For each eigenvalue $\lambda$, find the corresponding eigenvectors by solving the equation $(A - \lambda I)\vec{v} = \vec{0}$
- Eigenvectors corresponding to distinct eigenvalues are linearly independent
- The set of all eigenvectors corresponding to an eigenvalue, along with the zero vector, forms an eigenspace
- Eigenvalues and eigenvectors have numerous applications, such as in matrix diagonalization, systems of differential equations, and principal component analysis
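A sketch of computing eigenvalues and eigenvectors with NumPy and checking the defining equation $T(\vec{v}) = \lambda\vec{v}$; the symmetric matrix is an illustrative choice with real eigenvalues:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # illustrative; eigenvalues are 3 and 1

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)           # [3. 1.] (order may vary)

# Columns of `eigenvectors` are unit eigenvectors; check A v = lambda v
for lam, v in zip(eigenvalues, eigenvectors.T):
    print(np.allclose(A @ v, lam * v))  # True
```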
- Computer graphics: Linear transformations are used to manipulate and transform 2D and 3D objects, such as in video games and animations
- Rotations, reflections, scaling, and shearing are common linear transformations used in computer graphics
- Image processing: Linear transformations are used to apply filters, enhance features, and perform image compression
- Examples include edge detection, blurring, and color space conversions
- Quantum mechanics: Linear transformations are used to describe the evolution of quantum states and the action of quantum operators
- Unitary transformations, which preserve inner products, are particularly important in quantum mechanics
- Cryptography: Linear transformations are used in various encryption and decryption algorithms
- The Hill cipher, for example, uses matrix multiplication to encrypt and decrypt messages
- Machine learning: Linear transformations are used in feature extraction, dimensionality reduction, and data preprocessing
- Principal component analysis (PCA) and linear discriminant analysis (LDA) are examples of linear transformation techniques used in machine learning
- Robotics: Linear transformations are used to describe the motion and orientation of robotic arms and manipulators
- Homogeneous coordinates and transformation matrices are commonly used in robotics to represent translations and rotations in 3D space
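A small sketch of the homogeneous-coordinates idea from the robotics bullet above: a 3×3 matrix encodes a 2D rotation together with a translation, even though translation by itself is not a linear map on $\mathbb{R}^2$. The angle and offset are illustrative:

```python
import numpy as np

theta, tx, ty = np.pi / 2, 5.0, 0.0   # illustrative rotation and translation

# 2D rigid motion in homogeneous coordinates: rotate, then translate
H = np.array([[np.cos(theta), -np.sin(theta), tx],
              [np.sin(theta),  np.cos(theta), ty],
              [0.0,            0.0,           1.0]])

p = np.array([1.0, 0.0, 1.0])   # the point (1, 0) with homogeneous coord 1
print(H @ p)                    # ~[5. 1. 1.]: rotated 90 degrees, shifted by 5
```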
Practice Problems and Examples
- Given the linear transformation $T: \mathbb{R}^2 \rightarrow \mathbb{R}^2$ defined by $T(x, y) = (2x - y, x + y)$, find the matrix representation of $T$ with respect to the standard basis.
- Determine whether the transformation $T: \mathbb{R}^3 \rightarrow \mathbb{R}^3$ defined by $T(x, y, z) = (x + y, y + z, z)$ is linear. If it is linear, find its kernel and range.
- Given the matrix $A = \begin{bmatrix} 1 & 2 \\ -1 & 3 \end{bmatrix}$, find the linear transformation $T: \mathbb{R}^2 \rightarrow \mathbb{R}^2$ represented by $A$.
- Consider the linear transformations $S(x, y) = (x - y, x + y)$ and $T(x, y) = (3x, -2y)$. Find the matrix representations of $S$ and $T$, and then find the matrix representation of the composition $T \circ S$.
- Find the eigenvalues and corresponding eigenvectors of the linear transformation represented by the matrix $A = \begin{bmatrix} 4 & -2 \\ 1 & 3 \end{bmatrix}$.
- A linear transformation $T: \mathbb{R}^3 \rightarrow \mathbb{R}^3$ is defined by $T(1, 0, 0) = (2, 1, 1)$, $T(0, 1, 0) = (1, 2, 1)$, and $T(0, 0, 1) = (1, 1, 2)$. Find the matrix representation of $T$ and determine whether $T$ is invertible. If it is invertible, find the matrix representation of $T^{-1}$.
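For self-checking, the last problem can be verified numerically; this is only a sanity check, not a substitute for the hand computation. The columns of $A$ are the given images of the standard basis vectors:

```python
import numpy as np

# Columns are T(1,0,0), T(0,1,0), T(0,0,1) from the last problem
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

print(np.linalg.det(A))   # ~4.0, nonzero, so T is invertible
print(np.linalg.inv(A))   # the matrix representation of T^{-1}
```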