Matrix transformations are the bridge between abstract linear algebra and the physical world—they describe how vectors move, stretch, rotate, and project in space. When you're solving systems of differential equations, analyzing stability, or working with change-of-basis problems, you're fundamentally working with transformations. The exam will test whether you understand why a transformation behaves the way it does, not just whether you can multiply matrices.
You're being tested on your ability to recognize transformation types from their matrices, predict geometric effects, compose transformations correctly, and connect eigenvalue analysis to transformation behavior. Don't just memorize the matrix forms—know what each transformation does to the standard basis vectors, how transformations combine, and when a transformation is invertible. Master these concepts, and you'll handle everything from geometric problems to differential equation systems with confidence.
Transformations That Preserve Structure
These transformations maintain key geometric properties like distances, angles, or orientation. Understanding what each transformation preserves versus changes is crucial for exam questions.
Identity Transformation
Leaves all vectors unchanged—the "do nothing" transformation that maps every vector to itself
Represented by the identity matrix $I$, where $Iv = v$ for all vectors $v$
Serves as the multiplicative identity for matrix composition; essential baseline for understanding inverse transformations
Rotation
Rotates vectors around the origin by angle θ while preserving distances and angles between vectors
2D rotation matrix: $R(\theta) = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$—orthogonal with determinant 1
Eigenvalues are complex ($e^{\pm i\theta}$), which explains why rotation has no real eigenvectors except at $\theta = 0$ or $\pi$
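A quick way to internalize these facts is to check them numerically. The sketch below is a minimal NumPy example; the angle is an arbitrary illustrative choice:

```python
import numpy as np

theta = np.pi / 3  # arbitrary angle chosen for the check
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(R.T @ R, np.eye(2)))    # True: orthogonal, so lengths are preserved
print(np.isclose(np.linalg.det(R), 1.0))  # True: determinant 1, orientation preserved
print(np.linalg.eigvals(R))               # complex conjugate pair e^{+i*theta}, e^{-i*theta}
```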
Reflection
Flips vectors across a line or plane, reversing orientation while preserving distances
Reflection matrix is orthogonal with determinant −1—this sign change indicates orientation reversal
Has eigenvalues 1 and −1, corresponding to vectors parallel and perpendicular to the reflection axis
Compare: Rotation vs. Reflection—both are orthogonal transformations (preserve lengths), but rotation has det=1 while reflection has det=−1. If an FRQ asks about orientation-preserving transformations, rotation is your example; for orientation-reversing, use reflection.
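To see the determinant test from this comparison in action, here is a small NumPy sketch; reflection across the x-axis is just one concrete choice of reflection line:

```python
import numpy as np

# Reflection across the x-axis (one concrete choice of reflection line)
F = np.array([[1.0,  0.0],
              [0.0, -1.0]])

print(np.allclose(F.T @ F, np.eye(2)))  # True: orthogonal, lengths preserved
print(np.linalg.det(F))                 # -1.0: orientation is reversed
print(np.linalg.eigvals(F))             # [ 1., -1.]: along the axis and perpendicular to it
```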
Transformations That Change Size or Shape
These transformations alter geometric properties like length, area, or angles. Pay attention to how the determinant relates to area/volume scaling.
Scaling
Stretches or compresses vectors along coordinate axes by factors $s_x, s_y, \ldots$
Diagonal scaling matrix: $S = \begin{bmatrix} s_x & 0 \\ 0 & s_y \end{bmatrix}$—eigenvalues equal the scaling factors
Determinant equals $s_x \cdot s_y$, representing the area scale factor; uniform scaling ($s_x = s_y$) preserves angles
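A short NumPy sketch confirms that the eigenvalues and determinant behave as described; the scale factors here are illustrative values only:

```python
import numpy as np

sx, sy = 2.0, 3.0            # illustrative scale factors
S = np.diag([sx, sy])

print(np.linalg.eigvals(S))  # [2., 3.]: eigenvalues are the scale factors
print(np.linalg.det(S))      # 6.0 = sx * sy: areas scale by this factor
```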
Shear
Distorts shape by sliding layers parallel to an axis while keeping one direction fixed
Shear matrix example: $\begin{bmatrix} 1 & k \\ 0 & 1 \end{bmatrix}$ shears horizontally by factor $k$—rectangles become slanted parallelograms
Determinant equals 1 (area preserved), but angles and distances change; eigenvalue is 1 with algebraic multiplicity 2
Compare: Scaling vs. Shear—scaling changes area (unless det=1) and preserves axis directions, while shear preserves area but distorts angles. Both have real eigenvalues, but shear matrices are often defective (not diagonalizable).
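The same kind of numerical check works for shear; the factor $k$ below is an arbitrary illustrative value:

```python
import numpy as np

k = 1.5                               # illustrative shear factor
H = np.array([[1.0,   k],
              [0.0, 1.0]])

print(np.linalg.det(H))               # 1.0: area is preserved
print(np.linalg.eigvals(H))           # [1., 1.]: eigenvalue 1 with multiplicity 2
print(H @ np.array([0.0, 1.0]))       # [1.5, 1.]: e2 gets slanted while e1 stays fixed
```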
Transformations That Reduce Dimension
Projections collapse space onto a subspace, making them essential for least-squares problems and understanding rank.
Projection
Maps vectors onto a subspace (line, plane, etc.) by finding the closest point in that subspace
Projection matrices satisfy $P^2 = P$ (idempotent)—applying the projection twice gives the same result
Eigenvalues are only 0 and 1; vectors in the target subspace have eigenvalue 1, vectors in the orthogonal complement have eigenvalue 0
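Here is a minimal NumPy sketch of these properties, projecting onto a line whose direction vector is an arbitrary choice for illustration:

```python
import numpy as np

a = np.array([[1.0], [2.0]])        # direction of the target line (illustrative choice)
P = (a @ a.T) / (a.T @ a)           # orthogonal projection onto the line spanned by a

print(np.allclose(P @ P, P))        # True: idempotent, P^2 = P
print(np.linalg.eigvals(P))         # approximately [0., 1.]: the only possible eigenvalues
```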
Translation (in Homogeneous Coordinates)
Shifts all points by a fixed vector without rotation or scaling—not a linear transformation in standard coordinates
Requires homogeneous coordinates: $\begin{bmatrix} 1 & 0 & t_x \\ 0 & 1 & t_y \\ 0 & 0 & 1 \end{bmatrix}$ embeds translation into matrix multiplication
Essential for computer graphics where combining translation with other transformations requires a unified matrix framework
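A small NumPy sketch (the translation vector and the point are illustrative values) shows how the homogeneous-coordinate matrix carries out a shift through ordinary matrix multiplication:

```python
import numpy as np

tx, ty = 3.0, -2.0                  # illustrative translation vector
T = np.array([[1.0, 0.0, tx],
              [0.0, 1.0, ty],
              [0.0, 0.0, 1.0]])

p = np.array([1.0, 1.0, 1.0])       # the point (1, 1) written in homogeneous coordinates
print(T @ p)                        # [4., -1., 1.]: the point shifted by (tx, ty)
```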
Compare: Projection vs. Identity—both have eigenvalue 1, but projection also has eigenvalue 0 (for the nullspace). If a matrix satisfies $P^2 = P$ and $P \neq I$, it's a projection onto a proper subspace.
Combining and Analyzing Transformations
These concepts let you work with transformations as a system—composing them, reversing them, and understanding their fundamental behavior.
Composition of Transformations
Multiply matrices right-to-left: $T_2 T_1 v$ applies $T_1$ first, then $T_2$—order matters!
Non-commutativity is critical: rotation then scaling ≠ scaling then rotation in general
Determinants multiply: det(AB)=det(A)det(B), so composed area scale factors combine multiplicatively
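To see non-commutativity concretely, the sketch below composes a 90-degree rotation with a stretch along the x-axis; both transformations are chosen arbitrarily for illustration:

```python
import numpy as np

theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotate 90 degrees counterclockwise
S = np.diag([2.0, 1.0])                           # stretch x by a factor of 2

v = np.array([1.0, 0.0])
print(S @ R @ v)   # rotate first, then scale: approximately [0., 1.]
print(R @ S @ v)   # scale first, then rotate: approximately [0., 2.] (order matters)
print(np.isclose(np.linalg.det(S @ R),
                 np.linalg.det(S) * np.linalg.det(R)))  # True: determinants multiply
```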
Inverse Transformations
Reverses the original transformation: $T^{-1}T = TT^{-1} = I$
Exists only when $\det(T) \neq 0$—singular matrices destroy information and can't be undone
For orthogonal matrices (rotations, reflections), $T^{-1} = T^T$, making inversion computationally simple
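A quick NumPy check of both facts, using an arbitrary rotation angle and a deliberately singular matrix as examples:

```python
import numpy as np

theta = 0.7                                  # arbitrary rotation angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(np.allclose(np.linalg.inv(R), R.T))    # True: for orthogonal matrices, inverse = transpose

singular = np.array([[1.0, 2.0],
                     [2.0, 4.0]])            # columns are dependent, so det = 0
print(np.linalg.det(singular))               # 0.0 (up to rounding): no inverse exists
```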
Eigenvalue Decomposition
Factors a diagonalizable matrix as $A = PDP^{-1}$, where $D$ contains eigenvalues and $P$ contains eigenvectors
Reveals transformation geometry: eigenvalues show scaling along eigenvector directions; complex eigenvalues indicate rotation
Powers become trivial: $A^n = PD^nP^{-1}$, essential for solving systems like $x' = Ax$ in differential equations
Compare: Composition vs. Eigenvalue Decomposition—composition combines different transformations sequentially, while eigenvalue decomposition breaks one transformation into scaling along special directions. For repeated application of the same transformation, eigenvalue decomposition is far more efficient.
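Here is a minimal NumPy sketch of the decomposition, using a small diagonalizable matrix chosen only for illustration:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])              # illustrative diagonalizable matrix (eigenvalues 5 and 2)

vals, P = np.linalg.eig(A)              # columns of P are eigenvectors
D = np.diag(vals)
P_inv = np.linalg.inv(P)

print(np.allclose(A, P @ D @ P_inv))    # True: A = P D P^{-1}
print(np.allclose(np.linalg.matrix_power(A, 5),
                  P @ np.diag(vals**5) @ P_inv))   # True: A^n = P D^n P^{-1}
```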
Quick Reference Table
| Concept | Best Examples |
| --- | --- |
| Distance-preserving (orthogonal) | Rotation, Reflection, Identity |
| Orientation-preserving | Rotation, Scaling (positive det), Identity |
| Orientation-reversing | Reflection, Scaling (negative det) |
| Area-preserving (det = 1) | Rotation, Shear |
| Dimension-reducing | Projection |
| Requires homogeneous coordinates | Translation |
| Always diagonalizable | Scaling, Projection, Reflection |
| May be defective | Shear |
Self-Check Questions
Which two transformations are orthogonal (preserve distances), and how can you distinguish them using the determinant?
A matrix satisfies $P^2 = P$ but $P \neq I$. What type of transformation is this, and what are its possible eigenvalues?
Compare and contrast scaling and shear: which preserves area, which preserves axis directions, and how do their eigenvalue structures differ?
If you apply rotation by θ followed by rotation by ϕ, what single transformation results? Does the order matter for rotations about the same axis?
(FRQ-style) Given a transformation matrix with complex eigenvalues $\lambda = a \pm bi$ where $a^2 + b^2 = 1$, explain geometrically what this transformation does to vectors and why it has no real eigenvectors.