Linear transformations are the workhorses of data science—every time you rotate an image, reduce dimensions with PCA, or normalize features, you're applying a linear transformation. Understanding these operations means understanding how data moves, stretches, and projects through the mathematical spaces that power machine learning algorithms. You're being tested on your ability to recognize what each transformation does geometrically, how its matrix structure produces that effect, and when to apply it in practice.
Don't just memorize the matrix formulas. Know what concept each transformation illustrates: Does it preserve distances? Change orientation? Reduce dimensionality? The difference between a rotation and a reflection might seem subtle on paper, but understanding their geometric and algebraic properties will help you debug algorithms, interpret results, and ace exam questions that ask you to identify or construct the right transformation for a given task.
These transformations maintain distances between points—objects keep their size and shape, just repositioned. Isometric transformations have orthogonal matrices Q, where QᵀQ = I.
Compare: Rotation vs. Reflection—both preserve distances and angles, but rotations have determinant +1 (preserve orientation) while reflections have determinant -1 (reverse orientation). If asked to classify an isometry, check the determinant first.
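The determinant check above can be verified numerically. A minimal sketch in NumPy (the 45° angle and x-axis reflection are arbitrary choices for illustration):

```python
import numpy as np

theta = np.pi / 4  # hypothetical 45-degree rotation

# Rotation matrix: orthogonal, determinant +1 (preserves orientation)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Reflection across the x-axis: orthogonal, determinant -1 (reverses orientation)
F = np.array([[1.0,  0.0],
              [0.0, -1.0]])

# Both are isometries: Q^T Q = I, so all distances are preserved
assert np.allclose(R.T @ R, np.eye(2))
assert np.allclose(F.T @ F, np.eye(2))

# The determinant sign is what distinguishes them
assert np.isclose(np.linalg.det(R),  1.0)
assert np.isclose(np.linalg.det(F), -1.0)
```

This is exactly the classification recipe: confirm orthogonality first, then read off the determinant sign.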
These transformations stretch, compress, or independently rescale coordinates. The eigenvalues of these matrices directly correspond to the scaling factors along principal directions.
Compare: Scaling vs. Diagonal matrices—scaling transformations are diagonal matrices, but "diagonal matrix" is the broader algebraic category. When discussing geometry, say "scaling"; when discussing computation or eigendecomposition, say "diagonal."
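The eigenvalue–scaling-factor correspondence is easy to see directly. A small sketch, using hypothetical factors of 3 and 0.5:

```python
import numpy as np

# Scaling transformation: a diagonal matrix whose entries are the
# per-axis scale factors (stretch x by 3, compress y by half)
S = np.diag([3.0, 0.5])

# The eigenvalues of a diagonal matrix are its diagonal entries,
# i.e. exactly the scaling factors along the coordinate axes
eigenvalues = np.linalg.eigvals(S)
assert np.allclose(sorted(eigenvalues), [0.5, 3.0])

# Applying S rescales each coordinate independently
v = np.array([1.0, 2.0])
assert np.allclose(S @ v, [3.0, 1.0])
```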
These transformations change angles or collapse dimensions—shapes don't stay congruent. Shears preserve area but not angles; projections reduce rank and lose information.
Compare: Shear vs. Projection—shears are invertible (determinant ≠ 0) and preserve dimensionality, while projections are typically singular and reduce rank. If an FRQ asks about dimensionality reduction, projection is your answer; if it asks about invertible distortion, think shear.
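The invertible-vs-singular contrast can be sketched concretely. Here a horizontal shear (hypothetical factor 2) is compared with projection onto the x-axis:

```python
import numpy as np

# Horizontal shear: slides x in proportion to y (hypothetical factor 2)
H = np.array([[1.0, 2.0],
              [0.0, 1.0]])

# Projection onto the x-axis: collapses the y-component entirely
P = np.array([[1.0, 0.0],
              [0.0, 0.0]])

# Shear: determinant 1, so it preserves area and is invertible
assert np.isclose(np.linalg.det(H), 1.0)
H_inv = np.linalg.inv(H)  # exists because det != 0

# Projection: singular and rank-deficient -- one dimension is lost
assert np.isclose(np.linalg.det(P), 0.0)
assert np.linalg.matrix_rank(P) == 1
```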
These handle special cases: doing nothing, collapsing everything, or incorporating translation. Understanding these edge cases clarifies what "linear transformation" really means.
Compare: Identity vs. Zero transformation—both are trivial but opposite extremes. Identity preserves everything (full rank, all eigenvalues = 1); zero destroys everything (rank 0, all eigenvalues = 0). These bookend the spectrum of linear transformations.
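These two extremes can be confirmed in a few lines (using 3×3 matrices as an arbitrary example size):

```python
import numpy as np

I = np.eye(3)         # identity: maps every vector to itself
Z = np.zeros((3, 3))  # zero transformation: maps every vector to the origin

assert np.linalg.matrix_rank(I) == 3          # full rank
assert np.linalg.matrix_rank(Z) == 0          # rank 0
assert np.allclose(np.linalg.eigvals(I), 1.0) # all eigenvalues = 1
assert np.allclose(np.linalg.eigvals(Z), 0.0) # all eigenvalues = 0
```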
| Concept | Best Examples |
|---|---|
| Distance-preserving (isometric) | Rotation, Reflection, Permutation |
| Scaling/stretching | Scaling transformation, Diagonal matrix |
| Shape distortion | Shear transformation |
| Dimensionality reduction | Projection matrix |
| Determinant = 1 (area-preserving) | Rotation, Shear |
| Determinant = -1 (orientation-reversing) | Reflection |
| Idempotent (P² = P) | Projection matrix, Identity |
| Non-linear made linear | Translation (via homogeneous coordinates) |
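The last row of the table—translation via homogeneous coordinates—is worth seeing in code. A minimal sketch (the 90° rotation, scale factor 2, and translation (5, −1) are hypothetical values):

```python
import numpy as np

theta = np.pi / 2  # hypothetical 90-degree rotation

# Each operation as a 3x3 homogeneous-coordinate matrix
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
S = np.diag([2.0, 2.0, 1.0])   # uniform scale by 2
T = np.array([[1.0, 0.0,  5.0],
              [0.0, 1.0, -1.0],
              [0.0, 0.0,  1.0]])  # translate by (5, -1)

# Compose into ONE matrix: rotate, then scale, then translate.
# Translation is not linear in 2D, but the extra homogeneous
# coordinate turns it into a matrix multiplication too.
M = T @ S @ R

p = np.array([1.0, 0.0, 1.0])  # the point (1, 0) in homogeneous form
# (1,0) -> rotate -> (0,1) -> scale -> (0,2) -> translate -> (5,1)
assert np.allclose(M @ p, [5.0, 1.0, 1.0])
```

Composing into a single matrix is why graphics pipelines and affine layers work in homogeneous coordinates: one multiplication applies the whole chain.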
1. Which two transformations are both isometric but differ in their determinant sign, and what does that sign indicate geometrically?
2. You apply a transformation twice and get the same result as applying it once. Which transformations have this property, and what is it called?
3. Compare and contrast scaling transformations and shear transformations: which preserves angles? Which preserves area? Which is always invertible?
4. If you need to combine rotation, scaling, and translation into a single matrix multiplication, what technique must you use and why?
5. A transformation has eigenvalues all equal to 1 but is not the identity matrix. Which transformation from this guide fits that description, and what does it do geometrically?