Why This Matters
The null space isn't just an abstract definition to memorize—it's the key to understanding when and why linear transformations lose information. When you're analyzing whether a system has unique solutions, determining if matrix columns are independent, or figuring out how many free variables exist in a solution set, you're working with null space concepts. Every question about injectivity, linear independence, and solution structure connects back to this fundamental idea.
You're being tested on your ability to connect null space to broader themes: the rank-nullity theorem, the relationship between a transformation's kernel and its injectivity, and how the null space reveals the "hidden" dependencies among matrix columns. Don't just memorize that null space is "vectors mapped to zero"—know why that matters for solving systems, how to compute it, and what it tells you about the matrix's structure.
Foundational Definitions
Before diving into applications, you need a rock-solid understanding of what the null space actually is. The null space captures every vector that gets "crushed" to zero by the matrix transformation.
Definition of Null Space
- The null space of matrix A is the set of all vectors x satisfying Ax=0—this is your starting point for everything else
- Represents all solutions to the homogeneous equation, meaning it describes every possible input that produces zero output
- Forms a subspace of Rⁿ (where n is the number of columns), so it's closed under addition and scalar multiplication
Dimension of Null Space (Nullity)
- Nullity equals the number of free variables when you solve Ax=0—count your free variables, and you've found your nullity
- Counts linearly independent solutions to the homogeneous system, giving you the "size" of the solution space
- Directly tied to rank through the rank-nullity theorem, making it essential for structural analysis of any matrix
Compare: Definition vs. Nullity—the definition tells you what null space is (a set of vectors), while nullity tells you how big it is (a single number). Exam questions often ask you to find nullity, not list the entire null space.
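The definition-versus-nullity distinction can be checked concretely. A minimal sketch using SymPy (the library choice and the example matrix are illustrations, not part of the original notes):

```python
from sympy import Matrix

# Example matrix with an obvious dependency: the second row is 2x the first
A = Matrix([[1, 2, 3],
            [2, 4, 6]])

rref, pivot_cols = A.rref()          # reduced row echelon form and pivot columns
num_free = A.cols - len(pivot_cols)  # free variables = non-pivot columns
nullity = len(A.nullspace())         # dimension of the null space

print(num_free, nullity)  # both are 2: nullity equals the number of free variables
```

The null space itself is a set of vectors; the nullity is the single number `2` here.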
Computing the Null Space
Knowing the definition means nothing if you can't actually find the null space. The process relies on row reduction and identifying which variables are free to take any value.
How to Find the Null Space of a Matrix
- Row reduce A to echelon form and solve Ax=0—pivot columns correspond to basic variables, non-pivot columns to free variables
- Express solutions parametrically by setting each free variable equal to a parameter (like t, s, etc.) and solving for basic variables
- The resulting vectors span the null space—these parametric vectors form a basis, and their count equals the nullity
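The steps above (row reduce, identify free variables, read off basis vectors) can be sketched with SymPy, which performs the same row-reduction procedure internally; the matrix is a made-up example:

```python
from sympy import Matrix

A = Matrix([[1, 2, 1, 0],
            [0, 0, 1, 1]])

basis = A.nullspace()        # one basis vector per free variable
for v in basis:
    # every basis vector solves the homogeneous system Ax = 0
    assert A * v == Matrix([0, 0])

print(len(basis))  # 2 -> columns 2 and 4 are non-pivot, so nullity is 2
```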
Null Space in the Context of Homogeneous Systems
- Every homogeneous system Ax=0 has the null space as its complete solution set—no need to find particular solutions
- Solutions are linear combinations of the null space basis vectors, so once you have the basis, you have all solutions
- Always consistent since x=0 is guaranteed to work—the question is whether non-trivial solutions exist
Compare: Finding null space vs. solving non-homogeneous systems—for Ax=0, the null space is the solution set. For Ax=b, you need a particular solution plus the null space. If an exam asks about general solutions, remember this structure.
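The "particular solution plus null space" structure can be verified directly. A small SymPy sketch (the system and the particular solution are hypothetical examples chosen for illustration):

```python
from sympy import Matrix, symbols

# A consistent system Ax = b where A has a non-trivial null space
A = Matrix([[1, 1],
            [2, 2]])
b = Matrix([3, 6])

x_p = Matrix([3, 0])        # one particular solution, found by inspection
assert A * x_p == b

(n,) = A.nullspace()        # a single null space basis vector here
t = symbols('t')
x_general = x_p + t * n     # particular solution + null space = every solution
assert A * x_general == b   # holds for every value of the parameter t
```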
Null Space and Linear Transformations
The null space reveals critical information about how a transformation behaves. It's the kernel of the transformation—the set of inputs that get mapped to nothing.
Null Space as the Kernel
- Null space equals the kernel of the linear transformation T(x)=Ax—these terms are interchangeable
- Determines injectivity: a transformation is one-to-one if and only if the null space contains only 0
- Measures information loss—larger null space means more distinct inputs collapse to the same output
Null Space and Linear Independence
- Trivial null space (only 0) means the columns of A are linearly independent—no column is a combination of others
- Non-trivial null space indicates linear dependence among columns—the null space vectors explicitly show the dependency relationships
- Nullity counts the "excess" columns beyond what's needed for independence, revealing redundancy in the column set
Compare: Kernel (transformation view) vs. column dependence (matrix view)—same null space, two interpretations. Exam questions might frame it either way, so recognize that "kernel contains only zero" and "columns are independent" are equivalent statements.
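The equivalence between "kernel contains only zero" and "columns are independent" is easy to see computationally. A SymPy sketch with two made-up matrices:

```python
from sympy import Matrix

independent = Matrix([[1, 0], [0, 1], [1, 1]])   # no column is a multiple of another
dependent   = Matrix([[1, 2], [2, 4], [3, 6]])   # column 2 = 2 * column 1

print(independent.nullspace())  # [] -> kernel is {0}, columns independent
print(dependent.nullspace())    # one basis vector, exhibiting the dependency
```

The null space vector of the dependent matrix spells out the dependency explicitly: its entries are the coefficients of a linear combination of the columns that equals zero.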
Structural Relationships
The null space doesn't exist in isolation—it's connected to rank and column space through precise mathematical relationships. The rank-nullity theorem is the central structural result you must know.
Connection Between Null Space and Rank
- Rank-nullity theorem: rank(A)+nullity(A)=n, where n is the number of columns—this is non-negotiable exam material
- Trade-off relationship: higher rank means lower nullity, and vice versa—the matrix's "power" to map independently vs. its tendency to collapse vectors
- Determines solvability: full column rank (nullity = 0) guarantees unique solutions when solutions exist
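The rank-nullity trade-off can be confirmed on any example. A SymPy sketch (the matrix is an assumption, built with a deliberate row dependency):

```python
from sympy import Matrix

A = Matrix([[1, 2, 0, 1],
            [0, 1, 1, 0],
            [1, 3, 1, 1]])   # row 3 = row 1 + row 2, so rank drops below 3

rank = A.rank()
nullity = len(A.nullspace())
assert rank + nullity == A.cols   # rank-nullity: the sum is the number of columns

print(rank, nullity)  # 2 and 2
```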
Relationship Between Null Space and Column Space
- Complementary dimensions: rank (dimension of column space) plus nullity equals the number of columns
- Orthogonal complement relationship: null space of A is orthogonal to the row space of A (not the column space directly)
- Together describe the transformation completely—column space shows what outputs are possible, null space shows what inputs are "invisible"
Properties of Null Space
- Always a vector space itself—closed under addition and scalar multiplication by definition
- Never empty: the zero vector 0 is always in the null space since A0=0
- Basis vectors are linearly independent solutions to the homogeneous system, providing the most efficient description
Compare: Null space vs. column space—null space lives in Rⁿ (domain), column space lives in Rᵐ (codomain). Their dimensions are linked by rank-nullity, but they exist in different spaces. This distinction frequently appears in conceptual exam questions.
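The domain-versus-codomain distinction shows up in the shapes of the basis vectors themselves. A SymPy sketch with an assumed 2×3 example:

```python
from sympy import Matrix

A = Matrix([[1, 0, 2],
            [0, 1, 3]])      # maps R^3 -> R^2

null_basis = A.nullspace()   # vectors living in the domain R^3
col_basis = A.columnspace()  # vectors living in the codomain R^2

print([v.shape for v in null_basis])  # [(3, 1)] -> length-3 vectors
print([v.shape for v in col_basis])   # [(2, 1), (2, 1)] -> length-2 vectors
```

The null space basis vector has 3 entries and the column space basis vectors have 2, even though rank (2) plus nullity (1) equals the number of columns (3).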
Real-World Applications
Understanding null space isn't just theoretical—it solves practical problems across multiple fields. Anywhere you need to understand what a system "ignores" or what inputs produce no effect, null space is relevant.
Applications of Null Space in Linear Algebra Problems
- Engineering systems analysis: null space identifies inputs that produce no response, critical for understanding system behavior and stability
- Computer graphics: transformations that project 3D objects onto 2D screens have non-trivial null spaces representing the "depth" direction that gets collapsed
- Control theory: null space of system matrices reveals uncontrollable modes—states the system cannot influence
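The computer graphics example above can be made concrete: an orthographic projection onto the screen plane has the depth direction as its null space. A SymPy sketch (the projection matrix is an assumed minimal example):

```python
from sympy import Matrix

# Orthographic projection of 3D points onto the xy-plane (the "screen")
P = Matrix([[1, 0, 0],
            [0, 1, 0]])

depth = P.nullspace()   # the direction the projection collapses
print(depth)            # a single vector spanning the z (depth) axis
```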
Quick Reference Table
| Concept | Key Facts |
| --- | --- |
| Definition | Set of all x where Ax=0; subspace of domain |
| Nullity | Dimension of null space; equals number of free variables |
| Computation | Row reduce, identify free variables, write parametric solution |
| Injectivity Test | Transformation is one-to-one iff null space = {0} |
| Linear Independence | Columns independent iff null space is trivial |
| Rank-Nullity Theorem | rank(A)+nullity(A)=number of columns |
| Homogeneous Systems | Null space = complete solution set for Ax=0 |
| Column Space Relationship | Dimensions are complementary; exist in different spaces |
Self-Check Questions
- If a 4×6 matrix has rank 3, what is its nullity? What does this tell you about the linear independence of its columns?
- Compare and contrast: What's the relationship between a linear transformation having a trivial kernel and its matrix having linearly independent columns?
- You row reduce a matrix and find 2 free variables. How many vectors will form a basis for the null space, and why?
- A transformation T: R⁵ → R³ has nullity 2. Can T be injective? Can it be surjective? Explain using rank-nullity.
- If two matrices A and B have the same null space, must they have the same column space? Justify your answer with the rank-nullity theorem.