➗ Abstract Linear Algebra II Unit 1 – Vector Spaces and Subspaces
Vector spaces and subspaces form the foundation of linear algebra. These structures provide a framework for understanding linear combinations, spanning sets, and linear independence. By mastering these concepts, students gain insight into the fundamental properties of vector operations and their applications.
Basis and dimension are key ideas that build upon vector space theory. These concepts allow us to characterize vector spaces, determine their size, and represent vectors uniquely. Understanding basis and dimension is crucial for solving linear systems, analyzing linear transformations, and exploring more advanced topics in linear algebra.
Vector space consists of a set V and two operations (addition and scalar multiplication) satisfying specific axioms
Subspace is a nonempty subset of a vector space that is closed under addition and scalar multiplication
Linear combination expresses a vector as the sum of scalar multiples of other vectors
Span of a set of vectors is the set of all linear combinations of those vectors
Linearly independent set has no vector that can be expressed as a linear combination of the others
Linearly dependent set contains at least one vector that is a linear combination of the others
Basis is a linearly independent set that spans the entire vector space
Dimension of a vector space is the number of vectors in any basis for that space
Vector Space Fundamentals
Vector space axioms include closure, associativity, commutativity, identity, inverse, and distributivity properties
Examples of vector spaces include ℝⁿ, ℂⁿ, and the set of all polynomials of degree ≤ n
Zero vector serves as the additive identity in a vector space
Scalar multiplication distributes over vector addition: a(u + v) = au + av
Scalar multiplication distributes over scalar addition: (a + b)v = av + bv
Scalar multiplication is compatible with field multiplication: a(bv) = (ab)v
Multiplicative identity property states 1v=v for all v in the vector space
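The axioms above can be spot-checked numerically. The sketch below uses plain Python tuples as vectors in ℝ³ (the helper names add and scale are illustrative, not standard); it verifies four axioms for one choice of vectors and scalars — a sanity check, not a proof, since the axioms must hold for all vectors and scalars:

```python
# Spot-check of vector space axioms in R^3 with tuples as vectors.

def add(u, v):
    return tuple(x + y for x, y in zip(u, v))

def scale(a, v):
    return tuple(a * x for x in v)

u, v = (1.0, 2.0, 3.0), (4.0, -1.0, 0.5)
a, b = 2.0, -3.0

# a(u + v) = au + av   (distributes over vector addition)
assert scale(a, add(u, v)) == add(scale(a, u), scale(a, v))
# (a + b)v = av + bv   (distributes over scalar addition)
assert scale(a + b, v) == add(scale(a, v), scale(b, v))
# a(bv) = (ab)v        (compatible with field multiplication)
assert scale(a, scale(b, v)) == scale(a * b, v)
# 1v = v               (multiplicative identity)
assert scale(1.0, v) == v
```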
Subspace Theory
Subspace inherits the vector space structure from its parent space
Trivial subspaces include the zero vector space {0} and the entire vector space V
Sum of two subspaces U+W={u+w:u∈U,w∈W} is also a subspace
Example: even functions and odd functions form subspaces of the vector space of all functions
Intersection of two subspaces U∩W={v:v∈U and v∈W} is also a subspace
Direct sum of two subspaces U⊕W requires U∩W={0}; in that case every vector of U+W decomposes uniquely as u+w with u∈U, w∈W
Subspace test verifies that the set contains the zero vector and is closed under addition and scalar multiplication
Example: set of all polynomials with real coefficients of degree ≤ 2 is a subspace of ℝ[x]
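The even/odd example above illustrates a direct sum: every function f splits uniquely into an even part and an odd part. The sketch below (helper names even_part and odd_part are illustrative) checks this decomposition at a few sample points:

```python
# Every function f decomposes as f = f_even + f_odd, illustrating the
# direct sum of the even and odd function subspaces (their intersection
# is only the zero function).

def even_part(f):
    return lambda x: (f(x) + f(-x)) / 2

def odd_part(f):
    return lambda x: (f(x) - f(-x)) / 2

f = lambda x: x**3 + x**2 + 1   # neither even nor odd
fe, fo = even_part(f), odd_part(f)

for x in (-2.0, -0.5, 0.0, 1.0, 3.0):
    assert fe(x) == fe(-x)        # even part is even
    assert fo(x) == -fo(-x)       # odd part is odd
    assert fe(x) + fo(x) == f(x)  # the parts sum back to f
```

Here fe works out to x² + 1 and fo to x³, matching the unique decomposition guaranteed by U∩W={0}.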
Linear Combinations and Span
Linear combination of vectors v₁, v₂, …, vₙ is a₁v₁ + a₂v₂ + ⋯ + aₙvₙ, where a₁, a₂, …, aₙ are scalars
Span of a set of vectors S = {v₁, v₂, …, vₙ} is denoted as span(S) or span(v₁, v₂, …, vₙ)
Example: span((1, 0), (0, 1)) = ℝ²
Spanning set for a vector space V is a set of vectors whose span is equal to V
Column space of a matrix A is the span of its column vectors
Row space of a matrix A is the span of its row vectors
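Whether a vector b lies in the span of given vectors (the column space of A) can be tested by comparing ranks: rank(A) = rank([A | b]) exactly when b is a linear combination of A's columns. A minimal sketch, assuming NumPy is available (the helper name in_span is illustrative):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])   # columns span the xy-plane in R^3

def in_span(A, b):
    # b is in the column space iff appending it does not raise the rank
    augmented = np.column_stack([A, b])
    return np.linalg.matrix_rank(A) == np.linalg.matrix_rank(augmented)

assert in_span(A, np.array([3.0, -2.0, 0.0]))      # lies in the plane
assert not in_span(A, np.array([0.0, 0.0, 1.0]))   # sticks out of the plane
```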
Linear Independence and Dependence
Linearly independent set has no vector that can be expressed as a linear combination of the others
Example: standard basis vectors e₁, e₂, …, eₙ are linearly independent
Linearly dependent set contains at least one vector that is a linear combination of the others
Unique representation property states that every vector in the span of a linearly independent set has a unique representation as a linear combination of the set's vectors
Linear dependence lemma states that a set {v₁, v₂, …, vₙ} is linearly dependent if and only if some vᵢ is a linear combination of the preceding vectors v₁, v₂, …, vᵢ₋₁ (for i = 1 this means v₁ = 0, since the empty linear combination is the zero vector)
Linearly dependent set cannot be a basis for a vector space
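Linear independence of finitely many vectors in ℝⁿ can be checked by a rank computation: the vectors are independent exactly when the matrix having them as columns has rank equal to the number of vectors. A sketch assuming NumPy (the helper name independent is illustrative):

```python
import numpy as np

def independent(*vectors):
    # Independent iff the column matrix has full column rank
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) == len(vectors)

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
assert independent(e1, e2)                    # standard basis of R^2
assert not independent(e1, e2, e1 + e2)       # third vector is redundant
assert independent(np.array([1.0, 1.0]), e1)  # independent but not orthogonal
```

The last line also illustrates a point from the pitfalls below: independence does not require orthogonality.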
Basis and Dimension
Basis is a linearly independent set that spans the entire vector space
Example: standard basis for ℝⁿ is {e₁, e₂, …, eₙ}, where eᵢ has a 1 in the i-th position and 0s elsewhere
Dimension of a vector space is the number of vectors in any basis for that space
Finite-dimensional vector space has a finite basis; an infinite-dimensional vector space has no finite basis (every basis is infinite)
Coordinate vector of v with respect to a basis B is the unique linear combination of basis vectors that equals v
Change of basis matrix transforms coordinates from one basis to another
Rank-nullity theorem states that for a linear map T:V→W, dim(V)=dim(null(T))+dim(range(T))
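Concretely, the coordinate vector of v with respect to a basis B solves the linear system Bc = v (basis vectors as columns), and rank–nullity can be checked by counting. A sketch assuming NumPy, with an illustrative 2×2 basis and a 1×3 matrix:

```python
import numpy as np

# Coordinates of v in the basis B = {(1,0), (1,1)} (columns of B)
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])
v = np.array([3.0, 2.0])
c = np.linalg.solve(B, v)        # unique coordinate vector of v
assert np.allclose(B @ c, v)     # v = c1*b1 + c2*b2
assert np.allclose(c, [1.0, 2.0])

# Rank-nullity for the map x -> Ax from R^3 to R^1
A = np.array([[1.0, 2.0, -1.0]])
rank = np.linalg.matrix_rank(A)  # dim(range)
nullity = A.shape[1] - rank      # dim(null space)
assert rank + nullity == A.shape[1]   # dim(V) = 1 + 2 = 3
```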
Applications and Examples
Function spaces, such as the space of continuous functions C[a, b] or the space of square-integrable functions L²[a, b], are infinite-dimensional vector spaces
Polynomial vector spaces ℝ[x] or ℂ[x] have bases consisting of monomials {1, x, x², …}
Solution space of a homogeneous linear system Ax = 0 is a subspace of ℝⁿ (or ℂⁿ)
Example: solution space of x₁ + 2x₂ − x₃ = 0 is a plane through the origin in ℝ³
Eigenspaces corresponding to eigenvalues of a linear operator are subspaces of the domain
Kernel (null space) and image (range) of a linear transformation are subspaces of the domain and codomain, respectively
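The plane example above can be computed: the solution space of x₁ + 2x₂ − x₃ = 0 is the null space of A = [1 2 −1], and a basis for it can be read off from the SVD (rows of Vᵀ beyond the rank). A sketch assuming NumPy:

```python
import numpy as np

A = np.array([[1.0, 2.0, -1.0]])
_, s, Vt = np.linalg.svd(A)          # full SVD: Vt is 3x3
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:]               # rows spanning the null space

assert null_basis.shape == (2, 3)    # a 2-dimensional plane in R^3
for n in null_basis:
    assert np.allclose(A @ n, 0.0)   # each basis vector solves Ax = 0
```

The dimension count matches rank–nullity: rank 1 plus nullity 2 equals 3.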
Common Pitfalls and Tips
Subspace test requires closure under both addition and scalar multiplication, not just one or the other
Linear independence is not the same as orthogonality; vectors can be linearly independent without being orthogonal
Basis vectors are not unique; a vector space can have multiple bases
Dimension is a property of the vector space, not the choice of basis
Coordinate vectors depend on the choice of basis; changing the basis changes the coordinates
Linearly dependent set can still span the entire vector space, but it cannot be a basis; at least one vector is redundant and can be removed without shrinking the span
Linearly independent spanning set is a basis; linearly independent non-spanning set is not a basis
When working with abstract vector spaces, focus on the properties and axioms rather than the specific nature of the elements
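The dependent-but-spanning pitfall is easy to see numerically: {e₁, e₂, e₁+e₂} spans ℝ² yet is linearly dependent, so it is not a basis. A sketch assuming NumPy:

```python
import numpy as np

# Three vectors in R^2, stacked as columns
S = np.column_stack([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
rank = np.linalg.matrix_rank(S)

assert rank == 2           # rank = dim(R^2): the set spans R^2
assert rank < S.shape[1]   # rank < number of vectors: dependent, not a basis
```

Removing any one of the three vectors leaves a basis, consistent with the dimension being a property of the space, not of the spanning set.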