Abstract Linear Algebra II Unit 1 – Vector Spaces and Subspaces

Vector spaces and subspaces form the foundation of linear algebra. These structures provide a framework for understanding linear combinations, spanning sets, and linear independence. By mastering these concepts, students gain insight into the fundamental properties of vector operations and their applications. Basis and dimension are key ideas that build upon vector space theory. These concepts allow us to characterize vector spaces, determine their size, and represent vectors uniquely. Understanding basis and dimension is crucial for solving linear systems, analyzing linear transformations, and exploring more advanced topics in linear algebra.

Key Concepts and Definitions

  • Vector space consists of a set $V$ and two operations (addition and scalar multiplication) satisfying specific axioms
  • Subspace is a subset of a vector space that is closed under addition and scalar multiplication
  • Linear combination expresses a vector as the sum of scalar multiples of other vectors
  • Span of a set of vectors is the set of all linear combinations of those vectors
  • Linearly independent set has no vector that can be expressed as a linear combination of the others
  • Linearly dependent set contains at least one vector that is a linear combination of the others
  • Basis is a linearly independent set that spans the entire vector space
  • Dimension of a vector space is the number of vectors in any basis for that space

Vector Space Fundamentals

  • Vector space axioms include closure, associativity, commutativity, identity, inverse, and distributivity properties
  • Examples of vector spaces include $\mathbb{R}^n$, $\mathbb{C}^n$, and the set of all polynomials of degree $\leq n$
  • Zero vector serves as the additive identity in a vector space
  • Scalar multiplication distributes over vector addition: $a(\vec{u} + \vec{v}) = a\vec{u} + a\vec{v}$
  • Scalar multiplication distributes over scalar addition: $(a + b)\vec{v} = a\vec{v} + b\vec{v}$
  • Scalar multiplication is compatible with field multiplication: $a(b\vec{v}) = (ab)\vec{v}$
  • Multiplicative identity property states $1\vec{v} = \vec{v}$ for all $\vec{v}$ in the vector space
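The axioms above can be sanity-checked numerically. This is only an illustrative sketch using NumPy (the text itself assumes no particular library), and checking sample vectors does not prove the axioms in general:

```python
import numpy as np

# Sample vectors in R^3 and sample scalars; each axiom below should
# hold exactly (up to floating-point rounding) for these choices.
u = np.array([1.0, -2.0, 3.0])
v = np.array([4.0, 0.5, -1.0])
a, b = 2.0, -3.0

assert np.allclose(u + v, v + u)                # commutativity of addition
assert np.allclose(a * (u + v), a * u + a * v)  # a(u + v) = au + av
assert np.allclose((a + b) * v, a * v + b * v)  # (a + b)v = av + bv
assert np.allclose(a * (b * v), (a * b) * v)    # a(bv) = (ab)v
assert np.allclose(1.0 * v, v)                  # 1v = v
```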

Subspace Theory

  • Subspace inherits the vector space structure from its parent space
  • Trivial subspaces include the zero subspace $\{\vec{0}\}$ and the entire vector space $V$
  • Sum of two subspaces $U + W = \{\vec{u} + \vec{w} : \vec{u} \in U, \vec{w} \in W\}$ is also a subspace
    • Example: even functions and odd functions form subspaces of the vector space of all functions
  • Intersection of two subspaces $U \cap W = \{\vec{v} : \vec{v} \in U \text{ and } \vec{v} \in W\}$ is also a subspace
  • Direct sum of two subspaces $U \oplus W$ requires $U \cap W = \{\vec{0}\}$
  • Subspace test verifies closure under addition and scalar multiplication
    • Example: set of all polynomials with real coefficients of degree $\leq 2$ is a subspace of $\mathbb{R}[x]$
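The subspace test can be illustrated on a concrete subspace of $\mathbb{R}^3$, the plane $x + y + z = 0$. The sketch below (NumPy, an assumption of this example) checks closure on sample vectors; the general proof follows from linearity of the defining equation:

```python
import numpy as np

def in_plane(v, tol=1e-12):
    """Membership test for W = {(x, y, z) : x + y + z = 0}, a subspace of R^3."""
    return abs(v.sum()) < tol

u = np.array([1.0, -1.0, 0.0])
w = np.array([2.0, 3.0, -5.0])

assert in_plane(u) and in_plane(w)
assert in_plane(u + w)          # closed under addition
assert in_plane(3.5 * u)        # closed under scalar multiplication
assert in_plane(np.zeros(3))    # the zero vector lies in every subspace
```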

Linear Combinations and Span

  • Linear combination of vectors $\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_n$ is $a_1\vec{v}_1 + a_2\vec{v}_2 + \cdots + a_n\vec{v}_n$, where $a_1, a_2, \ldots, a_n$ are scalars
  • Span of a set of vectors $S = \{\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_n\}$ is denoted $\text{span}(S)$ or $\text{span}(\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_n)$
    • Example: $\text{span}((1, 0), (0, 1)) = \mathbb{R}^2$
  • Spanning set for a vector space VV is a set of vectors whose span is equal to VV
  • Column space of a matrix AA is the span of its column vectors
  • Row space of a matrix AA is the span of its row vectors
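Whether a given vector lies in the span of others reduces to a rank comparison: appending a vector already in the column space of a matrix does not raise the rank. A minimal sketch (NumPy, an assumption; the helper name `in_span` is ours, not standard):

```python
import numpy as np

def in_span(vectors, target, tol=1e-10):
    """Return True if `target` is a linear combination of `vectors`.

    Compares rank(A) with rank([A | target]): the target is in the
    column space exactly when appending it leaves the rank unchanged."""
    A = np.column_stack(vectors)
    Ab = np.column_stack(vectors + [target])
    return np.linalg.matrix_rank(A, tol) == np.linalg.matrix_rank(Ab, tol)

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])

assert in_span([v1, v2], np.array([2.0, 3.0, 5.0]))      # equals 2*v1 + 3*v2
assert not in_span([v1, v2], np.array([0.0, 0.0, 1.0]))  # off the plane span(v1, v2)
```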

Linear Independence and Dependence

  • Linearly independent set has no vector that can be expressed as a linear combination of the others
    • Example: standard basis vectors $\vec{e}_1, \vec{e}_2, \ldots, \vec{e}_n$ are linearly independent
  • Linearly dependent set contains at least one vector that is a linear combination of the others
  • Unique representation property states that every vector in the span of a linearly independent set can be written in exactly one way as a linear combination of the set's vectors
  • Linear dependence lemma states that a list $\{\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_n\}$ is linearly dependent if and only if some $\vec{v}_i$ is a linear combination of the preceding vectors $\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_{i-1}$
  • Linearly dependent set cannot be a basis for a vector space
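In coordinates, linear independence is a rank condition: vectors are independent exactly when the matrix having them as columns has full column rank. A sketch using NumPy (an assumption of this example):

```python
import numpy as np

def linearly_independent(vectors, tol=1e-10):
    """Vectors are independent iff the matrix with them as columns
    has rank equal to the number of vectors (full column rank)."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A, tol) == len(vectors)

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
assert linearly_independent([e1, e2])

# (2, 2) = 2*e1 + 2*e2, so adding it produces a dependent set
assert not linearly_independent([e1, e2, np.array([2.0, 2.0])])
```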

Basis and Dimension

  • Basis is a linearly independent set that spans the entire vector space
    • Example: standard basis for $\mathbb{R}^n$ is $\{\vec{e}_1, \vec{e}_2, \ldots, \vec{e}_n\}$, where $\vec{e}_i$ has a 1 in the $i$-th position and 0s elsewhere
  • Dimension of a vector space is the number of vectors in any basis for that space
  • Finite-dimensional vector space has a finite basis, while an infinite-dimensional vector space has no finite basis
  • Coordinate vector of $\vec{v}$ with respect to a basis $B$ lists the coefficients of the unique linear combination of basis vectors that equals $\vec{v}$
  • Change of basis matrix transforms coordinates from one basis to another
  • Rank-nullity theorem states that for a linear map $T: V \to W$, $\dim(V) = \dim(\text{null}(T)) + \dim(\text{range}(T))$
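Finding a coordinate vector amounts to solving a linear system: with the basis vectors as the columns of a matrix $P$, the coordinates $c$ of $\vec{v}$ satisfy $Pc = \vec{v}$. A sketch for the (illustrative, non-standard) basis $B = \{(1, 1), (1, -1)\}$ of $\mathbb{R}^2$, using NumPy as an assumption:

```python
import numpy as np

# Change-of-basis matrix: columns are the basis vectors of B = {(1,1), (1,-1)}.
P = np.array([[1.0,  1.0],
              [1.0, -1.0]])
v = np.array([3.0, 1.0])

# Coordinate vector of v with respect to B solves P @ c = v.
c = np.linalg.solve(P, v)
assert np.allclose(c, [2.0, 1.0])   # v = 2*(1, 1) + 1*(1, -1)
assert np.allclose(P @ c, v)        # reconstructing v from its coordinates
```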

Applications and Examples

  • Function spaces, such as the space of continuous functions $C[a, b]$ or the space of square-integrable functions $L^2[a, b]$, are infinite-dimensional vector spaces
  • Polynomial vector spaces $\mathbb{R}[x]$ and $\mathbb{C}[x]$ have bases consisting of monomials $\{1, x, x^2, \ldots\}$
  • Solution space of a homogeneous linear system $A\vec{x} = \vec{0}$ is a subspace of $\mathbb{R}^n$ (or $\mathbb{C}^n$)
    • Example: solution space of $x_1 + 2x_2 - x_3 = 0$ is a plane through the origin in $\mathbb{R}^3$
  • Eigenspaces corresponding to eigenvalues of a linear operator are subspaces of the domain
  • Kernel (null space) and image (range) of a linear transformation are subspaces of the domain and codomain, respectively
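The plane example above can be computed directly: a basis for the null space of $A = [1 \;\, 2 \;\, {-1}]$ falls out of the singular value decomposition (one standard numerical route, sketched here with NumPy as an assumption), and the dimensions confirm rank-nullity:

```python
import numpy as np

# Homogeneous system for the plane x1 + 2*x2 - x3 = 0 in R^3.
A = np.array([[1.0, 2.0, -1.0]])

# Null-space basis from the SVD: the right singular vectors whose
# singular values are (numerically) zero span null(A).
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-12))
null_basis = Vt[rank:].T           # columns form a basis of the null space

assert null_basis.shape == (3, 2)         # the solution space is a plane: dimension 2
assert np.allclose(A @ null_basis, 0)     # each basis vector solves A x = 0
assert rank + null_basis.shape[1] == 3    # rank-nullity: dim(range) + dim(null) = dim(R^3)
```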

Common Pitfalls and Tips

  • Subspace test requires closure under both addition and scalar multiplication, not just one or the other
  • Linear independence is not the same as orthogonality; vectors can be linearly independent without being orthogonal
  • Basis vectors are not unique; a vector space can have multiple bases
  • Dimension is a property of the vector space, not the choice of basis
  • Coordinate vectors depend on the choice of basis; changing the basis changes the coordinates
  • Linearly dependent set can still span the entire vector space; what it cannot do is serve as a basis, since representations of vectors in it are not unique
  • Linearly independent spanning set is a basis; linearly independent non-spanning set is not a basis
  • When working with abstract vector spaces, focus on the properties and axioms rather than the specific nature of the elements
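The independence-versus-orthogonality pitfall above fits in two lines of code (NumPy, an assumption of this example):

```python
import numpy as np

# (1, 0) and (1, 1) are linearly independent but not orthogonal.
u, v = np.array([1.0, 0.0]), np.array([1.0, 1.0])

assert np.linalg.matrix_rank(np.column_stack([u, v])) == 2  # independent: full rank
assert np.dot(u, v) != 0                                    # not orthogonal: dot product is 1
```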


© 2024 Fiveable Inc. All rights reserved.
AP® and SAT® are trademarks registered by the College Board, which is not affiliated with, and does not endorse this website.