Linear Algebra and Differential Equations Unit 3 – Vector Spaces

Vector spaces form the foundation of linear algebra, providing a framework for understanding and manipulating mathematical objects. They generalize the concept of vectors beyond physical space, allowing us to work with abstract elements that behave like vectors.
Key properties of vector spaces include closure under addition and scalar multiplication, associativity, commutativity, and the existence of zero and inverse elements. These properties enable us to perform operations and analyze relationships between vectors, forming the basis for more advanced concepts in linear algebra.
What's a Vector Space?
Algebraic structure consisting of a set of elements called vectors and two operations: vector addition and scalar multiplication
Vectors can be added together and multiplied by scalars (real or complex numbers) to produce another vector in the space
Must satisfy certain axioms (properties) to qualify as a vector space
Generalizes the notion of Euclidean vectors (e.g., 2D or 3D vectors) to higher dimensions and abstract settings
Examples include:
$\mathbb{R}^n$: n-dimensional real vector space
$\mathbb{C}^n$: n-dimensional complex vector space
Space of polynomials of degree $\leq n$
Space of continuous functions on an interval
Key Properties and Axioms
Closure under vector addition: If $\vec{u}$ and $\vec{v}$ are vectors in the space, then $\vec{u} + \vec{v}$ is also a vector in the space
Closure under scalar multiplication: If $\vec{v}$ is a vector and $c$ is a scalar, then $c\vec{v}$ is also a vector in the space
Associativity of vector addition: $(\vec{u} + \vec{v}) + \vec{w} = \vec{u} + (\vec{v} + \vec{w})$
Commutativity of vector addition: $\vec{u} + \vec{v} = \vec{v} + \vec{u}$
Existence of zero vector: There exists a unique vector $\vec{0}$ such that $\vec{v} + \vec{0} = \vec{v}$ for all vectors $\vec{v}$
Existence of additive inverse: For every vector $\vec{v}$, there exists a unique vector $-\vec{v}$ such that $\vec{v} + (-\vec{v}) = \vec{0}$
Distributivity of scalar multiplication over vector addition: $c(\vec{u} + \vec{v}) = c\vec{u} + c\vec{v}$
Distributivity of scalar multiplication over field addition: $(c + d)\vec{v} = c\vec{v} + d\vec{v}$
Compatibility of scalar multiplication: $c(d\vec{v}) = (cd)\vec{v}$
Identity element of scalar multiplication: $1\vec{v} = \vec{v}$
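The algebraic axioms can be spot-checked numerically for $\mathbb{R}^n$. A minimal sketch using NumPy, with arbitrarily chosen sample vectors and scalars (this demonstrates the identities on examples; it is not a proof):

```python
import numpy as np

# Arbitrary sample vectors in R^3 and scalars for spot-checking axioms
u = np.array([1.0, 2.0, 3.0])
v = np.array([-4.0, 0.5, 2.0])
w = np.array([0.0, 1.0, -1.0])
c, d = 2.0, -3.0

# Associativity and commutativity of vector addition
assoc_ok = np.allclose((u + v) + w, u + (v + w))
comm_ok = np.allclose(u + v, v + u)

# Distributivity over vector addition and over field (scalar) addition
dist_vec_ok = np.allclose(c * (u + v), c * u + c * v)
dist_scalar_ok = np.allclose((c + d) * v, c * v + d * v)

print(assoc_ok, comm_ok, dist_vec_ok, dist_scalar_ok)
```

All four checks print `True` for these inputs; `np.allclose` is used rather than `==` to tolerate floating-point rounding.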
Types of Vector Spaces
Real vector spaces: Vectors with real number components and scalars from the real number field
Examples: $\mathbb{R}^2$ (2D plane), $\mathbb{R}^3$ (3D space)
Complex vector spaces: Vectors with complex number components and scalars from the complex number field
Example: $\mathbb{C}^3$ (3D space with complex components)
Function spaces: Vectors are functions and operations are pointwise
Examples: space of continuous functions, space of differentiable functions
Polynomial spaces: Vectors are polynomials and operations are polynomial addition and scalar multiplication
Example: space of polynomials of degree $\leq 3$
Matrix spaces: Vectors are matrices and operations are matrix addition and scalar multiplication
Example: space of $2 \times 2$ real matrices
Subspaces and Span
Subspace: A subset of a vector space that is itself a vector space under the same operations
Must contain the zero vector and be closed under vector addition and scalar multiplication
Examples: lines and planes passing through the origin in $\mathbb{R}^3$
Span: The set of all linear combinations of a given set of vectors
Linear combination: A sum of scalar multiples of vectors, $c_1\vec{v}_1 + c_2\vec{v}_2 + \cdots + c_n\vec{v}_n$
Spanning set: A set of vectors whose span is the entire vector space
Examples:
Span of $\{(1, 0), (0, 1)\}$ is $\mathbb{R}^2$
Span of $\{1, x, x^2\}$ is the space of polynomials of degree $\leq 2$
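Whether a given vector lies in the span of a set amounts to solving a linear system for the combination coefficients. A sketch with NumPy, with the spanning vectors as matrix columns (the vectors here are chosen for illustration):

```python
import numpy as np

# Columns of A are the spanning vectors; target is the vector to test
A = np.array([[1.0, 0.0],
              [0.0, 1.0]])          # columns span R^2
target = np.array([3.0, 4.0])

# Solve A c = target in the least-squares sense; an exact fit
# (zero residual) means target is a linear combination of the columns,
# i.e., it lies in their span
coeffs, _, _, _ = np.linalg.lstsq(A, target, rcond=None)
in_span = np.allclose(A @ coeffs, target)
print(coeffs, in_span)
```

For the standard basis columns used here, the recovered coefficients are just the components $(3, 4)$.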
Linear Independence and Dependence
Linearly independent: A set of vectors is linearly independent if no vector can be written as a linear combination of the others
Equivalently, the only solution to $c_1\vec{v}_1 + c_2\vec{v}_2 + \cdots + c_n\vec{v}_n = \vec{0}$ is $c_1 = c_2 = \cdots = c_n = 0$
Example: $\{(1, 0), (0, 1)\}$ is linearly independent in $\mathbb{R}^2$
Linearly dependent: A set of vectors is linearly dependent if at least one vector can be written as a linear combination of the others
Equivalently, there exists a non-trivial solution to $c_1\vec{v}_1 + c_2\vec{v}_2 + \cdots + c_n\vec{v}_n = \vec{0}$
Example: $\{(1, 0), (0, 1), (1, 1)\}$ is linearly dependent in $\mathbb{R}^2$ since $(1, 1) = (1, 0) + (0, 1)$
Importance: Linearly independent sets are crucial for defining bases and dimensions of vector spaces
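In practice, linear independence of a finite set of vectors in $\mathbb{R}^n$ can be tested by comparing the rank of the stacked vectors against the number of vectors. A sketch with NumPy, using the two example sets above:

```python
import numpy as np

# Stack candidate vectors as rows; the set is linearly independent
# exactly when the matrix rank equals the number of vectors
indep_set = np.array([[1.0, 0.0],
                      [0.0, 1.0]])
dep_set = np.array([[1.0, 0.0],
                    [0.0, 1.0],
                    [1.0, 1.0]])    # (1, 1) = (1, 0) + (0, 1)

is_independent = np.linalg.matrix_rank(indep_set) == indep_set.shape[0]
is_dependent = np.linalg.matrix_rank(dep_set) < dep_set.shape[0]
print(is_independent, is_dependent)
```

Note that any set of more than $n$ vectors in $\mathbb{R}^n$ is automatically dependent, as the second example shows.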
Basis and Dimension
Basis: A linearly independent spanning set for a vector space
Every vector in the space can be uniquely expressed as a linear combination of basis vectors
Examples:
Standard basis for $\mathbb{R}^2$: $\{(1, 0), (0, 1)\}$
Standard basis for the space of polynomials of degree $\leq 2$: $\{1, x, x^2\}$
Dimension: The number of vectors in a basis for a vector space
All bases for a vector space have the same number of vectors
Examples:
Dimension of $\mathbb{R}^n$ is $n$
Dimension of the space of polynomials of degree $\leq n$ is $n + 1$
Importance: Bases and dimensions provide a way to represent and analyze vector spaces in a standardized manner
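The dimension of the subspace spanned by a finite set of vectors in $\mathbb{R}^n$ equals the rank of the matrix formed from them, since a basis of the span can be extracted from any maximal independent subset. A short NumPy sketch with an illustrative redundant spanning set:

```python
import numpy as np

# The dimension of the span of a set of vectors equals the rank of the
# matrix whose rows are those vectors
spanning = np.array([[1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0],
                     [1.0, 1.0, 0.0]])   # third row = first + second
dim = np.linalg.matrix_rank(spanning)    # dimension of the spanned subspace
print(dim)
```

Here three vectors span only a 2-dimensional plane inside $\mathbb{R}^3$, so `dim` is 2.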
Coordinate Systems
Coordinate system: A way to represent vectors in a vector space using a basis
Each vector is identified by a unique tuple of scalars (coordinates) corresponding to the basis vectors
Example: In $\mathbb{R}^2$ with standard basis $\{(1, 0), (0, 1)\}$, the vector $(3, 4)$ has coordinates $3$ and $4$
Change of basis: Expressing vectors in terms of a different basis
Useful for simplifying computations or highlighting certain properties of vectors
Achieved through matrix multiplication: $[\vec{v}]_B = P[\vec{v}]_A$, where $P$ is the change of basis matrix
Importance: Coordinate systems allow for the numerical representation and manipulation of vectors in a given basis
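A change of basis can be sketched concretely in NumPy. Assuming the columns of a matrix $M$ hold the new basis vectors expressed in the old (standard) basis $A$, coordinates satisfy $\vec{v}_A = M[\vec{v}]_B$, so the change of basis matrix from $A$ to $B$ is $P = M^{-1}$; the basis $\{(1, 1), (1, -1)\}$ below is an illustrative choice:

```python
import numpy as np

# Columns of M are the new basis vectors B = {(1, 1), (1, -1)}
# expressed in the standard basis A of R^2
M = np.array([[1.0, 1.0],
              [1.0, -1.0]])

# Since v_A = M v_B, the change of basis matrix A -> B is P = M^{-1}
P = np.linalg.inv(M)

v_A = np.array([3.0, 4.0])        # coordinates in the standard basis
v_B = P @ v_A                     # coordinates in basis B

# Recombining the B basis vectors with v_B recovers the original vector
assert np.allclose(M @ v_B, v_A)
print(v_B)
```

For this basis, $(3, 4)$ has $B$-coordinates $(3.5, -0.5)$, i.e. $3.5\,(1,1) - 0.5\,(1,-1) = (3,4)$.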
Applications in Linear Algebra
Systems of linear equations: Vector spaces provide a framework for solving and analyzing systems of linear equations
Solutions form a subspace (solution space) of the vector space
Basis vectors of the solution space give parametric form of solutions
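For a homogeneous system $A\vec{x} = \vec{0}$, the solution space is the null space of $A$, and a basis for it can be computed numerically from the SVD: right singular vectors belonging to (near-)zero singular values span the null space. A sketch with NumPy, on an illustrative rank-1 matrix:

```python
import numpy as np

# Homogeneous system A x = 0; its solution set is the null space of A,
# a subspace whose basis gives the parametric form of the solutions
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # second row is a multiple of the first

# Right singular vectors with (near-)zero singular values span the
# null space; pad s so the mask covers all of Vt's rows
_, s, Vt = np.linalg.svd(A)
s_padded = np.concatenate([s, np.zeros(A.shape[1] - len(s))])
null_basis = Vt[s_padded < 1e-10]

print(null_basis.shape[0])        # dimension of the solution space
```

Since $A$ has rank 1 and 3 columns, the solution space is 2-dimensional, matching the rank–nullity theorem.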
Linear transformations: Functions between vector spaces that preserve vector addition and scalar multiplication
Can be represented by matrices in given bases
Eigenvalues and eigenvectors of the matrix provide insights into the transformation's behavior
Least squares approximation: Finding the best-fitting linear model for a set of data points
Minimizes the sum of squared distances between data points and the model
Solution involves orthogonal projection onto a subspace spanned by the model's basis functions
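A line fit $y \approx mx + b$ illustrates this: the columns of the design matrix (the $x$ values and a constant column) span the subspace of candidate models, and least squares projects the data onto it. A NumPy sketch with synthetic data chosen to lie exactly on a line:

```python
import numpy as np

# Sample data points lying exactly on y = 2x + 1 (illustrative only)
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])

# Design matrix: columns are the basis functions x and 1; lstsq finds
# the coefficients minimizing the sum of squared residuals
X = np.column_stack([x, np.ones_like(x)])
(m, b), *_ = np.linalg.lstsq(X, y, rcond=None)
print(m, b)
```

Because the data are exactly linear here, the residual is zero and the fit recovers $m = 2$, $b = 1$; with noisy data the same call returns the orthogonal projection's coefficients.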
Principal component analysis (PCA): Technique for dimensionality reduction and data compression
Identifies the directions (principal components) of maximum variance in the data
Projects data onto a lower-dimensional subspace spanned by the principal components
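One common way to carry out PCA is via the SVD of the centered data matrix, whose right singular vectors are the principal components ordered by variance. A minimal sketch with NumPy on synthetic 2D data deliberately stretched along the $x$-axis (the data generation is purely illustrative):

```python
import numpy as np

# Synthetic 2D data with much larger variance along the x-axis
rng = np.random.default_rng(0)
data = rng.normal(size=(200, 2)) * np.array([5.0, 0.5])

# PCA: center the data, then the right singular vectors of the centered
# matrix are the principal components (directions of maximum variance)
centered = data - data.mean(axis=0)
_, s, Vt = np.linalg.svd(centered, full_matrices=False)

first_pc = Vt[0]                  # direction of maximum variance
projected = centered @ Vt[:1].T   # data compressed onto a 1-D subspace
print(first_pc, projected.shape)
```

Because the data's variance is concentrated along the $x$-axis, the first principal component comes out close to $(\pm 1, 0)$, and projecting onto it reduces each 2D point to a single coordinate.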