Vector spaces and linear independence are key concepts in linear algebra. They provide a foundation for understanding the structure of mathematical systems used in coding theory.
These concepts help us analyze and manipulate data in multiple dimensions. By grasping vector spaces and linear independence, we can better understand error-correcting codes and their properties.
Vector Spaces and Subspaces
Definition and Properties of Vector Spaces
- Vector space V consists of a set of vectors and two operations (vector addition and scalar multiplication) that satisfy certain axioms
- Closure under vector addition: Adding any two vectors in V results in another vector in V
- Closure under scalar multiplication: Multiplying any vector in V by a scalar (real or complex number) results in another vector in V
- Associativity of vector addition: (u + v) + w = u + (v + w) for all vectors u, v, and w in V
- Commutativity of vector addition: u + v = v + u for all vectors u and v in V
- Existence of additive identity: There exists a unique vector 0 (the zero vector) such that v + 0 = v for all vectors v in V
- Existence of additive inverses: For every vector v in V, there exists a unique vector -v such that v + (-v) = 0
- Examples of vector spaces include
- R^n: The set of all n-tuples of real numbers
- C^n: The set of all n-tuples of complex numbers
- P_n: The set of all polynomials of degree at most n
Subspaces and Their Properties
- Subspace W is a non-empty subset of a vector space V that is itself a vector space under the same operations as V
- Closure under vector addition: If u and v are in W, then u + v is also in W
- Closure under scalar multiplication: If v is in W and c is a scalar, then cv is also in W
- Examples of subspaces include
- The zero vector space {0} is a subspace of any vector space
- The set P_m of all polynomials of degree at most m is a subspace of P_n for m ≤ n
- To prove a subset is a subspace, show it satisfies the subspace properties or use the subspace test
- Subspace test: A non-empty subset W of a vector space V is a subspace if and only if for any u, v in W and any scalar c, we have u + v in W and cv in W
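The subspace test above can be spot-checked numerically. The sketch below (using NumPy; the helper name `in_W` and the sample subspace are illustrative choices, not from the notes) checks closure under addition and scalar multiplication for the plane W = {(x, y, 0)} in R^3 on sample vectors — a numeric illustration, not a proof:

```python
import numpy as np

def in_W(v):
    """Membership test for the sample subspace W = {(x, y, 0)} in R^3:
    the third coordinate must be zero."""
    return bool(np.isclose(v[2], 0.0))

# Sample vectors in W and a sample scalar
u = np.array([1.0, 2.0, 0.0])
v = np.array([-3.0, 5.0, 0.0])
c = 4.0

# Both closure conditions of the subspace test hold for these samples
print(in_W(u + v))   # closure under addition
print(in_W(c * u))   # closure under scalar multiplication
```

Checking finitely many vectors cannot prove closure in general, but a single failing pair is enough to show a subset is not a subspace.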
Linear Combinations and Span
- Linear combination of vectors v1, v2, ..., vn in a vector space V is a vector of the form c1v1 + c2v2 + ... + cnvn, where c1, c2, ..., cn are scalars
- Coefficients c1, c2, ..., cn can be any scalars (real or complex numbers)
- Example: In R^3, if v1 = (1, 0, 0), v2 = (0, 1, 0), and v3 = (0, 0, 1), then w = 2v1 - v2 + 3v3 = (2, -1, 3) is a linear combination of v1, v2, and v3 with coefficients 2, -1, and 3
- Span of a set of vectors in a vector space is the set of all linear combinations of these vectors
- Denoted as span{v1, v2, ..., vn} or span(S)
- Span is always a subspace of V
- Example: In R^3, the span of (1, 0, 0) and (0, 1, 0) is the xy-plane
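Whether a given vector lies in the span of a set can be decided by a rank comparison: w is in span(v1, ..., vn) exactly when appending w as a column does not increase the rank of the matrix formed by the vi. A small sketch (using NumPy; the helper name `in_span` is an illustrative choice):

```python
import numpy as np

def in_span(vectors, w):
    """Return True if w is a linear combination of `vectors`.

    w lies in the span iff appending w as an extra column does not
    increase the rank of the matrix whose columns are the vectors."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(np.column_stack([A, w])) == np.linalg.matrix_rank(A)

e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])

print(in_span([e1, e2], np.array([3.0, -2.0, 0.0])))  # in the xy-plane -> True
print(in_span([e1, e2], np.array([0.0, 0.0, 1.0])))   # outside the plane -> False
```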
Basis and Dimension
Basis of a Vector Space
- Basis of a vector space V is a linearly independent set of vectors that spans V
- Linearly independent: No vector in the set can be written as a linear combination of the other vectors
- Spans V: Every vector in V can be written as a linear combination of the basis vectors
- Examples of bases include
- Standard basis for R^n: {e1, e2, ..., en}, where ei has a 1 in the i-th position and 0s elsewhere
- Basis for P_n: {1, x, x^2, ..., x^n}
- Every vector space has a basis, and any two bases of a vector space have the same number of elements

Dimension of a Vector Space
- Dimension of a vector space V is the number of vectors in any basis of V
- Denoted as dim(V)
- All bases of a vector space have the same number of elements, so the dimension is well-defined
- Examples of dimensions include
- dim(R^n) = n and dim(P_n) = n + 1
- Dimension provides a measure of the "size" of a vector space
- Finite-dimensional vector space: A vector space with a finite basis (and thus a finite dimension)
- Infinite-dimensional vector space: A vector space that is not finite-dimensional
Coordinate Vectors
- Coordinate vector of a vector v with respect to a basis B = {v1, v2, ..., vn} is the unique n-tuple (c1, c2, ..., cn) such that v = c1v1 + c2v2 + ... + cnvn
- Represents v as a linear combination of the basis vectors
- Uniqueness follows from the linear independence of the basis vectors
- Example: In R^n with the standard basis {e1, e2, ..., en}, the coordinate vector of v = (a1, a2, ..., an) is (a1, a2, ..., an) itself
- Coordinate vectors provide a way to represent vectors in terms of a basis
- Allows for computation and analysis using the coordinates rather than the vectors themselves
- Coordinate vectors depend on the chosen basis
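Computing a coordinate vector amounts to solving a linear system: if the basis vectors form the columns of a matrix B, the coordinates c satisfy Bc = v, and the solution is unique because B is invertible. A sketch (using NumPy; the basis values are illustrative):

```python
import numpy as np

def coordinate_vector(basis, v):
    """Coordinates of v with respect to `basis`: solve B c = v, where the
    basis vectors are the columns of B. Linear independence of the basis
    makes B invertible, so the solution is unique."""
    B = np.column_stack(basis)
    return np.linalg.solve(B, v)

# A non-standard basis of R^2 (sample values chosen for illustration)
b1 = np.array([1.0, 1.0])
b2 = np.array([1.0, -1.0])
v = np.array([3.0, 1.0])

c = coordinate_vector([b1, b2], v)
print(c)  # -> [2. 1.], since v = 2*b1 + 1*b2
```

Note how the same vector v = (3, 1) has coordinate vector (3, 1) in the standard basis but (2, 1) in this basis — coordinates depend on the basis chosen.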
Linear Independence
Definition and Properties of Linear Independence
- Set of vectors {v1, v2, ..., vn} in a vector space V is linearly independent if the equation c1v1 + c2v2 + ... + cnvn = 0 has only the trivial solution c1 = c2 = ... = cn = 0
- Equivalent to saying that no vector in the set can be written as a linear combination of the other vectors
- Example: In R^3, the vectors (1, 0, 0), (0, 1, 0), and (0, 0, 1) are linearly independent
- Set of vectors is linearly dependent if it is not linearly independent
- Equivalent to saying that at least one vector in the set can be written as a linear combination of the other vectors
- Example: In R^2, the vectors (1, 0), (0, 1), and (1, 1) are linearly dependent, as (1, 1) = (1, 0) + (0, 1)
- Properties of linearly independent sets
- Any subset of a linearly independent set is linearly independent
- If a set contains the zero vector, it is linearly dependent
- In a vector space of dimension n, any set of more than n vectors is linearly dependent
Determining Linear Independence
- To determine if a set of vectors {v1, v2, ..., vn} is linearly independent, solve the equation c1v1 + c2v2 + ... + cnvn = 0 for the coefficients c1, c2, ..., cn
- If the only solution is the trivial solution c1 = c2 = ... = cn = 0, the set is linearly independent
- If there are non-trivial solutions, the set is linearly dependent
- Example: To determine if the vectors v1 = (1, 0, 0), v2 = (1, 1, 0), and v3 = (1, 1, 1) in R^3 are linearly independent, solve the equation c1v1 + c2v2 + c3v3 = 0
- This leads to a system of linear equations, which can be solved using techniques like Gaussian elimination
- In this case, the only solution is c1 = c2 = c3 = 0, so the vectors are linearly independent
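The same check can be done by computing a matrix rank, which NumPy performs via elimination-style numerics internally: n vectors are independent exactly when the matrix with those vectors as columns has rank n. A sketch (the helper name `is_linearly_independent` is an illustrative choice):

```python
import numpy as np

def is_linearly_independent(vectors):
    """The homogeneous system c1 v1 + ... + cn vn = 0 has only the trivial
    solution iff the matrix with the vectors as columns has full column rank."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

# The example above: v1, v2, v3 in R^3
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([1.0, 1.0, 0.0])
v3 = np.array([1.0, 1.0, 1.0])

print(is_linearly_independent([v1, v2, v3]))       # True
print(is_linearly_independent([v1, v2, v1 + v2]))  # False: third vector is v1 + v2
```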
Importance of Linear Independence in Bases
- Linear independence is a crucial property for bases of vector spaces
- Ensures that each vector in the basis is not redundant and contributes to spanning the entire vector space
- Guarantees that every vector in the vector space has a unique representation as a linear combination of the basis vectors
- In a finite-dimensional vector space, a set of vectors is a basis if and only if it is linearly independent and spans the vector space
- Linear independence ensures that the set is not "too large"
- Spanning ensures that the set is "large enough" to generate all vectors in the space
- Example: In R^n, the standard basis {e1, e2, ..., en} is linearly independent and spans R^n, making it a basis for the vector space
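In an n-dimensional space, the two basis conditions collapse into one check: n vectors are both independent and spanning exactly when the square matrix with those vectors as columns is invertible. A sketch using the determinant (the helper name `is_basis` is an illustrative choice; note that determinant-based checks can be numerically delicate for large or ill-conditioned matrices):

```python
import numpy as np

def is_basis(vectors, n):
    """In an n-dimensional space, n vectors form a basis iff the matrix with
    those vectors as columns is square (n x n) and invertible."""
    A = np.column_stack(vectors)
    return A.shape == (n, n) and not np.isclose(np.linalg.det(A), 0.0)

# The standard basis of R^3 as columns of the identity matrix
standard = [np.eye(3)[:, i] for i in range(3)]

print(is_basis(standard, 3))                    # True
print(is_basis([standard[0], standard[1]], 3))  # False: two vectors cannot span R^3
```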