A vector space is a set of objects that obeys specific rules of addition and scalar multiplication.
These objects are called vectors, and the rules are:
1) Commutative and associative addition for all elements of the closed set: $\mathbf{u} + \mathbf{v} = \mathbf{v} + \mathbf{u}$ and $(\mathbf{u} + \mathbf{v}) + \mathbf{w} = \mathbf{u} + (\mathbf{v} + \mathbf{w})$.
2) Associativity and distributivity of scalar multiplication for all elements of the closed set: $a(b\mathbf{v}) = (ab)\mathbf{v}$, $a(\mathbf{u} + \mathbf{v}) = a\mathbf{u} + a\mathbf{v}$ and $(a + b)\mathbf{v} = a\mathbf{v} + b\mathbf{v}$, where $a$ and $b$ are scalars.
3) Scalar multiplication identity: $1\,\mathbf{v} = \mathbf{v}$.
4) Additive inverse: for every $\mathbf{v}$ there exists $-\mathbf{v}$ such that $\mathbf{v} + (-\mathbf{v}) = \mathbf{0}$.
5) Existence of the null vector $\mathbf{0}$, such that $\mathbf{v} + \mathbf{0} = \mathbf{v}$.
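The axioms above can be spot-checked numerically. Below is a minimal sketch (an illustration, not part of the original notes) that models vectors in $\mathbb{R}^2$ as Python tuples and verifies each rule for a few sample values:

```python
def add(u, v):
    """Component-wise vector addition."""
    return tuple(ui + vi for ui, vi in zip(u, v))

def scale(a, v):
    """Scalar multiplication."""
    return tuple(a * vi for vi in v)

u, v, w = (1.0, 2.0), (3.0, -1.0), (0.5, 4.0)
a, b = 2.0, -3.0
zero = (0.0, 0.0)

assert add(u, v) == add(v, u)                                # 1) commutativity
assert add(add(u, v), w) == add(u, add(v, w))                # 1) associativity
assert scale(a, scale(b, v)) == scale(a * b, v)              # 2) associativity of scalar mult.
assert scale(a, add(u, v)) == add(scale(a, u), scale(a, v))  # 2) distributivity
assert scale(1, v) == v                                      # 3) identity
assert add(v, scale(-1, v)) == zero                          # 4) additive inverse
assert add(v, zero) == v                                     # 5) null vector
```

Passing samples do not prove the axioms, of course; they hold here because tuple addition and scaling inherit them from the real numbers component-wise.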
In a vector space $V$, one vector can be expressed as a linear combination of other vectors in the set, e.g. $\mathbf{v} = a_1\mathbf{v}_1 + a_2\mathbf{v}_2 + \cdots + a_n\mathbf{v}_n$.
The span of a set of vectors $S$ is the set of vectors that can be written as a linear combination of vectors in the set $S$. For example, the span of the set of unit vectors $\hat{\mathbf{e}}_x$ and $\hat{\mathbf{e}}_y$ in the $\mathbb{R}^2$ space is the set of all vectors (including the null vector) in the $\mathbb{R}^2$ space. Alternatively, we say that $\hat{\mathbf{e}}_x$ and $\hat{\mathbf{e}}_y$ span $\mathbb{R}^2$.
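The unit-vector example can be made concrete: for any vector in $\mathbb{R}^2$, its components are exactly the coefficients of the linear combination of $\hat{\mathbf{e}}_x$ and $\hat{\mathbf{e}}_y$. A short sketch (illustrative, using tuples for vectors):

```python
e_x = (1.0, 0.0)  # unit vector along x
e_y = (0.0, 1.0)  # unit vector along y

def combine(a, b):
    """Return the linear combination a*e_x + b*e_y."""
    return (a * e_x[0] + b * e_y[0], a * e_x[1] + b * e_y[1])

# Any vector v is recovered by using its own components as coefficients,
# which is what it means for e_x and e_y to span R^2.
v = (2.5, -7.0)
assert combine(v[0], v[1]) == v
```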
If we can vary the scalars $a_i$ (excluding the trivial case where all scalars are zero) such that

$$a_1\mathbf{v}_1 + a_2\mathbf{v}_2 + \cdots + a_n\mathbf{v}_n = \mathbf{0}, \qquad \text{(eq. 1)}$$

the set of vectors is said to be linearly dependent, because any vector can be written as a linear combination of the others:

$$\mathbf{v}_j = -\frac{1}{a_j}\sum_{i \neq j} a_i\mathbf{v}_i. \qquad \text{(eq. 2)}$$
If the only way to satisfy eq. 1 is $a_i = 0$ for all $i$, the set of vectors is said to be linearly independent. In this case, we can no longer express any vector as a linear combination of the other vectors (as $a_j = 0$, resulting in the RHS of eq. 2 being undefined). An example of a set of linearly independent vectors is the set of unit vectors $\hat{\mathbf{e}}_x$, $\hat{\mathbf{e}}_y$ in the $\mathbb{R}^2$ space.
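For two vectors in $\mathbb{R}^2$, linear independence can be tested mechanically: the vectors are independent exactly when the $2 \times 2$ determinant of the matrix with those vectors as columns is non-zero (this equivalence is standard, though the determinant test is not stated in the notes above). A sketch:

```python
def det2(u, v):
    """Determinant of the 2x2 matrix whose columns are u and v."""
    return u[0] * v[1] - u[1] * v[0]

def independent(u, v, tol=1e-12):
    """True if only the trivial combination a1*u + a2*v gives the null vector."""
    return abs(det2(u, v)) > tol

assert independent((1.0, 0.0), (0.0, 1.0))      # unit vectors: independent
assert not independent((1.0, 2.0), (2.0, 4.0))  # collinear vectors: dependent
```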
Question
Can a set of linearly independent vectors include the zero vector?
Answer
No, because if $\mathbf{v}_1 = \mathbf{0}$ and $a_i = 0$ for all $i \neq 1$ in eq. 1, then $a_1$ can be any number. Since $a_1$ is not necessarily 0, this contradicts the definition of linear independence.
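The answer above can be demonstrated numerically (an illustrative sketch with tuples as vectors): any non-zero coefficient on the null vector yields a non-trivial combination that still equals the null vector.

```python
zero = (0.0, 0.0)   # the null vector
v1 = (1.0, 3.0)     # any other vector in the set

# Non-trivial choice of scalars: a2 is non-zero, yet the combination vanishes.
a1, a2 = 0.0, 5.0
combo = (a1 * v1[0] + a2 * zero[0], a1 * v1[1] + a2 * zero[1])
assert combo == zero  # eq. 1 is satisfied with a non-zero coefficient
```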
A set of $n$ linearly independent vectors in an $n$-dimensional vector space $V$ forms a set of basis vectors, $\{\mathbf{e}_1, \mathbf{e}_2, \ldots, \mathbf{e}_n\}$. A complete basis set is formed by a set of basis vectors of $V$ if any vector $\mathbf{v}$ in the span of $V$ can be written as a linear combination of those basis vectors, i.e.

$$\mathbf{v} = \sum_{i=1}^{n} a_i \mathbf{e}_i.$$
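Finding the expansion coefficients amounts to solving a linear system. A sketch for the 2D case (an assumed example, using Cramer's rule on a non-orthogonal basis to emphasise that the basis need not be the unit vectors):

```python
def expand_in_basis(v, b1, b2):
    """Return (a1, a2) such that a1*b1 + a2*b2 == v, via Cramer's rule."""
    det = b1[0] * b2[1] - b1[1] * b2[0]  # non-zero because b1, b2 are independent
    a1 = (v[0] * b2[1] - v[1] * b2[0]) / det
    a2 = (b1[0] * v[1] - b1[1] * v[0]) / det
    return a1, a2

# Expand v = (3, 1) in the basis {(1, 1), (1, -1)} and verify the combination.
b1, b2 = (1.0, 1.0), (1.0, -1.0)
a1, a2 = expand_in_basis((3.0, 1.0), b1, b2)
assert (a1 * b1[0] + a2 * b2[0], a1 * b1[1] + a2 * b2[1]) == (3.0, 1.0)
```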