Section 3.1 Linear dependence and independence
Let \(V\) be a vector space over a field \(F\text{,}\) let \(S\) be a subset of \(V\text{,}\) and suppose that \(S\) is linearly dependent. In the following exercise, you will prove an alternate characterization of linear dependence.
Checkpoint 3.1.1. Alternate characterization of linear dependence.
Show that \(S\) is a linearly dependent subset of a vector space \(V\) if and only if there is a proper subset \(S_0 \subsetneq S\) with \(\Span(S_0) = \Span(S).\)
Hint.
Given a nontrivial linear combination of vectors in \(S\) equaling zero, you can solve for the vector with the nonzero coefficient in terms of the remaining vectors.
Put more colloquially, if \(W = \Span(S)\) and \(S\) is linearly dependent, then you can throw away one vector from \(S\) to produce a proper subset \(S_0\) with \(W = \Span(S_0).\) As a consequence we have the following theorem.
Theorem 3.1.2.
Given a vector space \(V\text{,}\) any spanning set for \(V\) can be reduced to a linearly independent spanning set, i.e., a basis for \(V.\)
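The reduction described above can be illustrated numerically. The following is a minimal sketch of Checkpoint 3.1.1 in \(\mathbb{R}^3\) (our own illustration, not part of the text), assuming numpy is available; it uses the fact that two sets of column vectors have the same span exactly when each set, and the two sets stacked together, all have the same matrix rank.

```python
# Numerical illustration of Checkpoint 3.1.1 in R^3 (a sketch, assuming numpy).
import numpy as np

# S = {v1, v2, v3} with v3 = v1 + v2, so S is linearly dependent.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + v2

S = np.column_stack([v1, v2, v3])
S0 = np.column_stack([v1, v2])      # the proper subset S_0 = {v1, v2}

# Span(S0) = Span(S) exactly when rank(S0) = rank(S) = rank([S S0]).
rank_S = np.linalg.matrix_rank(S)
rank_S0 = np.linalg.matrix_rank(S0)
rank_both = np.linalg.matrix_rank(np.column_stack([S, S0]))

assert rank_S == rank_S0 == rank_both == 2   # same span, one fewer vector
```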
As a counterpoint, we have a statement about constructing linearly independent sets.
Theorem 3.1.3.
Let \(S\) be a linearly independent subset of a vector space \(V\text{,}\) and let \(W = \Span(S).\) If \(W\ne V\) (that is, \(W\) is a proper subspace of \(V\)), then there exists a vector \(v_0 \in V\setminus W\) (in \(V\) but not in \(W\)), so that \(S' = S\cup \{v_0\}\) is a linearly independent subset of \(V.\)
Proof.
As a hint, note that \(S\) is assumed linearly independent, so if \(S\cup \{v_0\}\) were linearly dependent, it would force \(v_0 \in \Span(S)\) (why?), contrary to the assumption that \(v_0 \notin W = \Span(S).\)
For the moment, we suppose that we know the dimension of a vector space \(V,\) say \(\dim V =n.\) We give another synopsis of the results above.
Theorem 3.1.5. Constructing bases.
Let \(V\) have finite dimension \(n,\) and let \(S\subset V.\)
If \(S\) is linearly independent, then \(\#S \le n\text{,}\) and if \(\#S \lt n\text{,}\) then \(S\) can be extended to a basis for \(V\text{,}\) that is, there is a finite subset \(T\) of \(V\) so that \(S\cup T\) is a basis for \(V.\)
If \(\#S \gt n\text{,}\) then \(S\) is linearly dependent, and there is a subset \(S_0\subsetneq S\) which is linearly independent and for which \(\Span(S_0) = \Span(S).\)
In more colloquial terms, any linearly independent subset of \(V\) can be extended to a basis for \(V,\) and any spanning set can be reduced to produce a basis.
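The extension process can be made concrete. Below is a minimal sketch (our own, under the assumption that numpy is available) of extending a linearly independent set to a basis of \(\mathbb{R}^n\): greedily adjoin standard basis vectors whenever doing so increases the rank, i.e., whenever the candidate vector lies outside the current span.

```python
# Extending a linearly independent set in R^3 to a basis (a sketch, assuming numpy).
import numpy as np

n = 3
S = [np.array([1.0, 1.0, 0.0])]    # a linearly independent set in R^3

for i in range(n):
    e_i = np.zeros(n)
    e_i[i] = 1.0                   # the i-th standard basis vector
    candidate = np.column_stack(S + [e_i])
    if np.linalg.matrix_rank(candidate) > len(S):
        S.append(e_i)              # e_i lies outside Span(S), so keep it

# S now has n linearly independent vectors, hence is a basis for R^3.
assert len(S) == n
assert np.linalg.matrix_rank(np.column_stack(S)) == n
```

Note the greedy step is exactly Theorem 3.1.3: as long as the span is a proper subspace, some vector outside it can be adjoined while preserving independence.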
As a consequence of the above, we have another important theorem.
Theorem 3.1.6.
Let \(V\) be a vector space with finite dimension \(n.\) Then any set of \(n\) linearly independent vectors in \(V\) is a basis for \(V\text{,}\) and any set of \(n\) vectors which spans \(V\) is a basis for \(V.\)
Proof.
The proofs are straightforward from the results above: if a set of \(n\) linearly independent vectors in \(V\) did not span \(V\text{,}\) you could adjoin a vector to obtain a linearly independent set with \(n+1\) elements, contradicting Theorem 3.1.5. Similarly, if \(n\) vectors spanned \(V\) but were not linearly independent, you could discard one, producing a basis with fewer than \(n\) elements, again a contradiction.
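As a sanity check of the theorem (our own numerical sketch, assuming numpy), three linearly independent vectors in \(\mathbb{R}^3\) automatically span: the matrix with those vectors as columns has full rank, so any target vector can be written as a linear combination of them.

```python
# Illustration of Theorem 3.1.6 in R^3 (a sketch, assuming numpy).
import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])   # columns: three vectors in R^3

assert np.linalg.matrix_rank(A) == 3   # the columns are linearly independent

# Independence of n vectors in an n-dimensional space forces spanning:
# any vector b is a linear combination of the columns of A.
b = np.array([2.0, 3.0, 5.0])
x = np.linalg.solve(A, b)              # coefficients of b in terms of the columns
assert np.allclose(A @ x, b)
```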
Before going further, we should make sure our intuition is on point.
Exercises
1.
Let \(A\) be an \(m\times n\) matrix. Its row space is the span of the rows of \(A\) and so is a subspace of \(F^n\text{.}\) Its column space is the span of its columns and so is a subspace of \(F^m.\)
Can any given column of a matrix always be used as part of a basis for the column space?
Hint.
Under what conditions is a set with one vector a linearly independent subset of the vector space?
Answer.
Yes, so long as the column is not the zero column: a single nonzero vector forms a linearly independent set, so any nonzero column of a matrix can be used as part of a basis for the column space.
2.
Suppose the first two columns of a matrix are nonzero. What is an easy way to check that both columns can be part of a basis for the column space?
Hint.
What does the notion of linear dependence reduce to in the case of two vectors?
Answer.
Two nonzero columns, neither of which is a scalar multiple of the other, may be used as part of a basis for the column space.
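This check can be phrased in terms of rank. The following is a small numerical sketch (our own illustration, assuming numpy): two nonzero columns are linearly independent exactly when neither is a scalar multiple of the other, i.e., when the two-column matrix has rank 2.

```python
# Checking that two columns can be part of a basis (a sketch, assuming numpy).
import numpy as np

c1 = np.array([1.0, 2.0, 0.0])
c2 = np.array([3.0, 6.0, 1.0])    # not a multiple of c1 (compare last entries)

rank = np.linalg.matrix_rank(np.column_stack([c1, c2]))
assert rank == 2   # {c1, c2} is linearly independent
```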
3.
Do you think there is an easy way to determine if the first three nonzero columns of a matrix can be part of a basis for the column space?
Hint.
Easy may be in the eye of the beholder.
Answer.
Not typically by inspection. Given that the first two columns are linearly independent, one needs to know that the third is not a linear combination of the first two. In Section 4.2 we provide answers using either elementary column operations or, perhaps surprisingly, elementary row operations.
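In the meantime, the question can be settled numerically. Here is a sketch of a rank comparison (our own illustration, assuming numpy; Section 4.2 develops the systematic method): the third column is a linear combination of the first two exactly when adjoining it does not raise the rank.

```python
# Does a third column enlarge the span of the first two? (a sketch, assuming numpy)
import numpy as np

c1 = np.array([1.0, 0.0, 1.0])
c2 = np.array([0.0, 1.0, 1.0])
c3 = np.array([1.0, 1.0, 2.0])    # equals c1 + c2, so it adds nothing to the span

rank12 = np.linalg.matrix_rank(np.column_stack([c1, c2]))
rank123 = np.linalg.matrix_rank(np.column_stack([c1, c2, c3]))

# The three columns are independent exactly when adding c3 raises the rank.
assert rank12 == 2
assert rank123 == 2   # rank did not increase: c3 is in Span(c1, c2)
```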