Section 3.4 Using Sage to answer questions of independence and dependence
In the previous section we outlined ways to build linearly independent sets in \(F^m,\) and how to determine dependencies among a set of vectors. Here we use Sage for these tasks. Necessarily there is some redundancy, since solving most of the problems we pose can be viewed from multiple perspectives.
Subsection 3.4.1 Using Sage to check if a set of vectors is linearly independent
We know that if we have \(n>m\) vectors in \(F^m,\) they are automatically linearly dependent, but what if we have \(n \le m\) vectors in \(F^m\text{?}\) They can be linearly independent or dependent. How can we test them? Here one must exercise some care. If the vectors are dependent, there is not necessarily a unique choice of a linearly independent subset.
Exercise. Suppose that \(v,w\) are nonzero vectors in a vector space \(V,\) and \(v\) is not a scalar multiple of \(w.\) It follows that \(\{v,w\}\) is a linearly independent subset of \(V.\) Show that \(S=\{v, w, v+w\}\) is a linearly dependent subset with the property that any subset of two elements of \(S\) is linearly independent.
Let’s consider a \(4\times 4\) matrix with the linear dependencies among columns hopefully evident.
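For example, one such matrix (the particular entries below are just an illustrative choice, built so that the third column is five times the first and the fourth column is twice the second) could be entered as:

# an illustrative 4x4 matrix over the rationals with evident column dependencies
A = matrix(QQ, [[1, 1,  5, 2],
                [2, 2, 10, 4],
                [1, 2,  5, 4],
                [2, 3, 10, 6]])
A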
Now since the columns are linearly dependent (by the observation above), there are nontrivial solutions to the matrix equation \(Ax=\0.\) Not surprisingly, they have the form \(-5\cdot\text{column}(1) + \text{column}(3) = 0\) and \(-2\cdot\text{column}(2) + \text{column}(4) = 0.\) In Sage we see this by:
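One possible cell, using the illustrative matrix \(A\) above, asks for the right kernel; the 'pivot' basis option reports the dependence vectors in exactly this form.

# solutions of A*x = 0; the basis vectors (-5, 0, 1, 0) and (0, -2, 0, 1)
# encode -5*column(1) + column(3) = 0 and -2*column(2) + column(4) = 0
A.right_kernel(basis='pivot')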
We could have also derived this information from the RREF:
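For instance, with the illustrative matrix \(A\) above:

# the reduced row-echelon form of A
A.rref()
# [1 0 5 0]
# [0 1 0 2]
# [0 0 0 0]
# [0 0 0 0]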
But pivots also tell us (at least one set of) linearly independent columns. Sage says
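For example:

# column indices of the pivot columns
A.pivots()   # (0, 1) for the matrix above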
Remember that Sage (Python) counts all arrays starting with zero, so this means the first and second column of \(A\) are linearly independent.
Sage can list them for us (as row vectors):
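For instance:

# the pivot columns of A, printed as row vectors
[A.column(j) for j in A.pivots()]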
Finally, Sage has a pivot_rows function which returns the pivot row positions for this matrix; these are a topmost subset of the rows that span the row space and are linearly independent. So here we will see that the topmost rows which are linearly independent are the first and third.
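For example, with the illustrative matrix above:

# row indices of a topmost linearly independent set of rows
A.pivot_rows()   # (0, 2), i.e. the first and third rows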
Subsection 3.4.2 Using Sage to check if a vector is in the span of a set
Suppose we are given vectors \(S=\{v_1, v_2, \dots, v_n\} \subset F^m\text{,}\) a vector \(b \in F^m,\) and we want to know whether \(b\in \Span(S).\)
There are certainly different approaches, some depending on a knowledge of whether \(S\) is a linearly independent set, but let’s give a simple one based on Observation 1.3.2.
That observation suggests we enter our vectors as the column vectors of a matrix \(A\text{,}\) but Sage seems to like things presented as rows (for compact notation). No problem. We’ll build a matrix \(B\) whose rows are the \(v_i\text{,}\) and let \(A = B^t,\) the transpose of \(B.\) Then \(b \in \Span(S)\) if and only if \(Ax=b\) is solvable.
So let \(S = \{v_1=(1,2,3,4),\ v_2=(5,6,7,8),\ v_3=(9,10,11,12)\}.\) We enter the vectors as rows of \(B.\)
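One way to enter them:

# the vectors v1, v2, v3 as the rows of B
B = matrix(QQ, [[1, 2,  3,  4],
                [5, 6,  7,  8],
                [9, 10, 11, 12]])
B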
We turn the rows into columns via the transpose.
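For instance:

# A has v1, v2, v3 as its columns
A = B.transpose()
A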
Pick a vector \(b.\)
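For instance (this particular vector is our choice, made so that it lies in the column space and yields the coefficients quoted below):

# one possible choice of b
b = vector(QQ, [1, 3, 5, 7])
b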
Is \(b\) in the column space?
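For example:

b in A.column_space()   # True for this choice of b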
Apparently so; give us a linear combination of the columns which equals \(b.\)
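One possible cell:

# a particular solution of A*x = b
x = A.solve_right(b)
x   # expect (9/4, -1/4, 0) for this choice of b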
This says that \(b\) is \(9/4\) times the first column minus \(1/4\) times the second. In particular, \(b\) is in the span of the first two columns.
Note that we also could have determined that \(b\) is in the column space by row reducing the augmented matrix \([A|b]:\)
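For instance:

# augment A by b and row reduce; no pivot in the last column means A*x = b is consistent
A.augment(b, subdivide=True).rref()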
We might ask whether the columns of \(A\) are linearly independent. Remember, the columns of a matrix \(A\) are linearly independent if and only if \(Ax=0\) has only the trivial solution.
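For example:

# a nonzero right kernel means the columns are dependent;
# the basis vector (1, -2, 1) says v1 - 2*v2 + v3 = 0
A.right_kernel()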
No, they are not; the dependence relation coefficients are above. So to double-check, let's make a matrix from the first two columns of \(A.\)
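For instance:

# keep only the first two columns of A
C = A.matrix_from_columns([0, 1])
C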
\(b\) will be in the column space as we saw above, but we check anyway.
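For example:

b in C.column_space()   # True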
Let’s pick another vector \(c.\)
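For instance (any vector outside the column space of \(C\) would do; this particular one is our choice):

# one possible choice of c
c = vector(QQ, [1, 0, 0, 0])
c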
Is \(c\) in the column space of \(C\text{?}\)
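For example:

c in C.column_space()   # False for this choice of c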
It is not, so if we add it to \(\{v_1, v_2\}\text{,}\) we should get a linearly independent set. So let's augment the matrix \(C\) with this new vector.
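One possible cell:

# append c as a new column and locate the pivot columns
D = C.augment(c)
D.pivots()   # (0, 1, 2): all three columns are pivot columns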
And we note that all three columns are pivot columns, hence linearly independent.
Playground space (Enter your own commands).
Subsection 3.4.3 Using Sage to understand the row and column space
Let’s look at how to use Sage to reveal (minimal) spanning sets for the row and column space of a matrix. Let’s start with a \(4\times 5\) matrix \(A\) with entries in \(\Q\) (actually in \(\Z\)), but for most things in linear algebra we want to work over a field.
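For example (the particular entries below are just one illustrative choice of such a matrix):

# an illustrative 4x5 matrix with integer entries, viewed over the rationals
A = matrix(QQ, [[1, 2, 0, 3, 1],
                [2, 4, 1, 7, 1],
                [0, 0, 1, 1, 0],
                [1, 2, 0, 3, 2]])
A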
It is always informative to know its reduced row-echelon form:
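For instance, with the illustrative matrix above:

A.rref()
# [1 2 0 3 0]
# [0 0 1 1 0]
# [0 0 0 0 1]
# [0 0 0 0 0]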
Let’s focus on the RREF and recall that there are a number of related concepts surrounding the notion of a pivot position/entry in a matrix.
One connection comes from Gaussian elimination, which is used to make the leading entry of each nonzero row equal to one. The pivot positions in the matrix \(A\) are the positions (row, column) where a leading one occurs in the RREF of \(A.\)
While we know that we can take the nonzero rows of the \(RREF(A)\) of a matrix to span the row space of \(A\text{,}\) the column space is more subtle. Of course one can take all the columns of \(A\) to be a spanning set for the column space, but it is not necessarily minimal. Below we ask Sage for the column space of \(A\text{,}\) but Sage gives it to you as the span of a nice set of vectors in the column space. How do you think those vectors were obtained?
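For example:

# Sage describes the column space as a span, using an echelonized basis
A.column_space()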
It might be more insightful to see columns of \(A\) which span the column space. Note this set is not unique, but the columns which occur here are the so-called pivot columns.
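For instance, with the illustrative matrix above (whose pivot columns are the first, third, and fifth):

# the pivot columns of A, printed as row vectors
[A.column(j) for j in A.pivots()]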
Check that the columns listed (as row vectors) are columns of \(A\) which correspond to the pivots.