
Section 4.6 Exercises (with solutions)

Exercises

1.

The matrix \(B=\ba{rrr}1 \amp 4 \amp -7 \\ -3 \amp -11 \amp 19 \\ -1 \amp -9 \amp 18\ea\) is invertible with inverse \(B^{-1}=\ba{rrr} -27 \amp -9 \amp -1 \\ 35 \amp 11 \amp 2 \\ 16 \amp 5 \amp 1 \ea \text{.}\) Since the columns of \(B\) are linearly independent, they form a basis for \(\R^3:\)
\begin{equation*} \cB=\left\{\ba{r}1\\ -3\\ -1\ea, \ba{r}4\\ -11\\ -9\ea,\ba{r}-7\\ 19\\ 18\ea\right\}. \end{equation*}
Let \(\cE\) be the standard basis for \(\R^3.\)
(a)
Suppose that a vector \(v\in \R^3\) has coordinate vector \([v]_\cB = \ba{r}1\\2\\3\ea.\)
Find \([v]_\cE.\)
Solution.
The matrix \(B\) is the change of basis matrix \([I]_\cB^\cE\) so
\begin{equation*} [v]_\cE = [I]_\cB^\cE [v]_\cB = \ba{rrr}1\amp4\amp-7 \\ -3 \amp -11 \amp 19 \\ -1 \amp -9 \amp 18\ea\ba{r}1\\2\\3\ea = \ba{r}-12\\32\\35\ea \end{equation*}
(b)
Suppose that \(T:\R^3\to \R^3\) is the linear map given by \(T(x) = Ax\) where
\begin{equation*} A = [T]_\cE = \ba{rrr}1\amp 2\amp3\\4\amp5\amp6\\7\amp8\amp9\ea. \end{equation*}
Write down an appropriate product of matrices which equals \([T]_\cB.\)
Solution.
\begin{equation*} [T]_\cB = [I]_\cE^\cB [T]_\cE [I]_\cB^\cE = B^{-1}AB. \end{equation*}
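As an optional sanity check (not part of the original solution, and assuming numpy is available), the computations in both parts can be verified numerically:

```python
import numpy as np

B = np.array([[1, 4, -7], [-3, -11, 19], [-1, -9, 18]])
B_inv = np.array([[-27, -9, -1], [35, 11, 2], [16, 5, 1]])
A = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])

# B_inv really is the inverse of B.
assert np.array_equal(B @ B_inv, np.eye(3))

# Part (a): [v]_E = B [v]_B.
v_B = np.array([1, 2, 3])
v_E = B @ v_B
assert np.array_equal(v_E, [-12, 32, 35])

# Part (b): [T]_B = B^{-1} A B; conjugating back recovers A.
T_B = B_inv @ A @ B
assert np.array_equal(B @ T_B @ B_inv, A)
```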

2.

Let \(W\) be the subspace of \(M_2(\R)\) spanned by the set \(S\text{,}\) where
\begin{equation*} S=\left\{\ba{rr}0\amp -1\\-1\amp 1\ea, \ba{rr}1\amp 2\\2\amp 3\ea, \ba{rr}2\amp 1\\1\amp 9\ea, \ba{rr}1\amp -2\\-2\amp 4\ea\right\}. \end{equation*}
(a)
Use the standard basis \(\cB=\{E_{11}, E_{12}, E_{21}, E_{22}\}\) for \(M_2(\R)\) to express each element of \(S\) as a coordinate vector with respect to the basis \(\cB.\)
Solution.
We write the coordinate vectors as the columns of a matrix:
\begin{equation*} \ba{rrrr}0\amp 1\amp 2\amp 1\\ -1\amp 2\amp 1\amp -2\\ -1\amp 2\amp 1\amp -2\\ 1\amp 3\amp 9\amp 4\ea. \end{equation*}
(b)
Determine a basis for \(W.\)
Hint.
By staring at the matrix, it is immediate that the rank is at most 3. What are the pivots?
Solution.
We start a row reduction:
\begin{align*} A\amp \mapsto \ba{rrrr}0\amp 1\amp 2\amp 1\\ -1\amp 2\amp 1\amp -2\\ 1\amp 3\amp 9\amp 4\\ 0\amp0\amp0\amp0\ea\mapsto \ba{rrrr} 1\amp 3\amp 9\amp 4\\ 0\amp 1\amp 2\amp 1\\ -1\amp 2\amp 1\amp -2\\ 0\amp0\amp0\amp0\ea\\ \amp\mapsto \ba{rrrr} 1\amp 3\amp 9\amp 4\\ 0\amp 1\amp 2\amp 1\\ 0\amp 5\amp 10\amp 2\\ 0\amp0\amp0\amp0\ea\mapsto \ba{rrrr} 1\amp 3\amp 9\amp 4\\ 0\amp 1\amp 2\amp 1\\ 0\amp 0\amp 0\amp -3\\ 0\amp0\amp0\amp0\ea. \end{align*}
Thus the pivot columns are the first, second, and fourth, so we may take the first, second and fourth elements of \(S\) as a basis for \(W.\)
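The pivot-column claim can be double-checked numerically (an optional aside, assuming numpy is available):

```python
import numpy as np

# Coordinate vectors of the elements of S, as columns.
M = np.array([[0, 1, 2, 1],
              [-1, 2, 1, -2],
              [-1, 2, 1, -2],
              [1, 3, 9, 4]])

assert np.linalg.matrix_rank(M) == 3
# Columns 1, 2, 4 are linearly independent ...
assert np.linalg.matrix_rank(M[:, [0, 1, 3]]) == 3
# ... while column 3 depends on columns 1 and 2.
assert np.linalg.matrix_rank(M[:, [0, 1, 2]]) == 2
```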

3.

Let \(A=\ba{rrr}1\amp2\amp3\\1\amp2\amp3\\1\amp2\amp3\ea\text{.}\)
(a)
Compute the rank and nullity of \(A\text{.}\)
Solution.
Too easy! It is obvious that the rank is 1 since all columns are multiples of the first. Rank-nullity tells us that the nullity is \(3-1=2.\)
(b)
Compute \(A\ba{r}1\\1\\1\ea\text{,}\) and use your answer to help conclude (without computing the characteristic polynomial) that \(A\) is diagonalizable.
Solution.
\(A\ba{r}1\\1\\1\ea= \ba{r}6\\6\\6\ea = 6\ba{r}1\\1\\1\ea,\) which means that 6 is an eigenvalue for \(A\text{,}\) and \(\ba{r}1\\1\\1\ea\) is an eigenvector.
The nullity is 2, which means that 0 is an eigenvalue and that the eigenspace corresponding to 0 (the nullspace of \(A\)) has dimension 2, so that there exists a basis of \(\R^3\) consisting of eigenvectors. Recall that by Proposition 4.4.6 the eigenvectors from different eigenspaces are linearly independent.
(c)
Determine the characteristic polynomial of \(A\) from what you have observed.
Solution.
\(\chi_A(x) = x^2 (x-6)\text{.}\) There are two eigenvalues, 0 and 6, and since the matrix is diagonalizable the algebraic multiplicities to which they occur equal their geometric multiplicities (i.e., the dimension of the corresponding eigenspaces), see Theorem 4.4.7.
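As an optional numerical aside (assuming numpy is available), the eigenvalues can be checked directly:

```python
import numpy as np

A = np.array([[1, 2, 3], [1, 2, 3], [1, 2, 3]])
eigvals = np.linalg.eigvals(A)
# Eigenvalues 0 (multiplicity 2) and 6, matching chi_A(x) = x^2 (x - 6).
assert np.allclose(np.sort(eigvals.real), [0, 0, 6], atol=1e-8)
assert np.allclose(eigvals.imag, 0)
```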
(d)
Determine a matrix \(P\) so that
\begin{equation*} \ba{rrr}6\amp0\amp0\\0\amp0\amp0\\0\amp0\amp0\ea = P^{-1}AP. \end{equation*}
Solution.
We already know that \(\ba{r}1\\1\\1\ea\) is an eigenvector for the eigenvalue 6, and since 6 occurs as the first entry of the diagonal matrix, that eigenvector should be the first column of \(P.\)
To find a basis of eigenvectors for the eigenvalue 0, we need to find the nullspace of \(A.\) It is easy to see that the reduced row-echelon form of \(A\) is
\begin{equation*} R=\ba{rrr}1\amp2\amp3\\0\amp0\amp0\\0\amp0\amp0\ea\text{,} \end{equation*}
which tells us the solutions are
\begin{equation*} \ba{r}x_1\\x_2\\x_3\ea = \ba{c}-2x_2-3x_3\\x_2\\x_3\ea = x_2\ba{r}-2\\1\\0\ea+ x_3\ba{r}-3\\0\\1\ea. \end{equation*}
We may take those two vectors (or any two linearly independent combinations of them) as the last two columns of \(P.\) So one choice for \(P\) is
\begin{equation*} P = \ba{rrr}1\amp-2\amp-3\\1\amp1\amp0\\1\amp0\amp1\ea. \end{equation*}
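That this \(P\) diagonalizes \(A\) can be confirmed numerically (an optional check, assuming numpy is available):

```python
import numpy as np

A = np.array([[1, 2, 3], [1, 2, 3], [1, 2, 3]])
P = np.array([[1, -2, -3], [1, 1, 0], [1, 0, 1]])
# P^{-1} A P should be the diagonal matrix diag(6, 0, 0).
D = np.linalg.inv(P) @ A @ P
assert np.allclose(D, np.diag([6, 0, 0]))
```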

4.

Let \(\cE_1=\{E_{11},E_{12},E_{21},E_{22}\} = \{ [\begin{smallmatrix} 1\amp0\\0\amp0 \end{smallmatrix}], [\begin{smallmatrix} 0\amp1\\0\amp0 \end{smallmatrix}], [\begin{smallmatrix} 0\amp0\\1\amp0 \end{smallmatrix}],[\begin{smallmatrix} 0\amp0\\0\amp1 \end{smallmatrix}]\}\) be the standard basis for \(M_2(\R)\text{,}\) and \(\cE_2=\{1,x,x^2,x^3\}\) the standard basis for \(\mathcal P_3(\R)\text{.}\) Let \(T:M_2(\R) \to \mathcal P_3(\R)\) be defined by
\begin{equation*} T([ \begin{smallmatrix} a\amp b\\c\amp d \end{smallmatrix}]) = 2a + (b-d)x -(a+c)x^2 + (a+b-c-d)x^3. \end{equation*}
(a)
Find the matrix of \(T\) with respect to the two bases: \([T]_{\cE_1}^{\cE_2}.\)
Solution.
The columns of the matrix \([T]_{\cE_1}^{\cE_2}\) are the coordinate vectors \([T(E_{ij})]_{\cE_2},\) so
\begin{equation*} [T]_{\cE_1}^{\cE_2} = \ba{rrrr}2\amp0\amp0\amp0\\ 0\amp1\amp0\amp-1\\-1\amp0\amp-1\amp0\\1\amp1\amp-1\amp-1\ea. \end{equation*}
(b)
Determine the rank and nullity of \(T.\)
Solution.
It is almost immediate that the first three columns of the matrix are pivot columns (think RREF), so the rank is at least three. Then we notice that the last column is \(-1\) times the second, which means the rank is at most three. Thus the rank is 3 and, by rank-nullity, the nullity is \(4-3=1.\)
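The rank computation can be checked numerically (optional, assuming numpy is available):

```python
import numpy as np

# The matrix [T] computed in part (a).
M = np.array([[2, 0, 0, 0],
              [0, 1, 0, -1],
              [-1, 0, -1, 0],
              [1, 1, -1, -1]])
rank = np.linalg.matrix_rank(M)
assert rank == 3               # rank of T
assert M.shape[1] - rank == 1  # nullity, by rank-nullity
```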
(c)
Find a basis of the image of \(T.\)
Solution.
The first three columns of \([T]_{\cE_1}^{\cE_2}\) are a basis for the column space of the matrix, but we recall that they are coordinate vectors and the codomain is \(P_3(\R),\) so a basis for the image is:
\begin{equation*} \{2-x^2+x^3, x+x^3, -x^2 - x^3\}. \end{equation*}
(d)
Find a basis of the kernel of \(T.\)
Solution.
Since
\begin{equation*} T([ \begin{smallmatrix} a\amp b\\c\amp d \end{smallmatrix}]) = 2a + (b-d)x -(a+c)x^2 + (a+b-c-d)x^3, \end{equation*}
we must characterize all matrices which yield the zero polynomial. We quickly deduce we must have
\begin{equation*} a = c = 0,\text{ and } b=d, \end{equation*}
so one can choose \([\begin{smallmatrix} 0\amp1\\0\amp1 \end{smallmatrix}]\) as a basis for the kernel.
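A small sketch (optional, plain Python): evaluating the coefficient formula for \(T\) confirms that the claimed kernel element maps to the zero polynomial.

```python
def T_coeffs(a, b, c, d):
    """Coefficients (constant, x, x^2, x^3) of T([[a,b],[c,d]])."""
    return (2 * a, b - d, -(a + c), a + b - c - d)

# [[0,1],[0,1]] lies in the kernel ...
assert T_coeffs(0, 1, 0, 1) == (0, 0, 0, 0)
# ... while, for example, E11 does not.
assert T_coeffs(1, 0, 0, 0) != (0, 0, 0, 0)
```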

5.

Let \(V\) be a vector space with basis \(\cB=\{v_1, \dots, v_4\}.\) Define a linear transformation by
\begin{equation*} T(v_1) = v_2,\quad T(v_2)=v_3,\quad T(v_3) = v_4, \quad T(v_4) = av_1+bv_2+cv_3+dv_4. \end{equation*}
(a)
What is the matrix of \(T\) with respect to the basis \(\cB\text{?}\)
Solution.
\([T]_\cB=\ba{rrrr}0\amp0\amp 0\amp a\\ 1\amp 0\amp 0\amp b\\ 0\amp1\amp0\amp c\\0\amp0\amp1\amp d\ea.\)
(b)
Determine necessary and sufficient conditions on \(a,b,c,d\) so that \(T\) is invertible.
Hint.
What is the determinant of \(T\text{,}\) or what happens when you row reduce the matrix?
Solution.
The determinant of the matrix is \(-a\text{,}\) so \(T\) is invertible if and only if \(a\ne 0.\) The values of \(b,c,d\) do not matter.
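The determinant claim can be spot-checked numerically (optional, assuming numpy; the sample values of \(a,b,c,d\) are arbitrary choices):

```python
import numpy as np

a, b, c, d = 5.0, 2.0, -1.0, 7.0
M = np.array([[0, 0, 0, a],
              [1, 0, 0, b],
              [0, 1, 0, c],
              [0, 0, 1, d]])
assert np.isclose(np.linalg.det(M), -a)

# With a = 0 the matrix drops to rank 3, so T is not invertible.
M[0, 3] = 0.0
assert np.linalg.matrix_rank(M) == 3
```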
(c)
What is the rank of \(T\) and how does the answer depend upon the values of \(a,b,c,d\text{?}\)
Solution.
By moving the first row to the bottom (a cyclic permutation of the rows, i.e., three row swaps), we reduce the original matrix to \(\ba{rrrr} 1\amp 0\amp 0\amp b\\ 0\amp1\amp0\amp c\\ 0\amp0\amp1\amp d\\ 0\amp0\amp 0\amp a\ea,\) which is in echelon form. If \(a=0,\) the rank is 3; otherwise it is 4.

6.

Define a map \(T:M_{m\times n}(\R) \to \R^m\) as follows: For \(A = [a_{ij}] \in M_{m\times n}(\R),\) define \(T(A) = \ba{c}b_1\\b_2\\\vdots\\b_m\ea\) where \(b_k = \sum_{j=1}^n a_{kj},\) that is, \(b_k\) is the sum of all the elements in the \(k\)-th row of \(A.\) Assume that \(T\) is linear.
(a)
Find the rank and nullity of \(T.\)
Hint.
If you find this too abstract, try an example first, say with \(m=2\) and \(n=3.\) And finding the rank is the easier first step.
Solution.
Using the standard basis \(\{E_{ij}\}\) for \(M_{m\times n}(\R)\text{,}\) we see that \(T(E_{k1}) = e_k\) where \(\{e_1, \dots, e_m\}\) is the standard basis for \(\R^m.\) Since a spanning set for \(\R^m\) is in the image of \(T,\) the map must be surjective, which means the rank is \(m.\) By rank-nullity, the nullity is \(nm-m.\)
(b)
For \(m=2,\) and \(n=3\) find a basis for the nullspace of \(T.\)
Hint.
For an element to be in the nullspace, the sum of the entries in each of its rows needs to be zero. Can you make a basis with one row in each matrix all zero?
Solution.
Consider the set
\begin{equation*} \left\{\ba{rrr}1\amp 0\amp -1\\0\amp 0\amp0\ea, \ba{rrr}0\amp 1\amp -1\\0\amp 0\amp0\ea, \ba{rrr}0\amp 0\amp 0\\1\amp 0\amp -1\ea, \ba{rrr}0\amp0\amp 0\\0\amp 1\amp -1\ea\right\} \end{equation*}
Notice that the 1 occurs in a different position in each matrix. It is now easy to show that any linear combination of these matrices which equals the zero matrix must have all coefficients equal to zero, so the set is linearly independent. Since it has the correct size (the nullity is \(nm-m = 6-2 = 4\)), it must be a basis for the nullspace.
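Both claims about this set can be verified numerically (an optional aside, assuming numpy is available):

```python
import numpy as np

# The four claimed basis matrices for the nullspace (m = 2, n = 3).
basis = [np.array([[1, 0, -1], [0, 0, 0]]),
         np.array([[0, 1, -1], [0, 0, 0]]),
         np.array([[0, 0, 0], [1, 0, -1]]),
         np.array([[0, 0, 0], [0, 1, -1]])]

# Each lies in the nullspace of T: every row sums to zero.
for B in basis:
    assert np.array_equal(B.sum(axis=1), [0, 0])

# Flattening each matrix to a vector, the set has rank 4, hence is independent.
flat = np.array([B.ravel() for B in basis])
assert np.linalg.matrix_rank(flat) == 4
```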

7.

This exercise is about determining linearly independent and spanning sets in vector spaces other than \(F^n.\) Let \(V=P_3(\R),\) the vector space of polynomials of degree at most 3 with real coefficients. Suppose that some process has handed you the set of polynomials
\begin{equation*} S=\{p_1=1+2x+3x^2+4x^3, p_2=5+6x+7x^2+8x^3, p_3=9+10x+11x^2+12x^3, p_4=13+14x+15x^2+16x^3\} \end{equation*}
We want to know whether \(S\) is a basis for \(V\) or, barring that, to extract a maximal linearly independent subset.
(a)
How can we translate this problem about polynomials into one about vectors in \(\R^n?\)
Solution.
Theorem 4.1.5 tells us that \(P_3(\R)\) is isomorphic to \(\R^4,\) and all we need to do is map a basis to a basis, but we would like a little more information at our disposal.
Let \(\cB=\{1,x,x^2,x^3\}\) be the standard basis for \(V=P_3(\R).\) Then the map
\begin{equation*} T(v)= [v]_\cB \end{equation*}
which takes a vector \(v\) to its coordinate vector is such an isomorphism. What is important is that linear dependence relations among the vectors in \(S\) are automatically reflected in linear dependence relations among the coordinate vectors.
(b)
Determine a maximal linearly independent subset of \(S.\)
Solution.
If we record the coordinate vectors of the polynomials in \(S\) as the columns of a matrix, we produce a matrix \(A\) and its RREF \(R\text{:}\)
\begin{equation*} A=\ba{rrrr}1 \amp 5 \amp 9 \amp 13 \\ 2 \amp 6 \amp 10 \amp 14 \\ 3 \amp 7 \amp 11 \amp 15 \\ 4 \amp 8 \amp 12 \amp 16\ea \mapsto R=\ba{rrrr} 1 \amp 0 \amp -1 \amp -2 \\ 0 \amp 1 \amp 2 \amp 3 \\ 0 \amp 0 \amp 0 \amp 0 \\ 0 \amp 0 \amp 0 \amp 0 \ea \end{equation*}
So we see that the first two columns are pivot columns which means \(S_0=\{p_1,p_2\}\) is a maximal linearly independent set.
We also recall that from the RREF we can read off the linear dependencies involving the other two vectors:
\begin{equation*} p_3=-p_1+2p_2 \text{ and } p_4 = -2p_1 + 3p_2. \end{equation*}
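These dependency relations are easy to verify numerically (an optional check, assuming numpy is available), using the coordinate vectors from the matrix \(A\) above:

```python
import numpy as np

# Coordinate vectors of p1..p4 with respect to {1, x, x^2, x^3}.
p1 = np.array([1, 2, 3, 4])
p2 = np.array([5, 6, 7, 8])
p3 = np.array([9, 10, 11, 12])
p4 = np.array([13, 14, 15, 16])

# The dependencies read off from the RREF.
assert np.array_equal(p3, -p1 + 2 * p2)
assert np.array_equal(p4, -2 * p1 + 3 * p2)

# The whole set spans only a 2-dimensional subspace.
A = np.column_stack([p1, p2, p3, p4])
assert np.linalg.matrix_rank(A) == 2
```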
(c)
Extend the linearly independent set from the previous part to a basis for \(P_3(\R).\)
Solution.
Since we are free to adjoin whatever vectors we like to the given set, we append the coordinate vectors of the standard basis to those of \(p_1\) and \(p_2.\) We know that \(\{p_1, p_2, 1, x, x^2, x^3\}\) is a linearly dependent spanning set, so we convert to coordinates and row reduce to find the pivots. Thus we build a matrix \(B\) and its RREF:
\begin{equation*} \ba{rrrrrr} 1 \amp 5 \amp 1 \amp 0 \amp 0 \amp 0 \\ 2 \amp 6 \amp 0 \amp 1 \amp 0 \amp 0 \\ 3 \amp 7 \amp 0 \amp 0 \amp 1 \amp 0 \\ 4 \amp 8 \amp 0 \amp 0 \amp 0 \amp 1 \ea \mapsto \ba{rrrrrr} 1 \amp 0 \amp 0 \amp 0 \amp -2 \amp \frac{7}{4} \\ 0 \amp 1 \amp 0 \amp 0 \amp 1 \amp -\frac{3}{4} \\ 0 \amp 0 \amp 1 \amp 0 \amp -3 \amp 2 \\ 0 \amp 0 \amp 0 \amp 1 \amp -2 \amp 1 \ea \end{equation*}
We see that the first four columns are pivot columns, so we may take \(\{p_1, p_2, 1, x\}\) as one such basis.

8.

Let \(A\in M_5(\R)\) be the block matrix (with off-diagonal blocks all zero) given by:
\begin{equation*} A = \ba{rrrrr} -1\amp 0\amp \\ \alpha\amp 2\\\amp \amp 3\amp 0\amp 0\\ \amp \amp \beta\amp 3\amp 0\\ \amp \amp 0\amp \gamma\amp 3 \ea. \end{equation*}
Determine all values of \(\alpha, \beta, \gamma\) for which \(A\) is diagonalizable.
Solution.
Since the matrix is lower triangular, it is easy to compute the characteristic polynomial:
\begin{equation*} \chi_A = (x+1)(x-2)(x-3)^3\text{.} \end{equation*}
The eigenspaces for \(\lambda = -1\) and \(\lambda = 2\) each have dimension 1, which equals the algebraic multiplicity, so the only question is what happens with the eigenvalue \(\lambda = 3\text{.}\) Consider the matrix \(A-3I = \ba{rrrrr} -4\amp 0\amp \\ \alpha\amp -1\\\amp \amp 0\amp 0\amp 0\\ \amp \amp \beta\amp 0\amp 0\\ \amp \amp 0\amp \gamma\amp 0 \ea.\) For the nullspace of \(A-3I\) to have dimension 3, the rank must be 2. The first two rows are linearly independent (regardless of the value of \(\alpha\)), while if either \(\beta\) or \(\gamma\) is nonzero, the rank increases beyond two. So \(\alpha\) can be anything, but \(\beta\) and \(\gamma\) must both be zero.
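The rank condition on \(A-3I\) can be tested numerically (optional, assuming numpy; the sample value of \(\alpha\) is an arbitrary choice):

```python
import numpy as np

# With beta = gamma = 0 and alpha arbitrary (here alpha = 5), the eigenspace
# for lambda = 3 has dimension 3, as diagonalizability requires.
alpha, beta, gamma = 5.0, 0.0, 0.0
A = np.array([[-1, 0, 0, 0, 0],
              [alpha, 2, 0, 0, 0],
              [0, 0, 3, 0, 0],
              [0, 0, beta, 3, 0],
              [0, 0, 0, gamma, 3]])
assert 5 - np.linalg.matrix_rank(A - 3 * np.eye(5)) == 3

# A nonzero beta shrinks the eigenspace, breaking diagonalizability.
A[3, 2] = 1.0
assert 5 - np.linalg.matrix_rank(A - 3 * np.eye(5)) == 2
```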

9.

Let \(A=\ba{rrr}3\amp 0\amp 0\\6\amp -1\amp 6\\1\amp 0\amp 2\ea \in M_3(\R).\)
(a)
Find the characteristic polynomial of \(A.\)
Solution.
\(\chi_A = \det(xI-A) = \det\left(\ba{ccc}x-3\amp0\amp0\\-6\amp x+1\amp -6\\-1\amp 0\amp x-2\ea\right)\text{.}\) Expanding along the first row shows that \(\chi_A = (x-3)(x-2)(x+1).\)
(b)
Show that \(A\) is invertible.
Solution.
Many answers are possible: \(\det A = -6 \ne 0\text{,}\) or 0 is not an eigenvalue, or one could row reduce the matrix to the identity. All show \(A\) is invertible.
(c)
Justify that the columns of \(A\) form a basis for \(\R^3.\)
Solution.
Since \(A\) is invertible, the rank of \(A\) is 3, so the column space is all of \(\R^3.\) Since three columns span a 3-dimensional space, they must be linearly independent, either by Theorem 3.1.6 or directly since the nullspace is trivial. Thus the columns form a basis.
(d)
Let \(\cB=\{v_1, v_2, v_3\}\) be the columns of \(A,\) and let \(\cE\) be the standard basis for \(\R^3.\) Suppose that \(T:\R^3 \to \R^3\) is a linear map for which \(A= [T]_\cE.\) Determine \([T]_\cB.\)
Solution.
We know that \([T]_\cB = Q^{-1} [T]_\cE Q\text{,}\) where \(Q=[I]_\cB^\cE\) is a change of basis matrix. But we see that \(Q=[I]_\cB^\cE = A\) by definition and since \([T]_\cE = A\) as well, we check that \([T]_\cB = Q^{-1} [T]_\cE Q = A^{-1} A A = A.\)
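As a final optional check (assuming numpy is available), the main computations of this exercise can be verified numerically:

```python
import numpy as np

A = np.array([[3, 0, 0], [6, -1, 6], [1, 0, 2]])

# Part (b): det A = -6, so A is invertible and its columns form a basis.
assert round(np.linalg.det(A)) == -6

# Part (a): the eigenvalues are the roots of (x-3)(x-2)(x+1).
assert np.allclose(np.sort(np.linalg.eigvals(A).real), [-1, 2, 3])

# Part (d): [T]_B = A^{-1} A A = A.
T_B = np.linalg.inv(A) @ A @ A
assert np.allclose(T_B, A)
```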