Chapter 4 Review of Core Topics
In this chapter we give a quick summary of core topics in a standard linear algebra course, up to the introduction of inner product spaces. Details should have been given in your course, but perhaps this review will offer a slightly different perspective or an interesting example. Some of the topics mentioned are more advanced, such as minimal polynomials or rational and Jordan canonical forms, so if you haven’t seen them, don’t worry.
It is useful to keep in mind a couple of overarching goals of linear algebra (the study of vector spaces and linear maps). The first is solving the problem of classification: when are two vector spaces “the same”, meaning indistinguishable as vector spaces? Even the question is probably confusing, so let’s foreshadow some of the topics and ideas in this chapter and how they bear on this question.
Mathematicians use the technical term isomorphism to describe when two objects (in our case vector spaces) are (essentially) “the same.” It is important to understand that sameness here is viewed through a particular lens: once an identification (or bijective mapping) is made between the two spaces, every vector space property that one has is present in the other. For finite-dimensional vector spaces, we have Theorem 4.1.5, which says that two finite-dimensional vector spaces \(V\) and \(W\) defined over the same field \(F\) are isomorphic if and only if \(\dim V = \dim W.\)
Let’s take a first pass at that remarkable statement. It says, in particular, that
\begin{equation*}
M_{2\times 2}(\R), P_3(\R),\text{ and }
\R^4
\end{equation*}
are all isomorphic as vector spaces over \(\R\) simply because each space has dimension 4. This can be quite confusing when you first see it. A typical reaction might be: they are not the same! After all, one can’t multiply two vectors in \(\R^4\) and get another vector in \(\R^4\text{,}\) but one can multiply two matrices in \(M_{2\times 2}(\R)\) and get another element of that set. One can multiply two elements of \(P_3(\R),\) but most likely their product will not be in \(P_3(\R).\) Those three sets are definitely not the same. Right, but nobody said they were. They are isomorphic, that is, “the same” when viewed through the lens of linear algebra.
Everyone can probably write down a one-to-one correspondence (bijection) between the elements of each set: the four entries of a matrix map naturally to a 4-tuple, and the four coefficients of a polynomial of degree at most 3 also map naturally to a 4-tuple. Once those identifications are made, the vector space operations of one translate exactly into those of the other. This notion of isomorphism is at the heart of classification.
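Concretely, one such identification (writing out the entries and coefficients just described) is
\begin{equation*}
\begin{pmatrix} a &amp; b \\ c &amp; d \end{pmatrix}
\longleftrightarrow (a,b,c,d) \longleftrightarrow a + bx + cx^2 + dx^3.
\end{equation*}
Under these maps, entrywise addition of matrices corresponds exactly to coordinatewise addition of 4-tuples and to coefficientwise addition of polynomials, and likewise for scalar multiplication. That is precisely what it means for the identifications to preserve the vector space operations.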
Another major goal of this part is to understand a fixed linear transformation \(T:V \to V\) from a vector space to itself. As you know, and as we shall review, by fixing a basis of the vector space one can associate a matrix to \(T.\) For some choices of basis the matrix may be very complicated; for others it may be very simple, such as a diagonal matrix; and for still others it may be something in between, like a block-diagonal matrix.
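As a simple illustration (an example added here for concreteness, not one drawn from a particular course): let \(T:\R^2 \to \R^2\) be given by \(T(x,y) = (y,x).\) With respect to the standard basis, the matrix of \(T\) is
\begin{equation*}
\begin{pmatrix} 0 &amp; 1 \\ 1 &amp; 0 \end{pmatrix},
\end{equation*}
while with respect to the basis \(\{(1,1),\, (1,-1)\}\) its matrix is the diagonal matrix
\begin{equation*}
\begin{pmatrix} 1 &amp; 0 \\ 0 &amp; -1 \end{pmatrix},
\end{equation*}
since \(T(1,1) = (1,1)\) and \(T(1,-1) = -(1,-1).\) The transformation is the same; only the choice of basis changes how simple its matrix looks.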
All of the topics needed to address these questions and goals are reviewed in this chapter.