Videos for Section 4.6


Section 4.6 is about a series of tricks for finding eigenvalues and sometimes eigenvectors, but mostly just eigenvalues. Once you know the eigenvalues, you can find the eigenvectors by row reduction:

1. The trace of a matrix is the sum of its diagonal entries. The trace has the property that $Tr(AB)=Tr(BA)$ for any two square matrices $A$ and $B$. If $A=PDP^{-1}$, then $$Tr(A) = Tr((PD)P^{-1}) = Tr(P^{-1}(PD))=Tr(D),$$ so the trace is the sum of the eigenvalues.
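Both facts are easy to check numerically. Here is a minimal sketch with a made-up $2 \times 2$ pair of matrices (the specific entries are just an illustration):

```python
import numpy as np

# Hypothetical example matrices
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
B = np.array([[0.0, 5.0],
              [1.0, 2.0]])

# Tr(AB) = Tr(BA) even though AB and BA are different matrices
print(np.trace(A @ B), np.trace(B @ A))

# The trace of A equals the sum of its eigenvalues (here 2 and 5, summing to 7)
print(np.trace(A), np.linalg.eigvals(A).sum())
```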

2. The determinant of a matrix isn't nearly as easy to compute as the trace, but it's almost as useful. It also has the property that $\det(AB)=\det(BA)$, so by the same argument as with traces, the determinant equals the product of the eigenvalues. The first video shows how to use traces and determinants to help compute eigenvalues.
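For a $2 \times 2$ matrix, knowing the trace (sum of eigenvalues) and the determinant (product of eigenvalues) pins the eigenvalues down completely, since the characteristic polynomial is $\lambda^2 - Tr(A)\lambda + \det(A)$. A sketch with a hypothetical matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [4.0, 3.0]])

# Characteristic polynomial: λ² - Tr(A)·λ + det(A) = 0,
# so the eigenvalues come from the quadratic formula.
tr, det = np.trace(A), np.linalg.det(A)
disc = np.sqrt(tr**2 - 4 * det)
lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2
print(lam1, lam2)  # compare with np.linalg.eigvals(A)
```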

3. Scaling a matrix, or adding a multiple of the identity to a matrix, doesn't change the eigenvectors, and does obvious things to the eigenvalues. The eigenvalues of $7A$ are 7 times the eigenvalues of $A$. The eigenvalues of $A + 5I$ are 5 more than the eigenvalues of $A$. The second video shows how to make use of this.
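Both rules can be checked directly; the matrix below is just a hypothetical example with eigenvalues $1$ and $3$:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # eigenvalues 1 and 3

# Scaling by 7 scales the eigenvalues by 7
print(np.sort(np.linalg.eigvals(7 * A)))              # 7 and 21

# Adding 5I shifts the eigenvalues by 5
print(np.sort(np.linalg.eigvals(A + 5 * np.eye(2))))  # 6 and 8
```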

4. A matrix $M$ is called block triangular if it can be partitioned into blocks $M = \begin{pmatrix} A & B \cr C & D \end{pmatrix}$ with $A$ and $D$ square and $B$ and $C$ rectangular, and with either $C=0$ (which we call upper block triangular) or with $B=0$ (lower block triangular). If both $B$ and $C$ are zero, then we say that $M$ is block diagonal. In all of these cases, we have:
i. the characteristic polynomials satisfy $p_M(\lambda) = p_A(\lambda) p_D(\lambda)$, and
ii. The eigenvalues of $M$ are the eigenvalues of $A$ and the eigenvalues of $D$.
(Note: my use of the letters $A,B,C,D$ here is a little different from that of the book.)
If $M$ is block diagonal, we can get the eigenvectors of $M$ easily from the eigenvectors of $A$ and $D$. If $M$ is only block triangular, then some of the eigenvectors are easy to find, while others are harder. The third video is all about block triangular and block diagonal matrices.
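A quick numerical check of property (ii), using small hypothetical blocks assembled with `np.block`:

```python
import numpy as np

# Upper block triangular M = [[A, B], [0, D]] with hypothetical blocks
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])   # eigenvalues 2 and 3
D = np.array([[5.0]])        # eigenvalue 5
B = np.array([[1.0],
              [4.0]])
M = np.block([[A, B],
              [np.zeros((1, 2)), D]])

# The eigenvalues of M are those of A together with those of D
print(np.sort(np.linalg.eigvals(M)))  # 2, 3, 5
```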

5. A row sum is the sum of the entries in a row. If all of the row sums are the same, or if all of the column sums are the same, then this common value is an eigenvalue. This comes up frequently in probability, and is explained in the last video.
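When every row sum equals $s$, the all-ones vector is an eigenvector with eigenvalue $s$, since multiplying by the all-ones vector just adds up each row. A sketch with a hypothetical Markov-style matrix whose rows each sum to 1:

```python
import numpy as np

# Hypothetical stochastic matrix: each row sums to 1
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

# The all-ones vector is an eigenvector with eigenvalue 1
ones = np.ones(3)
print(P @ ones)              # equals 1 * ones
print(np.linalg.eigvals(P))  # one eigenvalue is exactly 1
```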