Section 4.8 is about exponentials of matrices. At first sight this seems
absurd. How can you multiply $e$ times itself a matrix number of times?
The naive definition of exponentials doesn't work for matrices. But then again,
it doesn't work for complex numbers, either. It doesn't work for irrational
real numbers. It doesn't even work for negative integers!
In the first video, we work our way up from exponentials of positive
integers to exponentials of real and complex numbers. The key jump involves
calculus. If $a$ is a real number, we define $f(t)=e^{at}$ to be the unique function with $f'(t)=af(t)$ and $f(0)=1$, or equivalently
$e^{at}= \sum_{n=0}^\infty \frac{a^n t^n}{n!}$
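As a quick numerical sanity check (my own sketch, not from the video), here is a short Python snippet that compares a truncated version of this series against math.exp; the values $a=0.7$, $t=2$, and the 30-term cutoff are arbitrary choices.

```python
import math

def exp_series(x, terms=30):
    """Approximate e^x by the partial sum of x^n / n! up to `terms` terms."""
    total, term = 0.0, 1.0          # term starts at x^0 / 0! = 1
    for n in range(terms):
        total += term
        term *= x / (n + 1)         # turn x^n / n! into x^(n+1) / (n+1)!
    return total

a, t = 0.7, 2.0                     # arbitrary example values
print(exp_series(a * t))            # ~ 4.0552
print(math.exp(a * t))              # ~ 4.0552, matches the series
```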
The same definitions work for imaginary exponents: $f(t)=e^{ibt}$ is the unique function with $f'(t)=ibf(t)$ and $f(0)=1$, or equivalently
$e^{ibt}= \sum_{n=0}^\infty \frac{i^n b^n t^n}{n!}$
Either way, we get $e^{ibt} = \cos(bt) + i \sin(bt)$. This is a unit complex
number that rotates around at rate $b$.
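As an illustration (again my own sketch, with an arbitrary rate $b=3$), cmath.exp confirms both Euler's formula and the fact that the modulus stays equal to 1 as $t$ varies:

```python
import cmath, math

b = 3.0                                   # arbitrary rotation rate
for t in (0.0, 0.5, 1.0):
    z = cmath.exp(1j * b * t)             # e^{ibt}
    euler = complex(math.cos(b * t), math.sin(b * t))
    print(z, euler, abs(z))               # z matches cos(bt) + i sin(bt); |z| stays 1
```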
The same definitions work for complex exponents: $f(t)=e^{(a+ib)t}$ is the unique function with $f'(t)=(a+ib)f(t)$ and $f(0)=1$, or equivalently
$e^{(a+ib)t}= \sum_{n=0}^\infty \frac{(a+ib)^n t^n}{n!}$
Either way, we get $e^{(a+ib)t} = e^{at}(\cos(bt) + i \sin(bt))$.
This is a complex number that grows in size like $e^{at}$ and
rotates around at rate $b$.
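To see the growth and the rotation separately (a sketch with arbitrary values of $a$, $b$, and $t$), the modulus of $e^{(a+ib)t}$ should come out to $e^{at}$ and its argument should equal $bt$ modulo $2\pi$:

```python
import cmath, math

a, b, t = 0.2, 3.0, 1.5                   # arbitrary example values
z = cmath.exp((a + 1j * b) * t)           # e^{(a+ib)t}
print(abs(z), math.exp(a * t))            # both ~ 1.3499: the size grows like e^{at}
print(cmath.phase(z) % (2 * math.pi),     # the angle is bt (mod 2*pi):
      (b * t) % (2 * math.pi))            # rotation at rate b
```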
Now for matrices:
The same definitions still work! If $A$ is a square matrix, then
$F(t)=e^{At}$ is the unique function with $F'(t)=AF(t)$ and $F(0)=I$, or equivalently
$e^{At}= \sum_{n=0}^\infty \frac{A^n t^n}{n!}$
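Here is a minimal sketch of that series in code, checked against scipy.linalg.expm; the matrix $A$, the value $t=0.5$, and the 30-term truncation are arbitrary choices for illustration.

```python
import numpy as np
from scipy.linalg import expm

def expm_series(M, terms=30):
    """Partial sum of M^n / n!  (fine for small matrices and modest t)."""
    result = np.eye(M.shape[0])
    term = np.eye(M.shape[0])                # term starts at M^0 / 0! = I
    for n in range(1, terms):
        term = term @ M / n                  # turn M^(n-1)/(n-1)! into M^n/n!
        result = result + term
    return result

A = np.array([[0.0, -2.0],
              [2.0,  0.0]])                  # arbitrary example matrix
t = 0.5
print(expm_series(A * t))
print(expm(A * t))                           # agrees closely with the partial sum
```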
The matrix $e^{At}$ has the same eigenvectors as $A$, and its eigenvalues
are $e^{\lambda_i t}$, where $\{\lambda_i\}$ are the eigenvalues of $A$.
Some of the $\lambda_i$'s may be complex, but that's OK, since we just learned how
to take exponentials of complex numbers!
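We can check that claim numerically as well (another sketch; the matrix $A$ below, with eigenvalues $\pm 2i$, is just an arbitrary diagonalizable example): the eigenvalues of $e^{At}$ should be $e^{\lambda_i t}$, and $A$'s eigenvectors should still be eigenvectors of $e^{At}$.

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, -2.0],
              [2.0,  0.0]])                  # eigenvalues are +2i and -2i
t = 0.7
lam, V = np.linalg.eig(A)                    # eigenvalues/eigenvectors of A

print(np.exp(lam * t))                       # e^{lambda_i t}
print(np.linalg.eigvals(expm(A * t)))        # eigenvalues of e^{At}: the same numbers
# Columns of V (eigenvectors of A) are still eigenvectors of e^{At}:
print(np.allclose(expm(A * t) @ V, V @ np.diag(np.exp(lam * t))))  # True
```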