L4: Eigenvectors and Eigenvalues#

As I said before, eigenvectors are those vectors that do not change direction when a linear transformation \(A\) is applied to them. Formally, an eigenvector \(x\) of \(A\) is a vector such that \(Ax = \lambda x\) for some scalar \(\lambda\). What happens if we apply \(A\) to \(Ax\) again? The vector \(x\) is scaled once more, but its direction is maintained. Thus, \(A^2x=\lambda^2x\).
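As a quick numerical sketch (my addition, not part of the original notes), we can verify \(Ax=\lambda x\) and \(A^2x=\lambda^2 x\) with numpy:

```python
import numpy as np

# A small symmetric matrix, so eigenvalues/eigenvectors are real
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and the eigenvectors (as columns)
lam, X = np.linalg.eig(A)
x = X[:, 0]   # one eigenvector
l = lam[0]    # its eigenvalue

# A x = lambda x: the direction is unchanged, only the scale
assert np.allclose(A @ x, l * x)

# applying A again scales by lambda once more: A^2 x = lambda^2 x
assert np.allclose(A @ (A @ x), l**2 * x)
```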

Important

If \(A\) is an \(n\times n\) matrix with \(n\) independent eigenvectors, then any vector \(v\) can be written as a linear combination of the eigenvectors with some coefficients \(c\in\mathbb{R}^n\), i.e., \(v=Xc\). Thus,

\[\begin{split}Av=AXc=X\Lambda c \\\Longrightarrow A^kv=A^kXc=X\Lambda^k c\end{split}\]
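The same identity can be checked numerically (a sketch of my own, not from the notes): expand \(v\) in the eigenbasis by solving \(Xc=v\), then raise only the eigenvalues to the \(k\)-th power.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])       # eigenvalues 5 and 2, independent eigenvectors
lam, X = np.linalg.eig(A)        # columns of X are the eigenvectors

v = np.array([1.0, 5.0])
c = np.linalg.solve(X, v)        # coefficients of v in the eigenbasis: v = X c

k = 5
# A^k v computed directly ...
direct = np.linalg.matrix_power(A, k) @ v
# ... equals X (Lambda^k c): only the eigenvalues are powered
via_eigen = X @ (lam**k * c)
assert np.allclose(direct, via_eigen)
```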

Four properties of eigenvectors#

from IPython.display import Image
Image('imgs/l4-m1.png')

For the determinant part, recall the definition of the determinant of a matrix \(A\); I will just show you a beautiful picture from 3blue1brown.

from IPython.display import Image
Image('imgs/l4-m2.png', width=800)

Four more conclusions about eigenvectors#

from IPython.display import Image
Image('imgs/l4-m3.png', width=800)

The last one tells us an important fact: only real symmetric matrices have orthogonal eigenvectors. In fact, I am not sure in what cases an asymmetric matrix \(A\) also satisfies \(A^TA=AA^T\) (matrices with this property are called normal).
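One concrete case worth noting (my own example, not from the notes): a rotation matrix is asymmetric, yet it satisfies \(A^TA=AA^T\) because it is orthogonal.

```python
import numpy as np

theta = np.pi / 3
# A 2D rotation matrix: asymmetric, but A^T A = A A^T (= I, since it is orthogonal)
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert not np.allclose(A, A.T)           # not symmetric
assert np.allclose(A.T @ A, A @ A.T)     # but A^T A = A A^T
assert np.allclose(A.T @ A, np.eye(2))   # in fact orthogonal
```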

Conclusion 3: similar matrices have the same eigenvalues#

Image('imgs/l4-m4.png', width=800)
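A short numerical check of this conclusion (my addition): build \(B=M^{-1}AM\) for an arbitrary invertible \(M\) and compare the spectra.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
M = np.array([[1.0, 2.0],
              [1.0, 1.0]])      # any invertible matrix (det = -1 here)

B = np.linalg.inv(M) @ A @ M    # B is similar to A

# similar matrices share eigenvalues (their eigenvectors differ in general)
assert np.allclose(np.sort(np.linalg.eigvals(A)),
                   np.sort(np.linalg.eigvals(B)))
```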

Conclusion 4: the diagonal entries of a triangular matrix are its eigenvalues#

Image('imgs/l4-m5.png', width=800)
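Again a quick check of my own with numpy: for a triangular matrix, the eigenvalues computed numerically match the diagonal.

```python
import numpy as np

# an upper-triangular matrix: its eigenvalues are just the diagonal entries
T = np.array([[5.0, 2.0, 1.0],
              [0.0, 3.0, 4.0],
              [0.0, 0.0, 1.0]])

assert np.allclose(np.sort(np.linalg.eigvals(T)), np.sort(np.diag(T)))
```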

Conclusion 5: diagonalization speeds up the computation of matrix powers#

Assume a matrix \(A\) has \(n\) independent eigenvectors. Then we can diagonalize it as \(A=X\Lambda X^{-1}\). Thus,

\[A^k=(X\Lambda X^{-1})...(X\Lambda X^{-1})=X\Lambda^k X^{-1}\]
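This factorization is easy to verify numerically (my own sketch): raising only the diagonal \(\Lambda\) to the \(k\)-th power reproduces \(A^k\).

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam, X = np.linalg.eig(A)

k = 10
# A^k = X Lambda^k X^{-1}: only the diagonal entries are powered
Ak = X @ np.diag(lam**k) @ np.linalg.inv(X)

assert np.allclose(Ak, np.linalg.matrix_power(A, k))
```

The speedup comes from replacing repeated matrix multiplications with scalar powers of the eigenvalues, plus two fixed change-of-basis multiplications.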

Conclusion 6: a Markov matrix becomes stable after a long time#

Image('imgs/l4-m6.png', width=800)
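To see this stability concretely (a numpy sketch of my own, not from the notes): repeatedly applying a Markov matrix drives any initial distribution to the eigenvector of \(\lambda = 1\).

```python
import numpy as np

# a column-stochastic (Markov) matrix: each column sums to 1
A = np.array([[0.9, 0.2],
              [0.1, 0.8]])

v = np.array([1.0, 0.0])        # initial probability distribution
for _ in range(200):            # apply the chain many times
    v = A @ v

# the steady state is the eigenvector for the eigenvalue lambda = 1
lam, X = np.linalg.eig(A)
steady = X[:, np.argmin(np.abs(lam - 1))]
steady = steady / steady.sum()  # normalize to a probability vector

assert np.allclose(v, steady)   # the iteration has converged to it
```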

For an introduction to multiplicity#

Refer to the Section 1.6 of the book.