MATH 240

Wed. November 6th, 2019


We’re concerned with calculating eigenvalues.
Which matrices have the same eigenvalues?


Similar Matrices

Definition. Two $n\times n$ matrices $A$ and $B$ are similar if there is some invertible matrix $P$ such that $A=PBP^{-1}$.

If $A$ is similar to $B$, then $B$ is similar to $A$.
$$\begin{aligned} A&=PBP^{-1}\\ P^{-1}AP&=P^{-1}PBP^{-1}P\\ P^{-1}AP&=B\\ P^{-1}A(P^{-1})^{-1}&=B \end{aligned}$$ The last line shows $B=QAQ^{-1}$ with $Q=P^{-1}$ invertible, so $B$ is similar to $A$.


Theorem. If $A$ and $B$ are similar matrices, then $A$ and $B$ have the same characteristic equation, and thus have the same eigenvalues (with the same multiplicities).

Proof.
We need to show that $\det(A-\lambda I)=\det(B-\lambda I)$, as this would mean they have the same characteristic equation.
$$\begin{aligned} A&=PBP^{-1}\\ A-\lambda I&=PBP^{-1}-\lambda I\\ &=PBP^{-1}-\lambda(PP^{-1})\\ &=PBP^{-1}-P(\lambda I)P^{-1}\\ &=P(B-\lambda I)P^{-1}\\ \det(A-\lambda I)&=\det(P(B-\lambda I)P^{-1})\\ &=\det(P)\,\det(B-\lambda I)\,\det(P^{-1})\\ &=\det(P)\,\det(P^{-1})\,\det(B-\lambda I)\\ &=\det(PP^{-1})\,\det(B-\lambda I)\\ &=\det(I)\,\det(B-\lambda I)\\ &=\det(B-\lambda I). \end{aligned}$$ QED.

Note: Just because $A$ and $B$ have the same eigenvalues doesn’t mean they have the same eigenvectors.
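
As a quick example (the specific matrices are my own, chosen just for illustration): let $B=\begin{pmatrix}1&0\\0&2\end{pmatrix}$ and $P=\begin{pmatrix}1&1\\0&1\end{pmatrix}$, so that $A=PBP^{-1}=\begin{pmatrix}1&1\\0&2\end{pmatrix}$. Both $A$ and $B$ have eigenvalues $1$ and $2$, but for $\lambda=2$ an eigenvector of $B$ is $(0,1)$ while an eigenvector of $A$ is $(1,1)$.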


Matrix Powers

$A^k$ is the result of multiplying a square matrix $A$ by itself $k$ times. This turns out to have many useful applications, so we want a way of computing it for large values of $k$ that’s quicker than repeated matrix multiplication.

If $D$ is a diagonal matrix, then $D^k$ is the matrix formed by raising the entries on the diagonal of $D$ to the $k$th power. This is a lot easier to compute naively than powers of non-diagonal matrices.
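
For example (the entries are arbitrary): if $D=\begin{pmatrix}2&0\\0&3\end{pmatrix}$, then $D^k=\begin{pmatrix}2^k&0\\0&3^k\end{pmatrix}$, so $D^{10}=\begin{pmatrix}1024&0\\0&59049\end{pmatrix}$ with no matrix multiplication needed.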


Diagonalizable Matrices

Definition. A matrix $A$ is said to be diagonalizable if $A$ is similar to a diagonal matrix.

$$\begin{aligned} A&=PDP^{-1}\\[4pt] A^2&=(PDP^{-1})(PDP^{-1})\\ &=PD(P^{-1}P)DP^{-1}\\ &=PDDP^{-1}\\ &=PD^2P^{-1}\\[4pt] A^3&=A^2A\\ &=(PD^2P^{-1})(PDP^{-1})\\ &=PD^2(P^{-1}P)DP^{-1}\\ &=PD^3P^{-1}\\ &\;\;\vdots \end{aligned}$$ By induction, $A^k=PD^kP^{-1}$, where $A$ is a diagonalizable matrix and $D$ is the diagonal matrix it’s similar to.
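
Continuing the illustrative example from the note above: with $A=\begin{pmatrix}1&1\\0&2\end{pmatrix}$, $P=\begin{pmatrix}1&1\\0&1\end{pmatrix}$, and $D=\begin{pmatrix}1&0\\0&2\end{pmatrix}$, $$A^k=PD^kP^{-1}=\begin{pmatrix}1&1\\0&1\end{pmatrix}\begin{pmatrix}1&0\\0&2^k\end{pmatrix}\begin{pmatrix}1&-1\\0&1\end{pmatrix}=\begin{pmatrix}1&2^k-1\\0&2^k\end{pmatrix},$$ which agrees with direct multiplication for small $k$ (e.g. $A^2=\begin{pmatrix}1&3\\0&4\end{pmatrix}$).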


Theorem: An $n\times n$ matrix $A$ is diagonalizable if and only if $A$ has $n$ linearly independent eigenvectors.

Theorem: An $n\times n$ matrix with $n$ distinct eigenvalues is diagonalizable. (This follows from another theorem stating that eigenvectors corresponding to distinct eigenvalues are linearly independent.)

To determine whether $A$ is diagonalizable, we can find the eigenvalues of $A$ by solving $\det(A-\lambda I)=0$ for $\lambda$ and then check whether they are distinct. If they are, then $A$ is definitely diagonalizable. Otherwise, we have to find eigenvectors for each eigenvalue and see whether we can get $n$ linearly independent eigenvectors in total. Either way, determining whether a matrix is diagonalizable is a completely algorithmic process.
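
For the repeated-eigenvalue case, a standard pair of small examples (my own, for illustration): $\begin{pmatrix}1&1\\0&1\end{pmatrix}$ has eigenvalue $1$ with multiplicity $2$ but only one linearly independent eigenvector, $(1,0)$, so it is not diagonalizable; the identity matrix $\begin{pmatrix}1&0\\0&1\end{pmatrix}$ also has eigenvalue $1$ with multiplicity $2$, but it has two linearly independent eigenvectors, so it is diagonalizable (it is already diagonal).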


Ex. Is this matrix diagonalizable?
$$\begin{pmatrix} 5&0&3&4\\0&7&2&1\\0&0&0&1\\0&0&0&4 \end{pmatrix}$$ Yes. Because the matrix is upper triangular, its eigenvalues are just the entries on its diagonal: $5$, $7$, $0$, and $4$. These four eigenvalues are all distinct, so the matrix is diagonalizable.

(Note that this matrix is non-invertible, since $0$ is an eigenvalue; non-invertible matrices can still be diagonalizable.)