left and right eigenvalues

On the Wikipedia article on stochastic matrices there is a claim that the left and right eigenvalues of a square matrix are the same. I tried looking this up, but could only find an explanation for Hermitian matrices, which have real eigenvalues. Is the claim correct?


Solution 1:

The left eigenvalues of a matrix are the zeroes of its minimal polynomial.

The right eigenvalues of a matrix are the zeroes of its minimal polynomial.
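Both statements can be checked on a concrete matrix. The sketch below uses the characteristic polynomial rather than the minimal polynomial (their root sets always coincide, since the minimal polynomial divides the characteristic polynomial and shares all its roots), and verifies that $A$ and $A^T$ have the same polynomial, hence the same zeroes:

```python
from sympy import Matrix, symbols, roots

lam = symbols('lambda')
A = Matrix([[2, 1], [0, 3]])

# Characteristic polynomials of A and its transpose agree,
# so their roots (the right and left eigenvalues) agree too.
pA = A.charpoly(lam)
pAT = A.T.charpoly(lam)
assert pA == pAT

# The common roots are the eigenvalues 2 and 3.
assert set(roots(pA.as_expr(), lam)) == {2, 3}
```

For this particular matrix the minimal and characteristic polynomials even coincide, since the eigenvalues are distinct.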

Solution 2:

In essence, it's just saying that

If $A$ is a square matrix, then its eigenvalues are equal to the eigenvalues of its transpose $A^T$, since the two matrices share the same characteristic polynomial: $\det(\lambda I-A) = \det\left((\lambda I-A)^T\right) = \det(\lambda I-A^T)$, because a matrix and its transpose have the same determinant.

To see this, let $v^TA=\gamma v^T$, where $(\gamma, v)$ is a left eigenvalue/eigenvector pair. Since $v^TA=\gamma v^T \iff A^Tv=\gamma v$, $\gamma$ is also a right eigenvalue of $A^T$. As stated above, a square matrix and its transpose have the same eigenvalues, so the left and right eigenvalues of a square matrix coincide.
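The argument above can be checked numerically: the left eigenvalues of $A$ are the ordinary (right) eigenvalues of $A^T$, and the two spectra agree. A minimal sketch with NumPy, using a random matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# Right eigenvalues: A v = lambda v
right = np.linalg.eigvals(A)
# Left eigenvalues: v^T A = gamma v^T  <=>  A^T v = gamma v
left = np.linalg.eigvals(A.T)

# Sort the (complex) eigenvalues consistently and compare
order = lambda z: (z.real, z.imag)
assert np.allclose(sorted(right, key=order), sorted(left, key=order))
```

The sort is needed because `eigvals` returns the eigenvalues of $A$ and $A^T$ in no guaranteed common order.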

Solution 3:

It can also be proved as follows, assuming $A$ is diagonalizable:

Write $AV=V \Lambda$, where the columns of $V$ are the right eigenvectors of $A$ and $\Lambda$ is the diagonal matrix of (right) eigenvalues. Multiplying both sides on the right by $V^{-1}$ gives $A=V\Lambda V^{-1}$; multiplying that on the left by $V^{-1}$ gives $V^{-1}A=\Lambda V^{-1}$. The rows of $V^{-1}$ are therefore left eigenvectors of $A$, each with the same eigenvalue as the corresponding right eigenvector. This shows that the eigenvalues for left and right eigenvectors are the same.
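The identities in this derivation can be verified directly; the sketch below assumes a generic random matrix, which is diagonalizable with probability one:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))  # generic, hence diagonalizable

# Right eigendecomposition: A V = V Lambda
eigvals, V = np.linalg.eig(A)
Lam = np.diag(eigvals)
Vinv = np.linalg.inv(V)

# A = V Lambda V^{-1}  and  V^{-1} A = Lambda V^{-1}
assert np.allclose(V @ Lam @ Vinv, A)
assert np.allclose(Vinv @ A, Lam @ Vinv)

# Row i of V^{-1} is a left eigenvector with eigenvalue eigvals[i]
for i in range(4):
    w = Vinv[i]
    assert np.allclose(w @ A, eigvals[i] * w)
```

Reading $V^{-1}A=\Lambda V^{-1}$ row by row gives exactly the last assertion: each row of $V^{-1}$ satisfies $w A = \lambda w$.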