Are the eigenvalues of $AB$ equal to the eigenvalues of $BA$? (Citation needed!)
Solution 1:
If $v$ is an eigenvector of $AB$ with a nonzero eigenvalue $\lambda$, then $Bv\ne0$ (otherwise $\lambda v=ABv=0$ would force $v=0$), and $$\lambda Bv=B(ABv)=(BA)Bv,$$ so $Bv$ is an eigenvector of $BA$ with the same eigenvalue. If $0$ is an eigenvalue of $AB$, then $0=\det(AB)=\det(A)\det(B)=\det(B)\det(A)=\det(BA)$, so $0$ is also an eigenvalue of $BA$.
More generally, Jacobson's lemma in operator theory states that for any two bounded operators $A$ and $B$ acting on a Hilbert space $H$ (or more generally, for any two elements of a Banach algebra), the non-zero points of the spectrum of $AB$ coincide with those of the spectrum of $BA$.
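As a quick sanity check (not a proof) of the finite-dimensional statement, you can compare the two spectra numerically; the random matrices, the seed, and the size $n=5$ below are arbitrary choices for illustration.

```python
# Compare the eigenvalues of AB and BA for random square matrices.
import numpy as np

rng = np.random.default_rng(0)   # arbitrary seed, for reproducibility
n = 5
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

eig_AB = np.sort_complex(np.linalg.eigvals(A @ B))
eig_BA = np.sort_complex(np.linalg.eigvals(B @ A))

print(np.allclose(eig_AB, eig_BA))  # expected: True (up to rounding)
```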
Solution 2:
It is true that the eigenvalues (counting multiplicity) of $AB$ are the same as those of $BA$.
This is a corollary of Theorem 1.3.22 in the second edition of "Matrix Analysis" by Horn and Johnson, which is Theorem 1.3.20 in the first edition.
Paraphrasing from the cited Theorem: If $A$ is an $m$ by $n$ matrix and $B$ is an $n$ by $m$ matrix with $n \geq m$ then the characteristic polynomial $p_{BA}$ of $BA$ is related to the characteristic polynomial $p_{AB}$ of $AB$ by $$p_{BA}(t) = t^{n-m} p_{AB}(t).$$
In your case, $n = m$, so $p_{BA} = p_{AB}$ and it follows that the eigenvalues (counting multiplicity) of $AB$ and $BA$ are the same.
You can see Horn and Johnson's proof in the cited book; a similar proof was given in this answer by Maisam Hedyelloo.
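If you want to see the rectangular version of the theorem in action, here is a small NumPy sketch (an illustration, not a proof); `np.poly` returns the coefficients of the characteristic polynomial, and multiplying by $t^{n-m}$ just appends $n-m$ zero coefficients.

```python
# Illustrate p_BA(t) = t^(n-m) p_AB(t) for A (m x n), B (n x m), n >= m.
import numpy as np

rng = np.random.default_rng(1)   # arbitrary seed
m, n = 3, 5
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))

p_AB = np.poly(A @ B)            # m + 1 coefficients, highest degree first
p_BA = np.poly(B @ A)            # n + 1 coefficients

# Multiplying p_AB by t^(n-m) appends n - m zero coefficients.
p_AB_shifted = np.concatenate([p_AB, np.zeros(n - m)])

print(np.allclose(p_BA, p_AB_shifted))  # expected: True (up to rounding)
```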
Solution 3:
Here is an alternative proof for this result, following Exercises 6.2.8-9 of Hoffman & Kunze's Linear Algebra (p. 190):
Lemma: Let $A,B\in M_n(\mathbb{F})$, where $\mathbb{F}$ is an arbitrary field. If $I-AB$ is invertible, then so is $I-BA$, and
$$(I-BA)^{-1}=I+B(I-AB)^{-1}A.$$
Proof of Lemma: Since $I-AB$ is invertible,
\begin{align} &I=(I-AB)(I-AB)^{-1}=(I-AB)^{-1}-AB(I-AB)^{-1}\\ &\implies (I-AB)^{-1} = I+ AB(I-AB)^{-1}. \end{align}
Then we have
\begin{align} I+B(I-AB)^{-1}A&= I+B[I+ AB(I-AB)^{-1}]A= I+BA+BAB(I-AB)^{-1}A\\ \implies I&=I+B(I-AB)^{-1}A-BA-BAB(I-AB)^{-1}A\\ &=I[I+B(I-AB)^{-1}A]-BA[I+B(I-AB)^{-1}A]\\ &=(I-BA)[I+B(I-AB)^{-1}A]. \end{align}
Since a one-sided inverse of a square matrix is automatically two-sided, $I-BA$ is invertible with the stated inverse. $\checkmark$
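The formula in the lemma is easy to test numerically (over $\mathbb{R}$, say); this is only a sanity check of the algebra above, with random matrices for which $I-AB$ is almost surely invertible.

```python
# Check (I - BA)^(-1) == I + B (I - AB)^(-1) A for random real matrices.
import numpy as np

rng = np.random.default_rng(2)   # arbitrary seed
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
I = np.eye(n)

lhs = np.linalg.inv(I - B @ A)
rhs = I + B @ np.linalg.inv(I - A @ B) @ A

print(np.allclose(lhs, rhs))  # expected: True
```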
Proposition: $\forall A,B\in M_n(\mathbb{F}):$ $AB$ and $BA$ have the same eigenvalues.
Proof: Let $\alpha\in\mathbb{F}$ be an eigenvalue of $AB$. If $\alpha=0$, then $0=\det(0I-AB)=\det(-A)\det(B)=\det(B)\det(-A)=\det(0I-BA)$ and so $0$ is an eigenvalue of $BA$ also.
Otherwise $\alpha\neq0$. Suppose $\alpha$ is not an eigenvalue of $BA$. Then $0\neq\det(\alpha I-BA)=\alpha^n\det(I-(\frac{1}{\alpha}B)A)$, so $\det(I-(\frac{1}{\alpha}B)A)\neq0$ and $I-(\frac{1}{\alpha}B)A$ is invertible. By the lemma above, $I-A(\frac{1}{\alpha}B)$ is invertible as well, meaning $0\neq\det(I-A(\frac{1}{\alpha}B))=\det(I-\frac{1}{\alpha}AB)$, and hence $0\neq\det(\alpha I-AB)$. But we assumed $\alpha$ to be an eigenvalue of $AB$, $\unicode{x21af}$.
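For what it's worth, the two determinant manipulations used above (pulling out $\alpha^n$, and the transfer of invertibility between $I-(\frac{1}{\alpha}B)A$ and $I-A(\frac{1}{\alpha}B)$) can also be checked numerically; the scalar $\alpha=2.5$ and the random matrices below are arbitrary.

```python
# Check det(alpha I - BA) = alpha^n det(I - (B/alpha) A), and that
# I - (B/alpha)A and I - A(B/alpha) are singular/invertible together.
import numpy as np

rng = np.random.default_rng(3)   # arbitrary seed
n = 4
alpha = 2.5                      # arbitrary nonzero scalar
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
I = np.eye(n)

lhs = np.linalg.det(alpha * I - B @ A)
rhs = alpha**n * np.linalg.det(I - (B / alpha) @ A)
print(np.isclose(lhs, rhs))      # expected: True

d1 = np.linalg.det(I - (B / alpha) @ A)
d2 = np.linalg.det(I - A @ (B / alpha))
print(np.isclose(d1, d2))        # in fact these determinants are equal
```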
Solution 4:
Notice that if $\lambda$ is an eigenvalue of $AB$, then $\det(AB-\lambda I)=0$, which implies that $$\det(A^{-1})\det(AB-\lambda I)\det(B^{-1})=\det(A^{-1}(AB-\lambda I)B^{-1})=\det((B-\lambda A^{-1})B^{-1})=\det(I-\lambda A^{-1}B^{-1})=0.$$ This further implies that $$\det(BA-\lambda I)=\det(BA(I-\lambda A^{-1}B^{-1}))=\det(BA)\det(I-\lambda A^{-1}B^{-1})=0,$$ i.e., $\lambda$ is an eigenvalue of $BA$. This proof only works for invertible matrices $A$ and $B$, though. For singular matrices you can show that $0$ is a common eigenvalue, but I can't think of a way to show that the rest of the eigenvalues are equal.
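Again just as a numerical illustration of the chain above for invertible $A$ and $B$ (random matrices are almost surely invertible); the scalar below stands in for $\lambda$ and is arbitrary.

```python
# Check the two determinant identities used above, for invertible A, B.
import numpy as np

rng = np.random.default_rng(4)   # arbitrary seed
n = 4
lam = 1.7                        # arbitrary scalar standing in for lambda
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
I = np.eye(n)
Ainv, Binv = np.linalg.inv(A), np.linalg.inv(B)

d1 = np.linalg.det(Ainv @ (A @ B - lam * I) @ Binv)
d2 = np.linalg.det(I - lam * Ainv @ Binv)
print(np.isclose(d1, d2))        # A^{-1}(AB - lam I)B^{-1} = I - lam A^{-1}B^{-1}

d3 = np.linalg.det(B @ A) * d2
d4 = np.linalg.det(B @ A - lam * I)
print(np.isclose(d3, d4))        # det(BA) det(I - lam A^{-1}B^{-1}) = det(BA - lam I)
```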