How can you prove a matrix has the same eigenvalues as its transpose without using determinants?

The eigenvalues of $M$ are the numbers $\lambda$ such that $M - \lambda I$ has non-trivial kernel, i.e. the numbers for which the square matrix $M-\lambda I$ cannot be inverted. But $$(M-\lambda I)^T = M^T - \lambda I^T = M^T-\lambda I$$

and so if $M - \lambda I$ is not invertible, then $(M-\lambda I)^T = M^T - \lambda I$ is not invertible either, since a square matrix is invertible if and only if its transpose is (thanks to Marc for the suggestions!). Hence $\lambda$ is also an eigenvalue of $M^T$, as we wanted to show.
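As a quick numerical sanity check of the claim (an illustration, not a proof), one can compare the spectra of a random matrix and its transpose with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))  # an arbitrary example matrix

# Eigenvalues may be complex; sort both spectra the same way
# (by real part, then imaginary part) before comparing.
ev_M = np.sort_complex(np.linalg.eigvals(M))
ev_MT = np.sort_complex(np.linalg.eigvals(M.T))

# The two spectra agree up to floating-point error.
assert np.allclose(ev_M, ev_MT)
```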


The eigenvalues of a square matrix $A$ are the roots of its minimal polynomial. Since for any polynomial $P\in K[X]$ the matrix $P[A^\top]$ is the transpose of $P[A]$, one of these is zero if and only if the other is; hence $A$ and $A^\top$ have the same minimal polynomial, and therefore the same eigenvalues.
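The key lemma here, that transposing commutes with evaluating a polynomial at a matrix, can be spot-checked numerically (for one arbitrary example polynomial, chosen here for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
I = np.eye(4)

def P(X):
    # Example polynomial P(X) = X^2 + 3X + 2I.
    return X @ X + 3 * X + 2 * I

# P(A^T) equals P(A)^T, since (X^k)^T = (X^T)^k term by term.
assert np.allclose(P(A.T), P(A).T)
```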

Or even simpler: $\lambda$ is an eigenvalue of $A$ if and only if $A-\lambda I$ fails to be invertible, which happens if and only if its transpose $A^\top-\lambda I$ fails to be invertible (clearly, if a square matrix $M$ has an inverse, the transpose of that inverse is an inverse of $M^\top$, and vice versa).
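The parenthetical fact, that the transpose of the inverse is an inverse of the transpose, can also be checked numerically on a random (almost surely invertible) matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))  # almost surely invertible

M_inv = np.linalg.inv(M)

# (M^{-1})^T M^T = (M M^{-1})^T = I^T = I.
assert np.allclose(M_inv.T @ M.T, np.eye(4))

# So inverting and transposing commute.
assert np.allclose(np.linalg.inv(M.T), M_inv.T)
```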