The eigenvalues of a rotation matrix are complex numbers. I understand that they cannot be real numbers: when you rotate something, no direction stays the same.

My question
What is the intuition for the eigenvalues being complex? Why do they exist at all for rotation matrices? It is not the case that whenever a calculation is impossible the result is complex (dividing by 0 is not possible at all: the result is not real, but it is not complex either!). The complex numbers seem to cover some middle ground here, but I don't understand how and why they come into play; there don't seem to be any square roots of negative numbers being taken...


Solution 1:

Heuristically, I suppose we could see this as coming from the fact that the standard action of the complex numbers on $V = \mathbb R^{2n}$ is by rotation. That is, if $(e_1, \ldots, e_{2n})$ is a basis for $V$, then we define multiplication by $i$ as

$ i e_{2k-1} = e_{2k}, \quad i e_{2k} = - e_{2k-1},$ for $k = 1, \ldots, n$,

so multiplying a vector $v$ by a complex number $\lambda$ corresponds to a scaling by the real number $|\lambda|$ composed with a rotation.
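
To make this concrete in the simplest case $n = 1$, here is a small numerical sketch (using numpy; the matrix name `J` is my own, not standard notation): multiplication by $i$ becomes a quarter-turn matrix, and multiplication by $\lambda = a + bi$ becomes $aI + bJ$, which factors as a scaling by $|\lambda|$ composed with a rotation by $\arg\lambda$.

```python
import numpy as np

# Multiplication by i on R^2 (the case n = 1): i*e1 = e2, i*e2 = -e1.
J = np.array([[0.0, -1.0],
              [1.0,  0.0]])

e1, e2 = np.eye(2)
assert np.allclose(J @ e1, e2)    # i * e1 = e2
assert np.allclose(J @ e2, -e1)   # i * e2 = -e1

# Multiplication by lambda = a + b*i acts as a*I + b*J.
a, b = 1.0, 1.0                   # lambda = 1 + i, an arbitrary example
M = a * np.eye(2) + b * J

# This equals scaling by |lambda| composed with rotation by arg(lambda).
r, phi = np.hypot(a, b), np.arctan2(b, a)
R = np.array([[np.cos(phi), -np.sin(phi)],
              [np.sin(phi),  np.cos(phi)]])
assert np.allclose(M, r * R)
```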

Now, if we have a rotation $A$ on the space $V$ and we want to find a line $l$ "invariant" under $A$, then we can look for a complex number $\lambda$ such that the action of $A$ on $l$ coincides with the action of $\lambda$ on $l$. Thus, we are led to look for complex eigenvalues $\lambda$ of $A$.
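
As a sketch of this in dimension $2$ (numpy again; the angle $0.7$ is an arbitrary choice): the eigenvalues of the rotation by $\theta$ come out as the unit complex numbers $e^{\pm i\theta}$, and a complex eigenvector spans exactly such an invariant complex line.

```python
import numpy as np

theta = 0.7                                   # arbitrary rotation angle
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

vals, vecs = np.linalg.eig(A)

# The eigenvalues are exactly exp(+i*theta) and exp(-i*theta).
expected = np.array([np.exp(1j * theta), np.exp(-1j * theta)])
assert np.allclose(np.sort_complex(vals), np.sort_complex(expected))

# A acts on the complex line spanned by an eigenvector v
# as multiplication by the eigenvalue lambda.
v, lam = vecs[:, 0], vals[0]
assert np.allclose(A @ v, lam * v)
```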

This line of "reasoning" completely breaks down in the odd-dimensional case, because we can't define a complex structure on an odd-dimensional space (and indeed a rotation of $\mathbb R^3$ always fixes its axis, so it has the real eigenvalue $1$), but it might give a hint as to why we'd look for complex eigenvalues at all. Then it becomes a matter of algebra to check that it actually works, and that it does so in a vector space of any finite dimension.
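
Here is a quick numerical illustration of the odd-dimensional remark (a sketch; rotation about the $z$-axis is just one convenient choice of rotation):

```python
import numpy as np

theta = 0.7
# Rotation about the z-axis of R^3.
A = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

vals = sorted(np.linalg.eigvals(A), key=lambda z: abs(z.imag))

# One eigenvalue is the real number 1: the axis is a genuinely
# invariant real line, which no complex structure can rotate away.
assert np.isclose(vals[0], 1.0)

# The other two are the conjugate pair exp(+/- i*theta).
pair = [np.exp(-1j * theta), np.exp(1j * theta)]
assert np.allclose(np.sort_complex(vals[1:]), np.sort_complex(pair))
```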

Finally, for the existence: as Alex already pointed out, we look for eigenvalues by finding the roots of the characteristic polynomial. Every nonconstant polynomial has a root over the complex numbers (the fundamental theorem of algebra), which translates into the existence of a complex eigenvalue.
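
A worked instance in dimension $2$ shows exactly where the square roots of negative numbers, which the question was missing, come in. For the rotation by an angle $\theta$, the characteristic polynomial is

$$\det\begin{pmatrix} \cos\theta - \lambda & -\sin\theta \\ \sin\theta & \cos\theta - \lambda \end{pmatrix} = \lambda^2 - 2\lambda\cos\theta + 1,$$

whose discriminant is $4\cos^2\theta - 4 = -4\sin^2\theta \le 0$, so the quadratic formula gives

$$\lambda = \cos\theta \pm \sqrt{-\sin^2\theta} = \cos\theta \pm i\,\sin\theta = e^{\pm i\theta}.$$

The roots are honest complex numbers of modulus $1$: a rotation "scales" every invariant complex line by a unit complex number.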