Geometric argument that operators on $\mathbb{R}^3$ have an eigenvalue?
This question came up when trying to find a $3\times3$ real matrix $A$ such that
- $Ax$ is nonzero for nonzero $x$
- $Ax$ is orthogonal to $x$ for any $x$ in $\mathbb{R}^3$
We know such a matrix cannot exist because $A$ must have a real eigenvalue (and thus there is some nonzero $x$ such that either $Ax = 0$ or $Ax$ is parallel to $x$).
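For a concrete illustration: any skew-symmetric matrix satisfies the second condition, e.g.
$$A=\begin{pmatrix}0&-1&0\\ 1&0&0\\ 0&0&0\end{pmatrix}$$
gives $x\cdot Ax=0$ for every $x$, but it fails the first condition, since $A(0,0,1)^{T}=0$; indeed $(0,0,1)^{T}$ is an eigenvector with eigenvalue $0$.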
However,
Is there a nice, purely geometric way to justify that every operator on $\mathbb{R}^3 $ has a (real) eigenvalue?
To clarify: I'm looking for an intuitive way to visualize why an eigenvalue must exist in this case. In particular, no polynomials and no determinants are allowed!
If $A\colon\mathbb{R}^3\to\mathbb{R}^3$ were a linear map with no eigenvectors, then $x\mapsto Ax/\Vert Ax\Vert$ would give a continuous map of the unit sphere to itself with no fixed points and taking no point to its antipode, contradicting the hairy ball theorem.
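To spell out how the hairy ball theorem enters (one standard way to see it): for $x$ on the unit sphere set
$$f(x)=\frac{Ax}{\lVert Ax\rVert},\qquad v(x)=f(x)-\bigl(f(x)\cdot x\bigr)\,x.$$
Then $v(x)$ is orthogonal to $x$, hence tangent to the sphere at $x$, and it vanishes only if $f(x)=\pm x$, i.e., only at a fixed point or at a point sent to its antipode. So under the assumption above, $v$ would be a continuous nowhere-vanishing tangent vector field on $S^2$, which the hairy ball theorem forbids.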
Using just linear algebra and no determinants, the closest argument to what you ask for that I know of uses the fact that polynomials of degree 3 with real coefficients have real roots:
Let's argue by contradiction, and assume that $A$ has no real eigenvalues. First, I claim that any $v$ is contained in a proper $A$-invariant subspace of ${\mathbb R}^3$. This is clear if $v=0$. Otherwise, consider $v,Av,A^2v,A^3v$. Being four vectors in ${\mathbb R}^3$, they are linearly dependent, so there are real coefficients $a,b,c,d$, not all zero, with $(aA^3+bA^2+cA+dI)v=0$.
Now we examine the polynomial $p(x)=ax^3+bx^2+cx+d$. If $a\ne0$, $p$ has a real root $r$, and we can write $p(x)=(x-r)(ax^2+ex+f)$ for some real numbers $e,f$. This gives us $(A-rI)(aA^2+eA+fI)v=0$, but $A-rI$ is invertible (since $r$ is not an eigenvalue of $A$), so in fact $(aA^2+eA+fI)v=0$. If $a=0$, then directly we have $(bA^2+cA+dI)v=0$.
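For concreteness, the coefficients can be written out: expanding $(x-r)(ax^{2}+ex+f)$, comparing coefficients, and using $p(r)=0$ gives $e=b+ar$ and $f=c+br+ar^{2}$. Since $p(x)=(x-r)(ax^{2}+ex+f)$ is a coefficient-by-coefficient identity of polynomials, it remains valid when $x$ is replaced by the matrix $A$:
$$0=p(A)v=(A-rI)\bigl(aA^{2}+(b+ar)A+(c+br+ar^{2})I\bigr)v.$$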
This shows that $v$ is contained in a proper $A$-invariant subspace: consider the span $S$ of $v,Av$, and note that the argument above shows that either the coefficient of $A^2$ is nonzero, so $A^2v$ lies in $S$ and it follows that $S$ is $A$-invariant, or else that coefficient is $0$, in which case the relation reduces to $(cA+dI)v=0$ with $c,d$ not both zero, making $v$ an eigenvector of $A$ (if $c\ne0$) or forcing $v=0$ (if $c=0$), a contradiction either way.
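To spell out the invariance check in the first case: dividing by the nonzero coefficient of $A^2$, the relation reads $A^{2}v=\alpha Av+\beta v$ for some reals $\alpha,\beta$, and then for any $\lambda,\mu$,
$$A(\lambda v+\mu Av)=\lambda Av+\mu A^{2}v=\mu\beta\,v+(\lambda+\mu\alpha)\,Av\in S.$$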
So any non-zero $v$ lies in an $A$-invariant plane $P_v$ (the invariant subspace above cannot be a line, since $A$ has no eigenvectors). Now, if $w$ is a vector not in $P_v$, then any vector in $P_v\cap P_w$ is mapped by $A$ into both planes, hence to another vector in the same line, so $P_v\cap P_w$ is an $A$-invariant line, i.e., $A$ has a real eigenvalue after all.
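The intersection is indeed a line: since $w\in P_w$ but $w\notin P_v$, we have $P_v+P_w={\mathbb R}^3$, so the dimension formula gives
$$\dim(P_v\cap P_w)=\dim P_v+\dim P_w-\dim(P_v+P_w)=2+2-3=1.$$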
(Note that I presented the argument as a contradiction for brevity, but it can be rearranged as a direct proof.)
David Milovich found a nice way of extending this argument, giving a short "determinant-free" proof that $n\times n$ matrices with real entries, with $n$ odd, admit a real eigenvalue; see this blog post of mine.
I was interested in this because this is the base case of a nice inductive argument (that actually can be traced back to one of Gauss' first proofs of the fundamental theorem of algebra) that allows us to show that any square matrix with real coefficients admits a (perhaps complex) eigenvalue, from which we can deduce the fundamental theorem of algebra. I refer to this in the post above, but it comes from "The fundamental theorem of algebra and linear algebra" by Harm Derksen, American Mathematical Monthly, 110 (7) (2003), 620-623. (The issue is that in that paper, the odd-dimensional case is done by appealing to determinants.)