Do all square matrices have eigenvectors?

I came across a video lecture in which the professor stated that there may or may not be any eigenvectors for a given linear transformation.

But I had previously thought every square matrix has eigenvectors.


It depends on the field over which we're working. For example, the real matrix

$$A=\begin{pmatrix}0&\!\!-1\\1&0\end{pmatrix}$$

has no eigenvalues at all over $\Bbb R$, yet the very same matrix, regarded over the complex field $\Bbb C$, has two eigenvalues: $\pm i$.
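A quick numerical illustration of this, sketched with NumPy (whose `eigvals` routine always works over $\Bbb C$, so it happily reports the complex eigenvalues of a real matrix):

```python
import numpy as np

# The rotation matrix above, entered with real entries.
A = np.array([[0.0, -1.0],
              [1.0, 0.0]])

# NumPy computes eigenvalues over the complex numbers:
# for this matrix they are +i and -i (in some order).
eigenvalues = np.linalg.eigvals(A)
print(eigenvalues)

# Neither eigenvalue has zero imaginary part, i.e. none is real.
print(np.any(np.isclose(eigenvalues.imag, 0)))  # False
```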


Over an algebraically closed field, every square matrix has an eigenvalue. For instance, every complex matrix has an eigenvalue. Every real matrix has an eigenvalue, but it may be complex.

In fact, a field $K$ is algebraically closed iff every matrix with entries in $K$ has an eigenvalue. You can use the companion matrix to prove one direction. In particular, the existence of eigenvalues for complex matrices is equivalent to the fundamental theorem of algebra.
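The companion-matrix direction can be made concrete: the companion matrix of a monic polynomial has that polynomial as its characteristic polynomial, so its eigenvalues are exactly the polynomial's roots. A small sketch in NumPy (the `companion` helper below is written out by hand for illustration):

```python
import numpy as np

def companion(coeffs):
    """Companion matrix of the monic polynomial
    x^n + c[n-1] x^(n-1) + ... + c[0], with coeffs = [c0, ..., c_{n-1}]."""
    n = len(coeffs)
    C = np.zeros((n, n))
    C[1:, :-1] = np.eye(n - 1)        # ones on the subdiagonal
    C[:, -1] = -np.asarray(coeffs)    # last column holds -c0, ..., -c_{n-1}
    return C

# x^2 + 1 has no real root, so its companion matrix has no real
# eigenvalue; over C its eigenvalues are the roots +-i.
C = companion([1, 0])
print(C)                      # the rotation matrix [[0, -1], [1, 0]]
print(np.linalg.eigvals(C))   # +-i
```

So a polynomial without roots in $K$ immediately yields a matrix over $K$ without eigenvalues, which is why algebraic closure is exactly the right condition.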


No, but you can build some.

A matrix over a given field (or even commutative ring) may or may not have eigenvectors. It has eigenvectors if and only if it has eigenvalues, by definition. The characteristic polynomial provides an easy characterization: the eigenvalues are exactly the roots of the characteristic polynomial. Thus a matrix has eigenvectors if and only if its characteristic polynomial has at least one root in the field. For example, the following matrix over $\mathbb{R}$ has no eigenvectors, because its characteristic polynomial $X^2+1$ has no real root: $$ \begin{pmatrix} 0 & -1 \\ 1 & 0 \\ \end{pmatrix} $$ This is a rotation matrix: it represents a planar transformation that sends every vector to a vector making a fixed angle with the original (a right angle, in this case), so in particular the image can never be parallel to the original.

Thus it is certainly possible for a matrix not to have any eigenvectors. However, given a matrix over a field, it is possible to construct a larger field in which the matrix has eigenvectors. Any extension field in which the characteristic polynomial has at least one root will do. In particular, in an algebraically closed field such as $\mathbb{C}$, every matrix has at least one eigenvalue and therefore has eigenvectors. For example, the matrix above, when taken as a matrix over $\mathbb{C}$, has the eigenvalues $i$ and $-i$ and eigenvectors of the form $\{(\pm i z,z) \mid z\in\mathbb{C}\}$.
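These eigenpairs are easy to verify numerically; a sketch using NumPy's complex arrays, taking $z = 1$ in the family above:

```python
import numpy as np

A = np.array([[0, -1],
              [1, 0]], dtype=complex)

# For eigenvalue i, eigenvectors have the form (i*z, z); take z = 1.
v_plus = np.array([1j, 1])
# For eigenvalue -i, eigenvectors have the form (-i*z, z); take z = 1.
v_minus = np.array([-1j, 1])

# Check A v = lambda v for both pairs.
print(np.allclose(A @ v_plus, 1j * v_plus))     # True
print(np.allclose(A @ v_minus, -1j * v_minus))  # True
```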


Take a look at the matrix $$A=\begin{bmatrix}0 &1\\-1 & 0\end{bmatrix}.$$

This matrix has characteristic polynomial $p(\lambda) = \lambda^2 + 1$, which has no real roots; equivalently, for every real $\lambda$ the matrix $A-\lambda I$ has rank $2$, so $A$ has no real eigenvectors.
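The rank claim can be spot-checked numerically for a few real values of $\lambda$ (a sketch with NumPy; sampling is of course an illustration, not a proof — the proof is that $\det(A-\lambda I)=\lambda^2+1>0$ for all real $\lambda$):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
I = np.eye(2)

# det(A - lambda*I) = lambda^2 + 1 > 0 for every real lambda,
# so A - lambda*I is always invertible, i.e. has full rank 2.
for lam in [-2.0, -0.5, 0.0, 1.0, 3.0]:
    print(lam, np.linalg.matrix_rank(A - lam * I))  # rank is always 2
```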


The question is equivalent to asking whether every polynomial has a root.

Every square matrix has a characteristic polynomial. Over the real numbers, not every polynomial has a real root, so not every real matrix has an eigenvalue–eigenvector pair. Over the complex numbers, every nonconstant polynomial has at least one root (indeed, that's why complex numbers were originally used), so every complex matrix has at least one eigenpair.
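This root–eigenvalue correspondence can be illustrated numerically, using NumPy's `roots` on the characteristic polynomial $\lambda^2+1$ of the rotation matrix $\begin{pmatrix}0&-1\\1&0\end{pmatrix}$ (a sketch, not a proof of the equivalence):

```python
import numpy as np

# Coefficients of lambda^2 + 1, highest degree first, as np.roots expects.
roots = np.roots([1, 0, 1])
print(roots)  # the two complex roots +-i; neither is real

# They coincide with the eigenvalues of the corresponding matrix.
A = np.array([[0.0, -1.0],
              [1.0, 0.0]])
print(np.linalg.eigvals(A))
```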