How Do I Compute the Eigenvalues of a Small Matrix?

If I have a $2\times 2$ or $3\times 3$ matrix, how should I go about computing the eigenvalues and eigenvectors of the matrix?

NB: I am making this question to provide a unified answer to questions about eigenvalues of small matrices so that all of the specific examples that come up can be marked as duplicates of this post. See here.


Solution 1:

Here's a cool way to compute eigenvalues and eigenvectors of matrices. Unfortunately, it requires solving a degree-$n$ polynomial, where the matrix is $n\times n$, so it's not suited for large matrices, but for many problems it is sufficient. Additionally, there are some conditions on the matrix that make it doable for larger matrices.

Let's suppose you have a matrix $A$ over some field $F$. When $v\neq 0$ is an eigenvector, $v$ satisfies $Av=\lambda v$ for some $\lambda\in F$ (note that $\lambda=0$ is allowed; it is $v$ that must be nonzero). Thus $Av-\lambda v = 0$, where $0$ is the zero vector, so $(A-\lambda I)v = 0$. If $\det(A-\lambda I)\neq 0$, then $A-\lambda I$ would be invertible, and multiplying both sides by the inverse would give $v=0$, a contradiction. So eigenvalues are exactly the $\lambda$ for which $\det(A-\lambda I)=0$.

By considering $\lambda$ as a variable, we can take the determinant and produce a polynomial of degree $n$ over $F$, known as the characteristic polynomial of the matrix $A$. It is commonly denoted $p_A(\lambda)$. This polynomial has several interesting properties, but what is relevant to us is that its zeros are exactly the eigenvalues of $A$. For small cases, this gives us a surefire way to find the eigenvalues of a matrix. For larger matrices, this polynomial is not necessarily solvable by radicals, but it is still worth looking at, as some of its roots might be obvious. Additionally, under some circumstances it will have roots that we can solve for.
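For the $2\times 2$ case, the characteristic polynomial is simply $p_A(\lambda)=\lambda^2-\operatorname{tr}(A)\lambda+\det(A)$, so the quadratic formula produces both eigenvalues directly. Here is a minimal sketch of that computation (the function name and test matrix are just illustrative):

```python
import numpy as np

def eigenvalues_2x2(A):
    """Eigenvalues of a 2x2 matrix via its characteristic polynomial.

    For a 2x2 matrix, p_A(lambda) = lambda^2 - tr(A)*lambda + det(A),
    so the quadratic formula gives both roots directly.
    """
    tr = A[0, 0] + A[1, 1]
    det = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]
    # cast to complex: the discriminant may be negative (non-real eigenvalues)
    disc = np.sqrt(complex(tr * tr - 4 * det))
    return (tr + disc) / 2, (tr - disc) / 2

A = np.array([[4.0, 1.0], [2.0, 3.0]])
# characteristic polynomial is lambda^2 - 7*lambda + 10, with roots 5 and 2
print(eigenvalues_2x2(A))
```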

Once we have obtained however many eigenvalues we are able to compute, $\{\lambda_1,\dots,\lambda_m\}$, we can then directly find the corresponding eigenvectors by looking at the equation $Av=\lambda_i v$. This gives rise to a system of equations that has infinitely many solutions (as any scalar multiple of an eigenvector is again an eigenvector), and all of the nonzero solutions are eigenvectors of $A$ corresponding to $\lambda_i$.
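Numerically, finding the solutions of $(A-\lambda_i I)v=0$ amounts to computing a null-space basis of $A-\lambda_i I$. One standard way to do that is via the SVD: the right-singular vector belonging to a zero singular value spans the null space. A rough sketch (the helper name and tolerance are my own choices, not from the original answer):

```python
import numpy as np

def eigenvector_for(A, lam, tol=1e-10):
    """Return one eigenvector of A for a known eigenvalue lam,
    as a null-space vector of (A - lam*I) extracted from the SVD."""
    M = A - lam * np.eye(A.shape[0])
    _, s, vt = np.linalg.svd(M)
    # singular values are sorted descending; the last one should be ~0
    # exactly when lam really is an eigenvalue of A
    if s[-1] > tol:
        raise ValueError("lam does not appear to be an eigenvalue of A")
    return vt[-1]  # right-singular vector for the smallest singular value

A = np.array([[4.0, 1.0], [2.0, 3.0]])
v = eigenvector_for(A, 5.0)  # A v = 5 v, proportional to (1, 1)
```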

The reason why this approach doesn't work in general is that, by the Abel–Ruffini theorem, polynomials of degree five or higher have no general solution in radicals.


Here's an example computation (taken from Wikipedia).

The eigenvectors, $v$, of $A= \begin{bmatrix} 2 & 0 & 1\\0 & 2 & 0\\ 1 & 0 & 2\end{bmatrix}$ satisfy the equation $(A-\lambda I)\mathbf{v}=0$. This means that $$\det\left(\begin{bmatrix} 2-\lambda & 0 & 1\\0 & 2-\lambda & 0\\ 1 & 0 & 2-\lambda\end{bmatrix}\right)=0$$ or that $0=6-11\lambda+6\lambda^2-\lambda^3$. Thus we have that the characteristic polynomial is $p_A(\lambda)=\lambda^3-6\lambda^2+11\lambda-6$. The solutions to this polynomial are $\{1,2,3\}$, so those are the eigenvalues of $A$. They give rise to the eigenvectors $(1,0,-1),(0,1,0),$ and $(1,0,1)$ respectively.
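As a sanity check, this worked example can be verified numerically with `numpy.linalg.eig`, which is also the practical tool once the matrix is too large for hand computation:

```python
import numpy as np

A = np.array([[2.0, 0.0, 1.0],
              [0.0, 2.0, 0.0],
              [1.0, 0.0, 2.0]])
vals, vecs = np.linalg.eig(A)
# eigenvalues come out in no particular order; sorted they are 1, 2, 3
# (up to floating-point error)
print(np.sort(vals.real))
# each column of vecs is a normalized eigenvector; the one for eigenvalue 1
# is proportional to (1, 0, -1), matching the hand computation above
```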