A simple explanation of eigenvectors and eigenvalues with 'big picture' ideas of why on earth they matter

To understand why you encounter eigenvalues/eigenvectors everywhere, you must first understand why you encounter matrices and vectors everywhere.

In a vast number of situations, the objects you study and the stuff you can do with them relate to vectors and linear transformations, which are represented as matrices.

So, in many many interesting situations, important relations are expressed as $$\vec{y} = M \vec{x}$$ where $\vec{y}$ and $\vec{x}$ are vectors and $M$ is a matrix. This ranges from systems of linear equations you have to solve (which occur virtually everywhere in science and engineering) to more sophisticated engineering problems (finite element simulations). It is also the foundation for (a lot of) quantum mechanics, and it describes the typical geometric transformations you can do with vector graphics and 3D graphics in computer games.

Now, it is generally not straightforward to look at some matrix $M$ and immediately tell what it is going to do when you multiply it with some vector $\vec{x}$. Also, in the study of iterative algorithms you need to know something about higher powers of the matrix $M$, i.e. $M^k = M \cdot M \cdots M$ ($k$ times). This is awkward and costly to compute in a naive fashion.

For a lot of matrices, you can find special vectors with a very simple relationship between the vector $\vec{x}$ itself and the vector $\vec{y} = M\vec{x}$. For example, if you look at the matrix $\left( \begin{array}{cc} 0 & 1 \\ 1 & 0\end{array}\right)$, you see that the vector $\left(\begin{array}{c} 1\\ 1\end{array}\right)$, when multiplied with the matrix, will just give you that vector again!
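As a quick sanity check (a NumPy sketch, not part of the original answer), you can verify that this matrix leaves that vector unchanged:

```python
import numpy as np

# The swap matrix from the example above: it exchanges the two coordinates.
M = np.array([[0, 1],
              [1, 0]])

x = np.array([1, 1])

# Swapping the coordinates of (1, 1) gives (1, 1) back again.
print(M @ x)  # [1 1]
```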

For such a vector, it is very easy to see what $M\vec{x}$ looks like, and even what $M^k \vec{x}$ looks like, since, obviously, repeated application won't change it.

This observation is generalized by the concept of eigenvectors. An eigenvector of a matrix $M$ is any vector $\vec{x}$ that only gets scaled (i.e. just multiplied by a number) when multiplied with $M$. Formally, $$M\vec{x} = \lambda \vec{x}$$ for some number $\lambda$ (real or complex depending on the matrices you are looking at).
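In practice you rarely find eigenvectors by inspection; a library routine such as NumPy's `np.linalg.eig` computes them for you. This small sketch (my addition, not from the original answer) checks the defining relation $M\vec{x} = \lambda\vec{x}$ for the swap matrix above:

```python
import numpy as np

M = np.array([[0., 1.],
              [1., 0.]])

# eig returns the eigenvalues and the eigenvectors (as columns of a matrix).
vals, vecs = np.linalg.eig(M)

# Verify M x = lambda x for every eigenpair.
for lam, x in zip(vals, vecs.T):
    assert np.allclose(M @ x, lam * x)

# The swap matrix has eigenvalues 1 and -1 (order may vary).
print(vals)
```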

So, if your matrix $M$ describes a system of some sort, the eigenvectors are those vectors that, when they go through the system, are changed in a very easy way. If $M$, for example, describes geometric operations, then $M$ could, in principle, stretch and rotate your vectors. But eigenvectors only get stretched, not rotated.

The next important concept is that of an eigenbasis. By choosing a different basis for your vector space, you can alter the appearance of the matrix $M$ in that basis. Simply speaking, the $i$-th column of $M$ tells you what the $i$-th basis vector multiplied with $M$ looks like. If all your basis vectors are also eigenvectors, then it is not hard to see that the matrix $M$ is diagonal in that basis. Diagonal matrices are a welcome sight, because they are really easy to deal with: matrix-vector and matrix-matrix multiplication become very efficient, and computing the $k$-th power of a diagonal matrix is trivial.
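To make this concrete (a NumPy sketch of my own, using a small symmetric example matrix): if $M = P D P^{-1}$, where the columns of $P$ are eigenvectors and $D$ is diagonal, then $M^k = P D^k P^{-1}$, and $D^k$ is just elementwise powers of the eigenvalues.

```python
import numpy as np

M = np.array([[2., 1.],
              [1., 2.]])

vals, P = np.linalg.eig(M)   # columns of P form an eigenbasis of M
D = np.diag(vals)            # M expressed in that basis: diagonal

# Check the decomposition M = P D P^{-1}.
assert np.allclose(M, P @ D @ np.linalg.inv(P))

# M^k via the eigenbasis: only the diagonal entries need to be powered.
k = 5
Mk = P @ np.diag(vals**k) @ np.linalg.inv(P)
print(Mk)
```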

I think for a "broad" introduction this might suffice?


This made it clearer for me: Khan Academy - Introduction to Eigenvalues and Eigenvectors. I often find it easier to understand topics like this via illustration.


This video has a good simple explanation that also includes math. [2016, 3Blue1Brown, "Eigenvectors and eigenvalues | Essence of linear algebra, chapter 14"]

My very loose interpretation of the 'meaning' from that video is as follows:

An Eigenvector can be visualized as a line that acts like an axis of rotation: no matter how much things change around it, the direction of this line doesn't change.

I first visualize it like a fan: regardless of how much the blades spin, the central axis (the "Eigenvector") still points in the same direction.

A matrix transformation can likewise be thought of as a kind of rotation, but one that can also change the lengths of the vectors it acts on.

After a transformation, there can still be one or more vectors whose direction doesn't change (the Eigenvectors), though their length may. That change in length is the "Eigenvalue": an Eigenvalue of 2 means the Eigenvector's length doubled after the transformation, while an Eigenvalue of 1/2 means it was halved.
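The doubling/halving picture can be sketched numerically (my own illustration, using a hypothetical diagonal matrix rather than the one from the video):

```python
import numpy as np

# A diagonal transformation: doubles the x-direction, halves the y-direction.
M = np.diag([2.0, 0.5])

e1 = np.array([1.0, 0.0])   # Eigenvector with Eigenvalue 2
e2 = np.array([0.0, 1.0])   # Eigenvector with Eigenvalue 1/2

# Each Eigenvector keeps its direction; only its length is scaled.
print(M @ e1)  # [2. 0.]  -> length doubled
print(M @ e2)  # [0.  0.5] -> length halved
```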

The picture below, based on that video, illustrates the above explanation.

[Image: Illustrated explanation of Eigenvectors and Eigenvalues]