You could think of eigenvectors as giving you the coordinate system in which your problem takes its simplest form. It's analogous to working physics problems by rotating your coordinate system the right way so that some term drops out.


Suppose $\sum a_n y^{(n)} = 0$ is a homogeneous linear differential equation with constant coefficients. Let $D$ be the linear operator defined on, say, the space $C^{\infty}(\mathbb{R})$ of smooth functions $\mathbb{R} \to \mathbb{R}$ which sends a function $y$ to its derivative $y'$. Then the solutions to the above differential equation form the nullspace of the operator $\sum a_n D^n$.

It turns out that this nullspace is finite-dimensional (of dimension $n$, the order of the equation), so we can use finite-dimensional linear algebra on it. It is not just a vector space, though: it comes equipped with an action of $D$, which commutes with $\sum a_n D^n$ and therefore maps the nullspace to itself, splitting it into generalized eigenspaces of $D$. In the simplest case, these are genuine eigenspaces, all one-dimensional. Now, the eigenvectors of $D$ are precisely the functions satisfying $Dy = \lambda y$ for some eigenvalue $\lambda$, which are precisely the exponential functions $y = e^{\lambda t}$. And it's not hard to see that such a function is a solution of the differential equation if and only if $\lambda$ is a root of the characteristic polynomial $\sum a_n \lambda^n$.
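To make this concrete, here is a minimal symbolic check (in Python with sympy; the cubic equation below is just a hypothetical example): for $y''' - 6y'' + 11y' - 6y = 0$ the characteristic polynomial $\lambda^3 - 6\lambda^2 + 11\lambda - 6$ has roots $1, 2, 3$, and each $e^{\lambda t}$ is indeed a solution.

```python
import sympy as sp

t, lam = sp.symbols('t lambda')

# Hypothetical example: y''' - 6 y'' + 11 y' - 6 y = 0, whose
# characteristic polynomial is lambda^3 - 6 lambda^2 + 11 lambda - 6.
p = lam**3 - 6*lam**2 + 11*lam - 6
roots = sp.solve(p, lam)                      # [1, 2, 3]

y = sp.Function('y')
L = y(t).diff(t, 3) - 6*y(t).diff(t, 2) + 11*y(t).diff(t) - 6*y(t)

# Each eigenfunction e^{lambda t} with p(lambda) = 0 solves the equation.
for r in roots:
    assert sp.simplify(L.subs(y(t), sp.exp(r*t)).doit()) == 0
```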

Example. Simple harmonic motion is governed by the differential equation $m x'' + k x = 0$, i.e. $(m D^2 + k D^0) x = 0$, which has characteristic polynomial $m \lambda^2 + k$. The roots of this polynomial are complex, but if we allow ourselves to work with complex numbers (formally, in the above situation we tensor with $\mathbb{C}$) we find that the eigenvalues are $\lambda = \pm i \sqrt{ \frac{k}{m} }$, so the set of solutions is all functions of the form

$$x = A e^{ i \sqrt{ \frac{k}{m} } t } + B e^{-i \sqrt{ \frac{k}{m} } t}.$$

By Euler's formula, if we restrict our solutions to be real we get the familiar periodic sine and cosine.
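Spelling out that last step: writing $\omega = \sqrt{k/m}$, Euler's formula $e^{\pm i \omega t} = \cos(\omega t) \pm i \sin(\omega t)$ gives

$$x = A e^{i \omega t} + B e^{-i \omega t} = (A + B)\cos(\omega t) + i(A - B)\sin(\omega t),$$

so the real solutions are exactly $x = C \cos(\omega t) + D \sin(\omega t)$ with $C, D$ real (take $B = \bar{A}$).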

In general the characteristic polynomial will have repeated roots; then $D$ is not diagonalizable on the nullspace, and the theory of Jordan normal form applies (generalized eigenvectors contribute solutions such as $t e^{\lambda t}$). This occurs, for example, when finding the general form of critically damped harmonic motion.
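For a concrete instance of a repeated root, take the (hypothetical) critically damped oscillator $y'' + 2y' + y = 0$, with characteristic polynomial $(\lambda + 1)^2$. The eigenfunction $e^{-t}$ and the generalized eigenfunction $t e^{-t}$ together span the solution space; a quick sympy check:

```python
import sympy as sp

t = sp.symbols('t')

# Critically damped oscillator y'' + 2 y' + y = 0: the characteristic
# polynomial (lambda + 1)^2 has the repeated root lambda = -1.
L = lambda y: sp.diff(y, t, 2) + 2*sp.diff(y, t) + y

# e^{-t} is an eigenfunction of D; t e^{-t} is a generalized eigenfunction.
assert sp.simplify(L(sp.exp(-t))) == 0
assert sp.simplify(L(t * sp.exp(-t))) == 0
```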


If you have a linear system in matrix form, $dX/dt=AX$, where $X(t)$ is a vector in $\mathbf{R}^n$ and $A$ is an $n\times n$ constant real matrix, then $X(t) = \exp(\lambda t) V$ is a solution to the system if $V$ is an eigenvector of $A$ with eigenvalue $\lambda$. (This works since $\exp(\lambda t)$ is an eigenfunction of the differential operator $d/dt$ with eigenvalue $\lambda$.)

If $A$ is diagonalizable, then the general solution is a linear combination of terms like above: $X(t)=\sum_{k=1}^n c_k \exp(\lambda_k t) V_k$. (If the eigenvalues are complex, this is the general complex-valued solution; one can perform some standard tricks to extract the general real-valued solution from that.) Nondiagonalizable matrices $A$ cause a bit more trouble, but that can be handled too.
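As a sanity check, here is a small numerical sketch (numpy/scipy; the matrix $A$, initial condition, and time are made-up values): expand $X_0$ in the eigenvector basis, form $\sum_k c_k e^{\lambda_k t} V_k$, and compare against the matrix-exponential solution $X(t) = e^{At} X_0$.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical 2x2 system dX/dt = A X with a diagonalizable A.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])            # eigenvalues -1 and -2
X0 = np.array([1.0, 0.0])

lam, V = np.linalg.eig(A)               # columns of V are eigenvectors
c = np.linalg.solve(V, X0)              # coefficients with sum_k c_k V_k = X0

t = 1.5
# General solution evaluated at t: sum_k c_k exp(lam_k t) V_k
X_eig = (V * np.exp(lam * t)) @ c

# Cross-check against the matrix exponential solution X(t) = expm(A t) X0.
X_exp = expm(A * t) @ X0
assert np.allclose(X_eig, X_exp)
```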

(Of course, this is related to what has been said in other answers already, but with a system of this type one sees clearly that the eigenvectors play a role too, not just the eigenvalues.)


The collection of all functions that are differentiable at least $n$ times (let's call it $\mathcal{F}_n$ just for the purposes of this post) forms a vector space; and the operator $D$ defined by $D(f) = f'$ is linear ($D(f+g)=D(f)+D(g)$, and $D(\alpha f) = \alpha D(f)$ for all $f,g$ and scalars $\alpha$). So $D$ is a linear transformation from $\mathcal{F}_n$, the space of functions that can be differentiated at least $n$ times, to $\mathcal{F}_{n-1}$, the space of functions that can be differentiated at least $n-1$ times.

Now, suppose you have a homogeneous linear differential equation: $$ f_n(x) y^{(n)}(x) + \cdots + f_1(x)y'(x) + f_0(x)y(x) = 0.$$ Then the collection of all functions $y(x)$ that are solutions to this equation forms a subspace of $\mathcal{F}_n$: if $y_1$ and $y_2$ are solutions, then so is $y_1+\alpha y_2$ for any scalar $\alpha$, and the zero function is certainly a solution. It is, in fact, the nullspace of a certain linear transformation, namely the linear transformation $L$ given by $$L(y) = f_n(x)y^{(n)}(x)+\cdots + f_1(x)y'(x) + f_0(x)y(x).$$ So one can bring some linear algebra to bear on this problem; e.g., determining the dimension of the solution space, etc.
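For instance (a minimal sketch, using sympy and the hypothetical equation $y'' + y = 0$): any linear combination of the solutions $\cos t$ and $\sin t$ lands back in the nullspace of $L$.

```python
import sympy as sp

t, a = sp.symbols('t alpha')

# Hypothetical equation y'' + y = 0, viewed as the nullspace of L.
L = lambda y: sp.diff(y, t, 2) + y

y1, y2 = sp.cos(t), sp.sin(t)     # two solutions
# Superposition: any linear combination of solutions is again a solution.
assert sp.simplify(L(y1 + a * y2)) == 0
```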

Also, in analogy to the case of systems of linear equations, consider a *non*homogeneous linear differential equation: $$f_n(x) y^{(n)}(x) + \cdots + f_1(x)y'(x) + f_0(x)y(x) = g(x).$$ Suppose you could find a particular solution $y_p$ to this equation. If $y_h$ is any solution to the corresponding homogeneous equation, then $y_p+y_h$ is a solution to the nonhomogeneous equation as well; and if $z_p$ is another solution to the nonhomogeneous equation, then $z_p-y_p$ is a solution to the homogeneous equation. So every solution to the nonhomogeneous equation is of the form $y_p + y_h$, where $y_p$ is the particular solution you found and $y_h$ is a solution to the associated homogeneous equation. This is exactly the same thing that happens with systems of linear equations (where the unknowns are numbers).
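A small illustration (sympy; the equation $y'' + y = t$ is a made-up example): $y_p = t$ is one particular solution, and the general solution sympy returns is visibly of the form $y_p + y_h$.

```python
import sympy as sp

t = sp.symbols('t')
y = sp.Function('y')

# Hypothetical nonhomogeneous equation: y'' + y = t.
sol = sp.dsolve(y(t).diff(t, 2) + y(t) - t, y(t))
print(sol)   # y(t) = C1*sin(t) + C2*cos(t) + t  -- particular + homogeneous
```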

In many cases, you can show that any solution to your equation will actually be infinitely differentiable (e.g., if your $f$s are infinitely differentiable), so you can think of the linear transformation $L$ as a linear operator on the space of infinitely differentiable functions. Here eigenvectors and eigenvalues come into play: they give you simpler ways of thinking about a linear transformation, and hence simpler ways of thinking about this particular one (which happens to encode the solutions of a differential equation).

Also, systems of linear differential equations very naturally lead to linear transformations where the eigenvectors and eigenvalues play a key role in helping you solve the system, because they "de-couple" it: a complicated system in which each variable affects the derivatives of the others becomes, in the right coordinates, a system of new variables that are completely independent of one another (or, in the case of generalized eigenvectors, depend only on a few of the others in a simple way). This makes the system much easier to solve.
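Here is what that de-coupling looks like numerically (a sketch with a made-up symmetric matrix): in the eigenvector coordinates $Y = V^{-1} X$, the system $dX/dt = AX$ becomes $dY/dt = \Lambda Y$ with $\Lambda$ diagonal, i.e., $n$ independent scalar equations.

```python
import numpy as np

# Hypothetical coupled system dX/dt = A X.
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])          # symmetric, hence diagonalizable

lam, V = np.linalg.eig(A)

# Change of variables Y = V^{-1} X decouples the system: dY/dt = Lambda Y.
Lambda = np.linalg.inv(V) @ A @ V
assert np.allclose(Lambda, np.diag(lam))
# Each component now evolves on its own: y_k(t) = y_k(0) * exp(lam_k * t).
```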


A good reference is here: http://www.sosmath.com/diffeq/system/linear/eigenvalue/eigenvalue.html