What is the difference between Eigenvectors and the Kernel or Null Space of a matrix?

I am just wondering: what is the difference between the eigenvectors of a matrix and its kernel (or null space)?

The kernel of a matrix $A$ is the set of vectors $\mathbf x$ such that $A\mathbf x = \mathbf 0$. Isn't that what eigenvectors are too?


Notation: Let $M_{n\times m}(\Bbb R)$ denote the set of all $n\times m$ matrices with real entries.

Let $T:M_{m\times 1}(\Bbb R) \to M_{n\times 1}(\Bbb R)$ be the transformation given by $T(\mathbf x) = A\mathbf x$ where $A$ is a given $n\times m$ matrix.

Definition: The kernel of $T$ is the set of all vectors $\mathbf x$ such that $T(\mathbf x) = \mathbf 0$.

What This Means: These are all of the vectors that get mapped to $\mathbf 0$ by $T$ (and hence by $A$). Notice that $A$ does not have to be a square matrix here.
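
If you like to see this computationally, here is a minimal NumPy/SciPy sketch (the $2\times 3$ matrix is my own illustrative example, not from the answer) that finds an orthonormal basis of the kernel with `scipy.linalg.null_space` and checks that those vectors really get mapped to $\mathbf 0$:

```python
import numpy as np
from scipy.linalg import null_space

# A is 2x3, so T: R^3 -> R^2; the kernel lives in R^3 and is
# nontrivial here because rank(A) = 2 < 3.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

K = null_space(A)               # columns form an orthonormal basis of ker(T)
print(K.shape)                  # (3, 1): one basis vector
print(np.allclose(A @ K, 0.0))  # True: every kernel vector maps to 0
```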

Definition: Let $m=n$. A nonzero vector $\mathbf x\in M_{m\times 1}(\Bbb R)$ is an eigenvector of $T$ if $T(\mathbf x) = k\mathbf x$ for some $k\in\Bbb R$. In such a case, $k$ is called the eigenvalue of $T$ associated with eigenvector $\mathbf x$.

What This Means: Here's some motivation for the idea of eigenvectors. Let $A = \begin{bmatrix} 1 & 1 \\ 3 & -1\end{bmatrix}$. What does this matrix do? It's probably pretty hard to tell. But what if I told you that $A\begin{bmatrix} 1 \\ 1\end{bmatrix} = 2\begin{bmatrix} 1 \\ 1\end{bmatrix}$ and $A\begin{bmatrix} -1 \\ 3\end{bmatrix} = -2\begin{bmatrix} -1 \\ 3\end{bmatrix}$? Then maybe you start to see a picture. The matrix stretches vectors parallel to $\begin{bmatrix} 1 \\ 1\end{bmatrix}$ by a factor of $2$, and it stretches vectors parallel to $\begin{bmatrix} -1 \\ 3\end{bmatrix}$ by a factor of $2$ and flips them. That's the benefit of eigenvectors -- they give us some geometric information about what a linear transformation (or matrix) does. Notice, however, that eigenvectors are only defined when $A$ is a square matrix.
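
If you want to double-check these numbers yourself, here is a short NumPy sketch that simply verifies $A\mathbf x = k\mathbf x$ for the two vectors quoted above (the only assumption is that NumPy is available):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [3.0, -1.0]])

# The eigenpairs quoted above.
v1, k1 = np.array([1.0, 1.0]), 2.0
v2, k2 = np.array([-1.0, 3.0]), -2.0

print(np.allclose(A @ v1, k1 * v1))  # True: A stretches v1 by 2
print(np.allclose(A @ v2, k2 * v2))  # True: A stretches v2 by 2 and flips it

print(np.linalg.eigvals(A))          # the eigenvalues 2 and -2 (in some order)
```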


With the definitions out of the way, here's the relationship between the two. Let $A$ be a square matrix. Recall that the kernel of $T$ is the set of all vectors $\mathbf x$ such that $T(\mathbf x) = A\mathbf x = \mathbf 0$. But $\mathbf 0 = 0\mathbf x$. So the kernel is just the set of all eigenvectors of $T$ (or $A$) associated with the eigenvalue $0$, together with the zero vector. (If $A$ is invertible, the kernel is just $\{\mathbf 0\}$ and $0$ is not an eigenvalue at all.)
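
Here is a minimal sketch of that relationship, assuming NumPy/SciPy and using a deliberately singular matrix of my own choosing: the kernel basis vector that SciPy returns is exactly an eigenvector of $A$ with eigenvalue $0$.

```python
import numpy as np
from scipy.linalg import null_space

# A deliberately singular matrix: the second row is twice the first,
# so ker(A) is nontrivial and 0 is an eigenvalue.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

v = null_space(A)[:, 0]             # a basis vector of the kernel
print(np.allclose(A @ v, 0.0 * v))  # True: A v = 0 v, i.e. eigenvalue 0
print(np.linalg.eigvals(A))         # contains 0 (the other eigenvalue is 5)
```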


The eigenvectors of $A$ with eigenvalue $\lambda$, together with the zero vector, form the null space of $A-\lambda I$ (the eigenspace of $\lambda$). In particular, if we set $\lambda=0$, we see that the null space of $A$ is the eigenspace of the eigenvalue $0$.
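
As a rough numerical sketch (assuming NumPy/SciPy, and reusing the $2\times 2$ matrix from the other answer with $\lambda = 2$), the eigenspace can be computed directly as the null space of $A - \lambda I$:

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 1.0],
              [3.0, -1.0]])
lam = 2.0

# Eigenspace of lam = null space of (A - lam*I).
E = null_space(A - lam * np.eye(2))
v = E[:, 0]
print(v)                            # proportional to (1, 1)
print(np.allclose(A @ v, lam * v))  # True: v is an eigenvector for lam
```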


The space of eigenvectors for an eigenvalue $\lambda$ is the kernel of $(A - \lambda\cdot\text{Id})$. In other words, the vectors $v$ in that kernel satisfy $Av = \lambda v$.