Wikipedia defines an eigenvector like this:

An eigenvector of a square matrix is a non-zero vector that, when multiplied by the matrix, yields a vector that differs from the original vector at most by a multiplicative scalar.

So basically, in layman's terms: an eigenvector is a vector that, when you multiply it by a square matrix, gives back the same vector, possibly scaled by some scalar.

There are a lot of related terms, like eigenspaces, eigenvalues, and eigenbases, which I don't quite understand; in fact, I don't understand them at all.

Can someone give an explanation connecting these terms, so that it is clear what they are and why they are related?


Solution 1:

Eigenvectors are those vectors that exhibit especially simple behaviour under a linear transformation: loosely speaking, they don't bend or rotate; they simply grow (or shrink) in length (though a different interpretation of growth/shrinkage may apply if the ground field is not $\mathbb R$). If it is possible to express every other vector as a linear combination of eigenvectors (preferably if you can in fact find a whole basis made of eigenvectors), then applying the otherwise complicated linear transformation suddenly becomes easy, because with respect to a basis of eigenvectors the linear transformation is given simply by a diagonal matrix.
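
To make the diagonal-matrix remark concrete, here is a minimal numpy sketch (the matrix $A$ is an arbitrary example, chosen symmetric so that a real eigenbasis is guaranteed to exist):

```python
import numpy as np

# An example matrix; being symmetric, it is guaranteed a basis of eigenvectors.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Columns of P are eigenvectors; the eigenvalues go on the diagonal of D.
eigenvalues, P = np.linalg.eig(A)
D = np.diag(eigenvalues)

# With respect to the eigenbasis, A acts as the diagonal matrix D: A = P D P^(-1).
assert np.allclose(A, P @ D @ np.linalg.inv(P))
```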

Especially when one wants to investigate higher powers of a linear transformation, this is practically only possible for eigenvectors: if $Av=\lambda v$, then $A^nv=\lambda^nv$, and even exponentials become easy for eigenvectors: $\exp(A)v:=\sum\frac1{n!}A^n v=e^\lambda v$. By the way, the exponential functions $x\mapsto e^{cx}$ are eigenvectors of a famous linear transformation: differentiation, i.e. the map sending a function $f$ to its derivative $f'$. That's precisely why exponentials play an important role as basis solutions for linear differential equations (and even their discrete counterparts, linear recurrences like the Fibonacci numbers).
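
A quick numerical check of both identities, using the same example matrix as above (this assumes `scipy` is available for the matrix exponential):

```python
import numpy as np
from scipy.linalg import expm  # matrix exponential

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam = 3.0                     # an eigenvalue of A
v = np.array([1.0, 1.0])      # a corresponding eigenvector: A v = 3 v

# Powers: A^n v = lambda^n v
assert np.allclose(np.linalg.matrix_power(A, 5) @ v, lam**5 * v)

# Exponential: exp(A) v = e^lambda v
assert np.allclose(expm(A) @ v, np.exp(lam) * v)
```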

All other terminology is based on this notion: a (nonzero) eigenvector $v$, i.e. one such that $Av$ is a multiple of $v$, determines its eigenvalue $\lambda$ as the scalar factor such that $Av=\lambda v$. Given an eigenvalue $\lambda$, the set of eigenvectors with that eigenvalue (together with the zero vector) is in fact a subspace (i.e. sums and multiples of eigenvectors with the same(!) eigenvalue are again eigenvectors), called the eigenspace for $\lambda$. If we find a basis consisting of eigenvectors, then we may obviously call it an eigenbasis. If the vectors of our vector space are not mere number tuples (such as in $\mathbb R^3$) but rather functions, and our linear transformation is an operator (such as differentiation), it is often convenient to call the eigenvectors eigenfunctions instead; for example, $x\mapsto e^{3x}$ is an eigenfunction of the differentiation operator with eigenvalue $3$ (because its derivative is $x\mapsto 3e^{3x}$).
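
To see why the eigenspace is a subspace, one line of linearity suffices: if $Au=\lambda u$ and $Av=\lambda v$, then
$$A(\alpha u+\beta v)=\alpha Au+\beta Av=\lambda(\alpha u+\beta v),$$
so any linear combination of eigenvectors for $\lambda$ again satisfies the defining equation.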

Solution 2:

As far as I understand it, the 'eigen' in words like eigenvalue, eigenvector, etc. means something like 'own'; a better translation into English would perhaps be 'characteristic'.

Each square matrix has some special scalars and vectors associated with it. The eigenvectors are the vectors which the matrix preserves (up to scalar multiplication). As you probably know, an $n\times n$ matrix acts as a linear transformation on an $n$-dimensional space, say $F^n$. A vector and its scalar multiples form a line through the origin in $F^n$, and so you can think of the eigenvectors as indicating lines through the origin preserved by the linear transformation corresponding to the matrix.

Defn. Let $A$ be an $n\times n$ matrix over a field $F$. A nonzero vector $v\in F^n$ is an eigenvector of $A$ if $Av = \lambda v$ for some $\lambda\in F$. A scalar $\lambda\in F$ is an eigenvalue of $A$ if $Av = \lambda v$ for some nonzero $v\in F^n$.

The eigenvalues are then the factors by which these special lines through the origin are either stretched or contracted.
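
As a concrete example (the matrix is chosen just for illustration): for $A=\begin{pmatrix}2&1\\1&2\end{pmatrix}$, the eigenvalues are the roots of the characteristic polynomial $\det(A-\lambda I)=(2-\lambda)^2-1=(\lambda-1)(\lambda-3)$, so $\lambda=1$ and $\lambda=3$. The corresponding preserved lines are spanned by $(1,-1)$ and $(1,1)$: the first line is fixed, and the second is stretched by a factor of $3$.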

Solution 3:

The term "eigen" comes from german and means "proper". It means that the eigenvectors and eigenvalues have a special status with respect to the operator/matrix you are studying. In this case, it is that they are essentially invariant with respect to applications of the operator/matrix. After application you get back the same eigenvector multiplied by a factor that factor being the eigenvalue.

Since some people doubt that "proper" is an appropriate translation in this context, I'll have you know that "proper vector" is sometimes used as a synonym for "eigenvector".

So, I think the meaning here is really that of having a special or peculiar status.

Solution 4:

Matrices can be viewed as linear maps on vector spaces; in fact, when working over a nice field such as the field of real numbers, matrices give geometric transformations.

For example, reflection in the line $y=x$ in $\mathbb{R}^2$ is given by multiplication by the matrix:

$$\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$$

Now it is clear that geometrically there are certain symmetries here. For example, any point on the line $y=x$ gets sent to itself, and any point on the line $y=-x$ gets sent to the point directly opposite it through the origin.

This information is essentially what the eigenvalues and eigenvectors of the above matrix capture. The eigenvectors are the vectors on those two lines, and the eigenvalues are the corresponding scalar multiples you get after applying the reflection.

Things on the line $y=x$ get sent to themselves, i.e. $Av = v$: a scalar multiple of $1$.

Things on the line $y=-x$ get sent to the negatives of themselves, i.e. $Av = -v$: a scalar multiple of $-1$.

Thus we expect two eigenvalues, $\pm 1$, and two "eigenspaces", $V_1$ and $V_{-1}$, consisting of all eigenvectors with eigenvalue $1$ and $-1$ respectively.

These spaces consist of exactly the vectors lying on the lines $y=x$ and $y=-x$ respectively.
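
If you want to verify this numerically, here is a minimal numpy sketch (same matrix as above):

```python
import numpy as np

# Reflection in the line y = x.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)   # 1 and -1 (in some order)
print(eigenvectors)  # columns span the lines y = x and y = -x

# A point on y = x is fixed; a point on y = -x is sent to its negative.
assert np.allclose(A @ np.array([1.0, 1.0]), np.array([1.0, 1.0]))
assert np.allclose(A @ np.array([1.0, -1.0]), -np.array([1.0, -1.0]))
```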

Of course there are ways to work out these things using only the matrix (via its characteristic polynomial), but hopefully you can now see some of their significance. They come in useful in many areas of maths.