What is the mechanism of Eigenvector?

I have studied eigenvalues and eigenvectors, but what I still can't see is how eigenvectors become transformed or rotated vectors.


Matrix as a map

How does a matrix transform the locus of unit vectors?

Pick an example matrix such as $$ \mathbf{A} = \left[ \begin{array}{rr} 1 & -1 \\ 0 & 1 \\ \end{array} \right]. $$ As noted by @Widawensen, a convenient representation of the unit circle is the locus of vectors $\mathbf{S}$: $$ \mathbf{S} = \left[ \begin{array}{l} \cos \theta \\ \sin \theta \end{array} \right], \quad 0 \le \theta \lt 2\pi. $$ The matrix product shows the mapping action of the matrix $\mathbf{A}$: $$ \mathbf{A} \mathbf{S} = \left[ \begin{array}{c} \cos \theta - \sin \theta \\ \sin \theta \end{array} \right] $$
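If it helps to see this numerically, here is a minimal numpy sketch (assuming numpy is available; the variable names are mine) that pushes a sampling of the unit circle through the example matrix:

```python
import numpy as np

# Example shear matrix A
A = np.array([[1.0, -1.0],
              [0.0,  1.0]])

# Sample the unit circle: each column of S is a unit vector (cos t, sin t)
theta = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
S = np.vstack([np.cos(theta), np.sin(theta)])

# Mapping action of A on the unit circle
AS = A @ S

# First component becomes cos(t) - sin(t), second stays sin(t)
print(np.allclose(AS[0], np.cos(theta) - np.sin(theta)))   # True
print(np.allclose(AS[1], np.sin(theta)))                   # True
```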

The plots below show the colored vectors from the unit circle on the left. On the right we see how the matrix $\mathbf{A}$ changes the unit vectors.

Matrix as map

Singular value decomposition

To understand the map, we start with the singular value decomposition: $$ \mathbf{A} = \mathbf{U} \, \Sigma \, \mathbf{V}^{*} $$ The beauty of the SVD is that every matrix has a singular value decomposition (existence); the power of the SVD is that it resolves the four fundamental subspaces.

The singular value decomposition is built from an eigendecomposition of the matrix product $\mathbf{A}^{*} \mathbf{A}$. The singular values are the square roots of the nonzero eigenvalues: $$ \sigma \left( \mathbf{A} \right) = \sqrt{ \lambda \left( \mathbf{A}^{*} \mathbf{A} \right) } $$ The singular values are, by construction, positive and are customarily ordered. For a matrix of rank $\rho$, the ordering is $$ \sigma_{1} \ge \sigma_{2} \ge \dots \ge \sigma_{\rho} > 0. $$ The normalized eigenvectors of $\mathbf{A}^{*} \mathbf{A}$ are the column vectors of the domain matrix $\mathbf{V}$. The column vectors of the codomain matrix $\mathbf{U}$ are constructed via $$ \mathbf{U}_{k} = \sigma_{k}^{-1} \left[ \mathbf{A} \mathbf{V} \right]_{k}, \quad k = 1, \dots, \rho. $$
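As a sanity check on these formulas, here is a short numpy sketch (again assuming numpy; note that np.linalg.svd returns $\mathbf{V}^{*}$ as Vt):

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [0.0,  1.0]])

# Full SVD: A = U @ diag(sigma) @ Vt
U, sigma, Vt = np.linalg.svd(A)
V = Vt.T

# Singular values are the square roots of the eigenvalues of A* A
eigvals = np.linalg.eigvalsh(A.T @ A)                # ascending order
print(np.allclose(sigma, np.sqrt(eigvals[::-1])))    # True

# Columns of U rebuilt from U_k = sigma_k^{-1} (A V)_k
U_rebuilt = (A @ V) / sigma                          # divides column k by sigma_k
print(np.allclose(U_rebuilt, U))                     # True
```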

Graphically, the SVD looks like this:

Domain matrices from SVD

The first column vector is plotted in black, the second in blue. Both coordinate systems are left-handed (determinant = -1). The SVD orients these vectors to align the domain and the codomain.

Notice in the mapping action that some vectors shrink, others grow. The domain and codomain have different length scales, and this is captured in the singular values. Below, the singular values are represented as an ellipse with equation $$ \left( \frac{x}{\sigma_{1}} \right)^{2} + \left( \frac{y}{\sigma_{2}} \right)^{2} = 1. $$

Ellipse of singular values
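For this particular $\mathbf{A}$ the semi-axes work out to roughly $\sigma_1 \approx 1.618$ and $\sigma_2 \approx 0.618$, with $\sigma_1 \sigma_2 = \lvert \det \mathbf{A} \rvert = 1$; a quick check under the same numpy assumption:

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [0.0,  1.0]])

# Semi-axes of the ellipse are the singular values
sigma = np.linalg.svd(A, compute_uv=False)
print(np.round(sigma, 3))                                        # [1.618 0.618]

# Their product equals |det A| = 1 for this example
print(np.isclose(sigma[0] * sigma[1], abs(np.linalg.det(A))))    # True
```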

SVD and the map

Finally, we bring the pieces together by taking the map image $\mathbf{A}\mathbf{S}$ and overlaying the basis vectors from the codomain $\mathbf{U}$, scaled by the singular values.

Map with U vectors

The black vector is $\sigma_{1} \mathbf{U}_{1}$, blue is $\sigma_{2} \mathbf{U}_{2}$.
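One way to confirm the overlay numerically is to express each image point of $\mathbf{A}\mathbf{S}$ in the $\mathbf{U}$ basis and plug its coordinates into the ellipse equation; a sketch with the same numpy assumptions as above:

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [0.0,  1.0]])
U, sigma, Vt = np.linalg.svd(A)

# Image of the unit circle under A
theta = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
S = np.vstack([np.cos(theta), np.sin(theta)])
AS = A @ S

# Coordinates of the image points in the codomain basis U
coords = U.T @ AS

# Every image point satisfies (x / sigma1)^2 + (y / sigma2)^2 = 1
ellipse = (coords[0] / sigma[0])**2 + (coords[1] / sigma[1])**2
print(np.allclose(ellipse, 1.0))   # True
```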

Powers

Repeated applications of the map accentuate its effects. For this example $$ \mathbf{A}^{k} = \left[ \begin{array}{cr} 1 & -k \\ 0 & 1 \\ \end{array} \right], \quad k = 1, 2, 3, \dots $$ The first maps in the sequence are shown below on a common scale:

First few powers of the map
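The closed form for the powers is easy to confirm numerically (same numpy assumption):

```python
import numpy as np

A = np.array([[1, -1],
              [0,  1]])

# For this shear, A^k = [[1, -k], [0, 1]]
for k in range(1, 6):
    Ak = np.linalg.matrix_power(A, k)
    assert np.array_equal(Ak, np.array([[1, -k], [0, 1]]))
print("A^k = [[1, -k], [0, 1]] holds for k = 1..5")
```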


Linear maps $T:\>V\to W$ make sense between any two vector spaces $V$, $W$ over the same ground field $F$.

The notions of eigenvalues and eigenvectors only make sense for linear maps $T:\>V\to V$ mapping some vector space $V$ into itself. A vector $v\ne0$ that is "by coincidence" mapped onto a scalar multiple $\lambda v$ of itself is called an eigenvector of $T$. Such a vector may be shrunk or stretched by $T$, but it is not "rotated" or "sheared" in any way. The numbers $\lambda$ occurring in such circumstances are called eigenvalues of $T$. It is a remarkable fact that (in the finite-dimensional case) a given $T$ can only have a finite number of eigenvalues.
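To tie this back to the shear matrix used above: its only eigenvalue is $1$, and the only eigendirection is the $x$-axis, which is exactly the direction the shear leaves fixed (neither rotated nor stretched). A small numpy check (a sketch, assuming numpy):

```python
import numpy as np

# The shear matrix from the example above
A = np.array([[1.0, -1.0],
              [0.0,  1.0]])

lam, vecs = np.linalg.eig(A)
print(lam)                              # [1. 1.] -- eigenvalue 1, repeated

# The x-axis is an eigendirection: A v = 1 * v, so v is neither rotated nor stretched
v = np.array([1.0, 0.0])
print(np.allclose(A @ v, 1.0 * v))      # True
```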