Are all eigenvectors of any matrix always orthogonal?

Solution 1:

In general, for any matrix, the eigenvectors are NOT always orthogonal. But for a special class of matrices, real symmetric matrices, the eigenvalues are always real, eigenvectors corresponding to distinct eigenvalues are always orthogonal, and a full orthonormal basis of eigenvectors can always be chosen (this is the spectral theorem).
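
As a quick numerical illustration (a minimal NumPy sketch; the matrices `A` and `S` below are small examples of my own choosing, not anything canonical):

```python
import numpy as np

# A non-symmetric matrix: its eigenvectors need not be orthogonal.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
_, V = np.linalg.eig(A)
print(V[:, 0] @ V[:, 1])   # nonzero: the two eigenvectors are not orthogonal

# A symmetric matrix: eigh returns an orthonormal set of eigenvectors.
S = np.array([[2.0, 1.0],
              [1.0, 3.0]])
_, Q = np.linalg.eigh(S)
print(Q[:, 0] @ Q[:, 1])   # ~0: the eigenvectors are orthogonal
```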

For any matrix M with n rows and m columns, multiplying M by its transpose, either M*M' (which is n*n) or M'*M (which is m*m), results in a symmetric matrix, so for that symmetric matrix the eigenvectors can always be chosen orthogonal.
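
A short sketch checking this claim numerically (the random 5*3 matrix here is just an arbitrary choice for demonstration):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 3))          # arbitrary matrix with n=5 rows, m=3 columns

G = M.T @ M                              # M'*M is m*m and symmetric by construction
print(np.allclose(G, G.T))               # True

_, V = np.linalg.eigh(G)                 # eigh exploits the symmetry
print(np.allclose(V.T @ V, np.eye(3)))   # True: the eigenvectors are orthonormal
```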

In the application of PCA, a dataset of n samples with m features is usually represented as an n*m matrix D. Assuming each feature has been mean-centered, the variances and covariances among those m features are captured (up to a factor of 1/(n-1)) by the m*m matrix D'*D, which is symmetric: the entries on the diagonal are proportional to the variance of each single feature, and the entry in row i, column j is proportional to the covariance between features i and j. PCA diagonalizes this symmetric matrix, so the eigenvectors (the principal components) are guaranteed to be orthogonal.
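
Here is a minimal PCA sketch along those lines (synthetic data, and the 1/(n-1) normalization is one common convention; it does not affect the eigenvectors):

```python
import numpy as np

rng = np.random.default_rng(0)
D = rng.standard_normal((100, 4))        # n=100 samples, m=4 features
D = D - D.mean(axis=0)                   # center each feature

C = D.T @ D / (D.shape[0] - 1)           # m*m sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)     # symmetric, so use eigh

# Principal components = eigenvectors sorted by decreasing eigenvalue;
# they form an orthonormal basis of the feature space.
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order]
print(np.allclose(components.T @ components, np.eye(4)))  # True
```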

Solution 2:

Fix two linearly independent vectors $u$ and $v$ in $\mathbb{R}^2$, define $Tu=u$ and $Tv=2v$, and extend $T$ linearly to a map from $\mathbb{R}^2$ to itself. The eigenvectors of $T$ are $u$ and $v$ (or any nonzero multiples of them). Of course, $u$ need not be perpendicular to $v$.
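
For concreteness, one instance of this construction (the particular vectors are my own choice): take
$$u = \begin{pmatrix}1\\0\end{pmatrix}, \qquad v = \begin{pmatrix}1\\1\end{pmatrix}, \qquad T = \begin{pmatrix}1 & 1\\ 0 & 2\end{pmatrix}.$$
Then $Tu = u$ and $Tv = 2v$, so $u$ and $v$ are eigenvectors with eigenvalues $1$ and $2$, yet $u \cdot v = 1 \neq 0$. Note that $T$ is not symmetric, consistent with Solution 1.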