What's so useful about diagonalizing a matrix?

Solution 1:

I think that, in most cases, the purpose is to provide a characterization of the matrix you are interested in. A "simple" form such as a diagonal one lets you instantly determine rank, eigenvalues, invertibility, whether it is a projection, etc. That is, all properties that are invariant under a similarity transform become much easier to assess.
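As a minimal numpy sketch of that point (the matrix `A` below is just an arbitrary illustrative example), once you have the eigendecomposition you can read these properties straight off the diagonal:

```python
import numpy as np

# Arbitrary symmetric example matrix (chosen only for illustration)
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 0.0]])

# Diagonalize: A = P @ diag(eigvals) @ P^{-1} (P is orthogonal since A is symmetric)
eigvals, P = np.linalg.eigh(A)

# Properties read directly from the diagonal entries:
rank = np.count_nonzero(~np.isclose(eigvals, 0.0))      # number of nonzero eigenvalues
invertible = not np.any(np.isclose(eigvals, 0.0))        # invertible iff no zero eigenvalue
is_projection = np.all(np.isclose(eigvals, 0.0) |        # orthogonal projection iff every
                       np.isclose(eigvals, 1.0))         # eigenvalue is 0 or 1

print("eigenvalues:", eigvals)
print("rank:", rank, "invertible:", invertible, "projection:", is_projection)
```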

A practical example: principal component analysis is an orthogonal diagonalization (of the covariance matrix) which gives you important information about the independent components in a system (the eigenvectors) and how important each component is (the eigenvalues) - so it lets you characterize the system in a way that is not possible in the original data. http://en.wikipedia.org/wiki/Principal_component_analysis
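A rough sketch of that idea (the data matrix `X` here is just random illustrative data, and the details are simplified): PCA amounts to diagonalizing the sample covariance matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))            # illustrative data: 200 samples, 3 features
X = X - X.mean(axis=0)                   # center the data

cov = (X.T @ X) / (len(X) - 1)           # sample covariance matrix (symmetric)
eigvals, eigvecs = np.linalg.eigh(cov)   # orthogonal diagonalization

# Sort components by explained variance (largest eigenvalue first)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

print("principal directions (columns):\n", eigvecs)
print("fraction of variance per component:", eigvals / eigvals.sum())
```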


I can't think of a case where diagonalization is used purely as a means to "simplify" a calculation, since computing the decomposition is itself expensive - it is more of an end goal in itself.


Solution 2:

I'll add that while you mention computing integer powers of matrices, diagonalization also helps in computing fractional powers and the matrix exponential. If you wanted to compute $\exp(\mathbf{A})$ you could either go the slow route and use the Taylor series, giving: \begin{align}\mathbf{I}+\mathbf{A}+\frac{\mathbf{A}^2}{2!}+\frac{\mathbf{A}^3}{3!}+\dots+\frac{\mathbf{A}^n}{n!}+\dots\end{align}

or alternatively diagonalize $\mathbf{A} = \mathbf{P}\mathbf{D}\mathbf{P}^{-1}$, in which case $\exp(\mathbf{A}) = \mathbf{P}\exp(\mathbf{D})\mathbf{P}^{-1}$ and it's as easy as applying $e^\lambda$ to each eigenvalue $\lambda$ on the diagonal of $\mathbf{D}$. This significantly reduces the cost of matrix exponentiation for a given required precision.
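Here is a small sketch of both routes, assuming a symmetric (hence orthogonally diagonalizable) example matrix and an arbitrary truncation order for the Taylor series:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])               # illustrative symmetric matrix

# Route 1: truncated Taylor series I + A + A^2/2! + ... + A^n/n!
def expm_taylor(A, n_terms=30):
    result = np.eye(len(A))
    term = np.eye(len(A))
    for k in range(1, n_terms):
        term = term @ A / k               # running term A^k / k!
        result = result + term
    return result

# Route 2: diagonalize A = P D P^{-1}, then exp(A) = P exp(D) P^{-1}
def expm_diag(A):
    eigvals, P = np.linalg.eigh(A)        # eigh since A is symmetric
    return P @ np.diag(np.exp(eigvals)) @ P.T   # P orthogonal, so P^{-1} = P.T

print(expm_taylor(A))
print(expm_diag(A))                       # the two results should agree closely
```

The same trick handles the fractional powers mentioned above: apply the scalar function (e.g. a square root) to each eigenvalue instead of $e^\lambda$, where that function is defined on the spectrum.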