If $A$ is a complex matrix of size $n$ of finite order, is $A$ diagonalizable?

Let $A$ be a complex matrix of size $n$. If $A^k=I_n$ for some positive integer $k$, is $A$ diagonalizable?


Solution 1:

Note that the minimal polynomial satisfies $m_A\mid X^k-1$ (since $A^k=I_n$); and $X^k-1$ splits completely over $\Bbb C$ with no repeated roots (its roots are the $k$ distinct $k$th roots of unity), hence so does $m_A$, i.e. $A$ is diagonalizable.
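Not part of the proof, but the claim is easy to sanity-check numerically. Here is a minimal sketch using numpy; the sample matrix (a $4\times4$ cyclic permutation matrix with $A^4=I$) is my own choice for illustration, not something from the argument above.

```python
import numpy as np

k, n = 4, 4
# Sample matrix of finite order (illustrative choice): the cyclic
# permutation matrix, which satisfies A^k = I for k = 4.
A = np.roll(np.eye(n), 1, axis=0)
assert np.allclose(np.linalg.matrix_power(A, k), np.eye(n))

# Diagonalize: the eigenvector matrix P is invertible and
# P @ diag(eigvals) @ P^{-1} reconstructs A.
eigvals, P = np.linalg.eig(A)
assert np.allclose(P @ np.diag(eigvals) @ np.linalg.inv(P), A)

# Consistent with m_A | X^k - 1: every eigenvalue is a k-th root of unity.
assert np.allclose(eigvals**k, 1)
print("A^k = I and A = P D P^{-1}: diagonalizable, as claimed")
```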

Solution 2:

Yes, it is. One possible proof is as follows. From $A^k = I_n$ it follows that $A$ is invertible, so $0$ is not an eigenvalue of $A$. Now transform $A$ to Jordan normal form, and let $B$ be a Jordan block of $A$ of size $m$. From $A^k = I_n$ we get $B^k = I_m$. The powers of a Jordan block can be described explicitly, and from such a description one sees that no positive power of a Jordan block of size $m > 1$ with a nonzero eigenvalue can equal $I_m$ (in fact, it's enough to convince yourself that the entry in the $j$th row, $(j+1)$th column of the $n$th power of a Jordan block with eigenvalue $\lambda$ is $n\lambda^{n-1}$). Since $B^k = I_m$, we must have $m = 1$, so the Jordan normal form of $A$ is diagonal.
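For concreteness, the smallest case of that computation reads $$ \begin{pmatrix}\lambda & 1\\ 0 & \lambda\end{pmatrix}^n = \begin{pmatrix}\lambda^n & n\lambda^{n-1}\\ 0 & \lambda^n\end{pmatrix}, $$ and since $\lambda\neq0$, the superdiagonal entry $n\lambda^{n-1}$ never vanishes, so no positive power of a $2\times2$ Jordan block equals $I_2$; the same superdiagonal entries obstruct every larger block.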

Solution 3:

Nothing wrong with the other answers; I simply want to record the following simple argument specific to this particular case. It can also be seen as an application of the representation theory of cyclic groups, as well as of discrete Fourier analysis.

Let $\omega=e^{2\pi i/k}$ be a primitive $k$th root of unity in $\Bbb{C}$. Diagonalizability of a matrix $A$ is equivalent to every vector being a sum of eigenvectors of $A$. So let $x\in\Bbb{C}^n$ be arbitrary.

For all indices $\ell=0,1,\ldots,k-1$, consider the vector $$ x_\ell=\frac1k(x+\omega^\ell Ax+\omega^{2\ell}A^2x+\cdots+\omega^{(k-1)\ell} A^{k-1}x).\tag{1} $$ As $A^k=I$, applying $A$ cyclically permutes the vectors $x, Ax, A^2x,\ldots,A^{k-1}x$, so we easily see that $$ Ax_\ell=\omega^{-\ell} x_\ell $$ for all $\ell$. In other words, whenever $x_\ell\neq0$ it is an eigenvector of $A$ belonging to the eigenvalue $\omega^{-\ell}$.
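Written out, the computation behind this is $$ Ax_\ell=\frac1k\sum_{j=0}^{k-1}\omega^{j\ell}A^{j+1}x =\omega^{-\ell}\cdot\frac1k\sum_{j=0}^{k-1}\omega^{(j+1)\ell}A^{j+1}x =\omega^{-\ell}x_\ell, $$ where the last equality uses $\omega^{k\ell}=1$ and $A^kx=x$ to identify the $j=k-1$ term with the missing $j=0$ term of $(1)$.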

Furthermore, if $0<\ell<k$ we have $$ 1+\omega^\ell+\omega^{2\ell}+\cdots+\omega^{(k-1)\ell}=0.\tag{2} $$ This can be seen with the aid of the formula for a geometric sum, or by using the fact that $\omega^\ell\neq1$ is a zero of $(z^k-1)/(z-1)=1+z+z^2+\cdots+z^{k-1}$. Summing the vectors $x_\ell$ of $(1)$ and using equation $(2)$ thus yields $$ \begin{aligned} x_0+x_1+x_2+\cdots+x_{k-1}&= \frac1k \sum_{j=0}^{k-1}\left(\sum_{\ell=0}^{k-1}\omega^{\ell j}\right)A^jx\\ &=\frac kk x+\frac1k\sum_{j=1}^{k-1}\left(\sum_{\ell=0}^{k-1}\omega^{\ell j}\right)A^jx\\ &=x, \end{aligned} $$ since by $(2)$ each inner sum with $j\ge1$ vanishes. This shows that $x$ is a sum of eigenvectors of $A$, and proves the main claim.
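The whole decomposition is also easy to test numerically. Below is a small sketch of my own (assuming numpy); the test matrix is built as $PDP^{-1}$ with $D$ a diagonal matrix of $k$th roots of unity, so that $A^k=I$ holds by construction.

```python
import numpy as np

rng = np.random.default_rng(0)
k, n = 5, 6
omega = np.exp(2j * np.pi / k)

# Build a matrix with A^k = I: conjugate a diagonal matrix of
# k-th roots of unity by a random invertible matrix.
D = np.diag(omega ** rng.integers(0, k, size=n))
P = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = P @ D @ np.linalg.inv(P)
assert np.allclose(np.linalg.matrix_power(A, k), np.eye(n))

# The averaged vectors x_l of formula (1).
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
xs = [sum(omega ** (j * l) * np.linalg.matrix_power(A, j) @ x
          for j in range(k)) / k
      for l in range(k)]

# Each x_l satisfies A x_l = omega^{-l} x_l, and they sum back to x.
for l, xl in enumerate(xs):
    assert np.allclose(A @ xl, omega ** (-l) * xl)
assert np.allclose(sum(xs), x)
print("x = x_0 + ... + x_{k-1} with each x_l an eigenvector (or zero)")
```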


This is an extension of the common trick for getting eigenvectors of an order-two operator. The formula $$ f(x)=\frac12(f(x)+f(-x))+\frac12(f(x)-f(-x)) $$ writing an arbitrary function as a sum of an even function and an odd function is, perhaps, the earliest encounter with this. In that case the operator is $T:f(x)\mapsto f(-x)$; we have $T^2=\mathrm{id}$, and $-1$ is the relevant second root of unity. The even (respectively odd) functions are the eigenvectors of $T$ belonging to $\lambda=+1$ (respectively $\lambda=-1$).
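Indeed, the even/odd decomposition is exactly formula $(1)$ with $k=2$ and $\omega=-1$: $$ x_0=\frac12(x+Ax),\qquad x_1=\frac12(x-Ax), $$ with $Ax_0=x_0$, $Ax_1=-x_1$ and $x_0+x_1=x$.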