Minimal polynomial of diagonalizable matrix

Prove that a matrix $A$ over $\mathbb{C}$ is diagonalizable if and only if every root of its minimal polynomial has multiplicity one (i.e. all roots are simple).


Solution 1:

Evaluating a polynomial$~P$ at a matrix commutes with change of basis: $C^{-1}P[A]C=P[C^{-1}AC]$ for any invertible matrix $C$. This means that the polynomials annihilating $A$ are the same as those annihilating $C^{-1}\!AC$, and in particular their minimal polynomials are the same. Now if $A$ is diagonalisable then there is some diagonal $D=C^{-1}\!AC$, and evaluating $P$ at $D$ gives a diagonal matrix whose diagonal entries are obtained by evaluating $P$ at the corresponding diagonal entries (eigenvalues) of$~D$; such a matrix is zero precisely when $P$ vanishes at every eigenvalue of$~D$. It follows that in this case the minimal polynomial of$~D$, and of$~A$, is the product of the distinct factors $(X-a_i)$, one for each distinct eigenvalue$~a_i$; in particular it has only simple roots.
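As a sanity check (not part of the proof), here is a small numerical sketch in NumPy; the matrices $D$ and $C$ below are hypothetical, chosen purely for illustration. Even though the eigenvalue $2$ is repeated, the product of the distinct factors $(A-2I)(A-3I)$ already vanishes, so the minimal polynomial of $A$ is $(X-2)(X-3)$, with simple roots only.

```python
import numpy as np

# Illustrative (hypothetical) diagonalizable matrix: eigenvalue 2 repeated.
D = np.diag([2.0, 2.0, 3.0])
C = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])          # any invertible change of basis
A = C @ D @ np.linalg.inv(C)             # diagonalizable, but no longer diagonal

I = np.eye(3)
P_of_A = (A - 2 * I) @ (A - 3 * I)       # evaluate P = (X-2)(X-3) at A
print(np.allclose(P_of_A, 0))            # True: the product of distinct factors annihilates A
```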

Conversely, if some product $P$ of distinct factors $(X-a_i)$ has $P[A]=0$ (in particular if the minimal polynomial of$~A$ has this form), then because these factors are pairwise relatively prime the kernel decomposition theorem says $$ V=\ker(0)=\ker(P[A])= \ker(A-a_1I)\oplus\ker(A-a_2I)\oplus\cdots\oplus\ker(A-a_kI), $$ and the nonzero summands on the right are eigenspaces of $A$. Concatenating a basis of each summand gives a basis of $V$ consisting of eigenvectors, so $A$ is diagonalisable.
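A similar numerical sketch of the converse (the matrix below is a hypothetical example, chosen for illustration): $A=I+J$, with $J$ the all-ones matrix, satisfies $(A-I)(A-4I)=0$, and the dimensions of $\ker(A-I)$ and $\ker(A-4I)$, computed via rank–nullity, add up to $3$, so the eigenspaces together fill the whole space.

```python
import numpy as np

# Illustrative (hypothetical) example: A = I + J with J the all-ones matrix.
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])           # eigenvalues 1, 1, 4

I = np.eye(3)
print(np.allclose((A - 1 * I) @ (A - 4 * I), 0))   # True: P[A] = 0 for P = (X-1)(X-4)

# dim ker(A - bI) = n - rank(A - bI) by rank-nullity.
dims = [3 - np.linalg.matrix_rank(A - b * I) for b in (1.0, 4.0)]
print(dims, sum(dims) == 3)               # [2, 1] True: the eigenspaces span V
```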

Solution 2:

Without resorting to Jordan normal form, here is the forward direction. Suppose

$$A\sim \left(\begin{matrix} a_{11} & 0 & \ldots & 0\\ 0 & a_{22} & \ldots & 0\\ \vdots & \vdots & \ddots & \vdots\\ 0 & 0 & \ldots & a_{nn} \end{matrix}\right).$$

Write the set of diagonal entries $\{a_{11},\ldots ,a_{nn}\}$ as $\{b_1,\ldots,b_d\}$, where $b_1,\ldots,b_d$ are the distinct values among them.

Now consider the polynomial $\displaystyle P=\prod_{k=1}^d(X-b_k)$.

Prove (if needed) that $P$ annihilates $A$: writing $A=CDC^{-1}$ with $D$ the diagonal matrix above, we get $P[A]=C\,P[D]\,C^{-1}$, and $P[D]$ is the diagonal matrix with entries $P(a_{ii})$, each of which is zero because every $a_{ii}$ equals some $b_k$.

$P$ has simple roots, and the minimal polynomial of $A$ divides $P$, since the minimal polynomial divides every polynomial annihilating $A$.

Therefore every root of the minimal polynomial of $A$ has multiplicity $1$, i.e. the roots are all simple.
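To make the key step concrete, here is a brief numerical sketch (the diagonal entries below are hypothetical, chosen only to include repeats): forming $P=\prod_{k=1}^d(X-b_k)$ over the distinct values $b_k$ and evaluating it at $D$ gives the diagonal matrix with entries $P(a_{ii})$, all of which vanish.

```python
import numpy as np
from functools import reduce

# Illustrative (hypothetical) diagonal entries a_11, ..., a_nn, with repeats.
diag_entries = [5.0, 5.0, 7.0, 2.0]
D = np.diag(diag_entries)
b = sorted(set(diag_entries))                           # distinct values b_1, ..., b_d

I = np.eye(len(diag_entries))
P_of_D = reduce(lambda M, bk: M @ (D - bk * I), b, I)   # P[D] = prod_k (D - b_k I)
print(np.allclose(P_of_D, 0))                           # True: P annihilates D, hence any A similar to D
```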