Prove that T is diagonalizable if and only if the minimal polynomial of T has no repeated roots.

Prove that $T$ is diagonalizable if and only if the minimal polynomial of $T$ has no repeated roots. EDIT: (over $\Bbb C$). It should be clear that I am working over $\Bbb C$, since one of my statements is not true over $\Bbb R$.

I would like a better proof of this result; what I did is below. There is the same question on here somewhere, but it only has an answer for one direction, and I am looking for both.

I proved the result by using a statement equivalent to diagonalizability; I am looking for any complete proof that is shorter.

We notice that, by a theorem, it is equivalent to prove that $V$ has a basis consisting of eigenvectors of $T$ if and only if the minimal polynomial of $T$ has no repeated roots.

$(\Rightarrow ) $ First suppose that there is a basis $\beta = (v_1,\cdots , v_n ) $ consisting of eigenvectors of $T$, and let $\lambda_1 , \cdots , \lambda_m $ be the distinct eigenvalues of $T$. Then for each $ v_i $ there exists a $\lambda_k $ with $(T- \lambda_k I) v_i =0 $, and it follows that $(T- \lambda_1 I) \cdots (T- \lambda_mI) v_i =0 $ for each $i$, since we can commute the operators. Since an operator that sends each vector in a basis to the $0$ vector is the $0$ operator, we have $(T- \lambda_1 I) \cdots (T- \lambda_mI) =0 $. Thus the polynomial $(z-\lambda_1) \cdots (z-\lambda_m ) $, when applied to $T$, gives $0$. But by a theorem the minimal polynomial of $T$ is a divisor of $(z-\lambda_1) \cdots (z-\lambda_m ) $, which has no repeated roots, so the minimal polynomial cannot possibly have repeated roots, and the result follows.
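(Not part of the proof, but here is a quick sanity check of this direction in sympy, on a matrix of my own choosing with eigenvalues $1,2,2$: the product of $T-\lambda I$ over the *distinct* eigenvalues already annihilates $T$, even though the characteristic polynomial has a repeated root.)

```python
# Illustrative sympy check (my own example): for a diagonalizable T, the product
# of (T - lambda I) over the DISTINCT eigenvalues annihilates T.
from sympy import Matrix, eye, zeros

# Diagonalizable with eigenvalues 1, 2, 2: the characteristic polynomial has a
# repeated root, but the minimal polynomial (z - 1)(z - 2) does not.
D = Matrix([[1, 0, 0],
            [0, 2, 0],
            [0, 0, 2]])
P = Matrix([[1, 1, 0],
            [0, 1, 1],
            [1, 0, 1]])        # any invertible change of basis
T = P * D * P.inv()

product = eye(3)
for lam in set(T.eigenvals().keys()):     # distinct eigenvalues {1, 2}
    product = product * (T - lam * eye(3))

print(product == zeros(3, 3))             # True: (T - I)(T - 2I) = 0
```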

$(\Leftarrow ) $ Now assume that the minimal polynomial has no repeated roots. If we let $ \lambda_1, \cdots, \lambda_m $ denote the distinct eigenvalues of $T$, this means the minimal polynomial of $T$ is $(z-\lambda_1) \cdots (z-\lambda_m ) $. It follows that $(T- \lambda_1 I) \cdots (T- \lambda_mI) =0 $. Let $U_m $ be the subspace of generalized eigenvectors corresponding to the eigenvalue $\lambda_m $. Since $ U_m $ is invariant under $T$ by a theorem, consider $ v\in U_m $ and let $u= (T- \lambda_m I) v $; it follows that $u\in U_m $. Hence $$ (T|_{U_m} - \lambda_1 I ) \cdots (T|_{U_m} - \lambda_{m-1}I) u = (T- \lambda_1 I) \cdots (T- \lambda_mI) v =0 $$

By a theorem, $( T- \lambda_m I )|_{U_m} $ is nilpotent, and by a previous question $0$ is the only eigenvalue of $( T- \lambda_m I )|_{U_m} $. Thus $T|_{U_m} - \lambda_jI $ is an invertible operator on $U_m $ for $j= 1, \cdots , m-1 $. It then follows from $$ (T|_{U_m} - \lambda_1 I ) \cdots (T|_{U_m} - \lambda_{m-1}I) u = (T- \lambda_1 I) \cdots (T- \lambda_mI) v =0 $$ that $u=0$; in other words, $v$ is an eigenvector of $T$!

We have shown that every generalized eigenvector of $T$ corresponding to the eigenvalue $ \lambda_m $ is an eigenvector of $T$. However, we chose $ \lambda_m $ arbitrarily; we could just as easily have relabeled the eigenvalues so that any of them was called $ \lambda_m $. Therefore every generalized eigenvector of $T$ is actually an eigenvector of $T$. By a theorem there is a basis for $V$ consisting of generalized eigenvectors of $T$, but by the above this is a basis of $V$ consisting of eigenvectors of $T$, which is the desired result.
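(Again just an illustration, not part of the argument: here is the contrast in sympy for a $2\times 2$ Jordan block of my own choosing, where the minimal polynomial $(z-2)^2$ does have a repeated root. There the generalized eigenspace is strictly larger than the eigenspace, and $T$ is indeed not diagonalizable.)

```python
# Illustrative contrast (my own example): a Jordan block has a generalized
# eigenvector that is not an eigenvector, and is not diagonalizable.
from sympy import Matrix, eye, zeros

T = Matrix([[2, 1],
            [0, 2]])
lam = 2

eigenspace     = (T - lam * eye(2)).nullspace()        # spanned by (1, 0)
gen_eigenspace = ((T - lam * eye(2))**2).nullspace()   # all of C^2

print(len(eigenspace), len(gen_eigenspace))   # 1 2: strictly bigger
print((T - lam * eye(2)) == zeros(2, 2))      # False: (z - 2) does not annihilate T
print(T.is_diagonalizable())                  # False
```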


Solution 1:

Clearly $T$ is diagonalizable if and only if we can decompose $V$ into a direct sum of eigenspaces $$V = \ker (T-\lambda_1I) \dot+ \ker(T - \lambda_2 I) \dot+ \cdots \dot+\ker(T - \lambda_k I)$$

since we can then take a basis of the form $$(\text{basis for }\ker(T-\lambda_1I), \text{basis for }\ker(T-\lambda_2I), \ldots, \text{basis for }\ker(T-\lambda_kI))$$ which yields a diagonal matrix representation of $T$.
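(For a concrete instance, here is a small sympy check on a $2\times 2$ matrix of my own choosing, using the built-in `diagonalize` rather than assembling the eigenspace bases by hand.)

```python
# Illustration (my own example): the columns of P are eigenvectors, and
# P^{-1} T P is diagonal with the eigenvalues on the diagonal.
from sympy import Matrix

T = Matrix([[4, 1],
            [2, 3]])          # eigenvalues 2 and 5

P, D = T.diagonalize()        # columns of P form a basis of eigenvectors
print(D)                      # diagonal matrix with 2 and 5 on the diagonal
print(P.inv() * T * P == D)   # True
```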

You have already handled the direction ($T$ is diagonalizable $\implies$ minimal polynomial has no repeated roots).

Conversely, assume that the minimal polynomial $\mu_T$ has no repeated roots. Note that the above sum is direct:

$$x \in \ker(T - \lambda_i I) \cap \ker(T - \lambda_j I) \implies \lambda_ix = Tx = \lambda_jx \implies i = j \text{ or } x = 0$$

(More precisely, when there are more than two eigenvalues one also uses that eigenvectors corresponding to distinct eigenvalues are linearly independent, which is what gives directness of the whole sum.)

It remains to prove that every $x$ can be written in the form $x = x_1 + \cdots + x_k$ with $x_i \in \ker(T - \lambda_iI)$.

Using the partial fraction decomposition we obtain:

$$\frac1{\mu_T(t)} = \frac1{(t-\lambda_1)\cdots(t-\lambda_k)} = \sum_{i=1}^k \frac{\eta_i}{t-\lambda_i}$$

for some scalars $\eta_i$.

Define $$Q_i(t) = \frac{\eta_i \mu_T(t)}{t - \lambda_i}$$ so that $\sum_{i=1}^k Q_i = 1$ and $(t-\lambda_i)Q_i(t) = \eta_i \mu_T(t)$.

Finally, notice that the desired decomposition is given by $$x = Q_1(T)x + Q_2(T)x + \cdots + Q_k(T)x$$

with $Q_i(T) x \in \ker (T - \lambda_i I)$ since

$$(T - \lambda_i I) Q_i(T)x = \eta_i \mu_T(T)x = 0$$
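(If it helps, here is the whole construction carried out in sympy on a small example of my own, with eigenvalues $2$ and $5$: the $Q_i(T)$ sum to the identity and each $Q_i(T)x$ lands in $\ker(T-\lambda_i I)$, exactly as above.)

```python
# Sketch of the construction above on a concrete matrix (my own example):
# eta_i = 1 / prod_{j != i} (lambda_i - lambda_j) are the partial fraction
# coefficients, and Q_i(T) = eta_i * prod_{j != i} (T - lambda_j I).
from sympy import Matrix, eye, zeros

T = Matrix([[4, 1],
            [2, 3]])                     # eigenvalues 2 and 5, minimal polynomial (t-2)(t-5)
lams = sorted(T.eigenvals().keys())      # [2, 5]
n = T.shape[0]

Q = []
for i, li in enumerate(lams):
    eta = 1                              # will become eta_i
    Qi = eye(n)                          # will become prod_{j != i} (T - lambda_j I)
    for j, lj in enumerate(lams):
        if j != i:
            eta = eta / (li - lj)
            Qi = Qi * (T - lj * eye(n))
    Q.append(eta * Qi)

total = zeros(n, n)
for Qi in Q:
    total = total + Qi
print(total == eye(n))                                   # True: sum_i Q_i(T) = I
print(all((T - li * eye(n)) * Qi == zeros(n, n)          # True: Q_i(T)x lies in ker(T - lambda_i I)
          for li, Qi in zip(lams, Q)))
```

In other words, the $Q_i$ are the Lagrange basis polynomials at the $\lambda_i$, and the operators $Q_i(T)$ are the projections onto the eigenspaces along the others.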

Solution 2:

How short this proof can be made depends entirely on your background. Here's a short one that I hope will be accessible to you.

Background facts:

I'm going to assume you are familiar with the notion of a direct sum. In particular, if $T$ acts on $V$ and $V=W\oplus Z$ with $TW\subseteq W$ and $TZ\subseteq Z$, then $T$ splits over the direct sum, and we have $T=T|_W\oplus T|_Z$. If $m_X$ denotes the minimal polynomial and $p_X$ the characteristic polynomial of $X$, then whenever $A=B\oplus C$ we have $p_A(t)=p_B(t)p_C(t)$ and $m_A(t)=\newcommand{\lcm}{\operatorname{lcm}}\lcm(m_B(t),m_C(t))$. We'll use these two facts.
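(A tiny sympy check of these two facts on a block-diagonal example of my own: with $B = 2I$ and $C$ a $2\times 2$ Jordan block at $2$, the characteristic polynomials multiply, and the minimal polynomial of $A = B \oplus C$ is $\lcm(t-2,(t-2)^2) = (t-2)^2$.)

```python
# Illustrative check (my own example) of p_A = p_B * p_C and m_A = lcm(m_B, m_C)
# for a block-diagonal A = B (+) C.
from sympy import Matrix, diag, eye, zeros, symbols, expand

t = symbols('t')
B = 2 * eye(2)                  # m_B(t) = t - 2
C = Matrix([[2, 1],
            [0, 2]])            # m_C(t) = (t - 2)^2
A = diag(B, C)                  # block-diagonal A = B (+) C

pA = A.charpoly(t).as_expr()
pB = B.charpoly(t).as_expr()
pC = C.charpoly(t).as_expr()
print(expand(pA - pB * pC) == 0)             # True: p_A = p_B * p_C

print((A - 2 * eye(4)) == zeros(4, 4))       # False: (t - 2) alone does not kill A
print((A - 2 * eye(4))**2 == zeros(4, 4))    # True:  (t - 2)^2 does, so m_A = (t - 2)^2
```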

Proof:

Now since you're familiar with the generalized eigenspaces, which I will denote $E_{\lambda_i}$, note that $V=\bigoplus_i E_{\lambda_i}$ and $TE_{\lambda_i}\subseteq E_{\lambda_i}$. Thus $m_T(t) = \lcm\{m_{T|_{E_{\lambda_i}}}(t):i\}$. However, $m_{T|_{E_{\lambda_i}}}(t)=(t-\lambda_i)^{n_i}$, where $n_i$ is the least integer such that $(T-\lambda_i)^{n_i}E_{\lambda_i}=0$, so these minimal polynomials are pairwise relatively prime. Hence, $m_T(t) = \prod_i (t-\lambda_i)^{n_i}$.

Thus the minimal polynomial of $T$ has no repeated roots if and only if every $n_i$ equals $1$. Since $n_i$ is by definition the least integer such that $(T-\lambda_i)^{n_i} E_{\lambda_i}=0$, we have $n_i=1$ for all $i$ if and only if $T|_{E_{\lambda_i}}=\lambda_i I$ for all $i$. And this is the case if and only if each generalized eigenspace equals the corresponding eigenspace, hence if and only if $T$ is diagonalizable.
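(Here is that criterion checked in sympy on two matrices of my own choosing, one with a nontrivial Jordan block and one without.)

```python
# Illustration (my own examples): A has n_2 = 2, so m_A(t) = (t-1)(t-2)^2 has a
# repeated root and A is not diagonalizable; D has every n_i = 1 and is.
from sympy import Matrix, diag, eye, zeros

A = diag(Matrix([[1]]), Matrix([[2, 1], [0, 2]]))   # 1x1 block at 1, Jordan block at 2
D = diag(Matrix([[1]]), 2 * eye(2))                 # same eigenvalues, no Jordan block

I3 = eye(3)
print((A - I3) * (A - 2*I3) == zeros(3, 3))          # False: squarefree product does not kill A
print((A - I3) * (A - 2*I3)**2 == zeros(3, 3))       # True:  m_A(t) = (t-1)(t-2)^2
print(A.is_diagonalizable(), D.is_diagonalizable())  # False True
print((D - I3) * (D - 2*I3) == zeros(3, 3))          # True:  m_D(t) = (t-1)(t-2)
```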

Alternative Methods:

Another approach would be to use Jordan canonical form, although I'm not sure whether or not you're familiar with it.
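(If you do know it, the point would be that the exponent of $(t-\lambda)$ in the minimal polynomial equals the size of the largest Jordan block at $\lambda$, so "no repeated roots" says every block is $1\times 1$, i.e. the Jordan form is diagonal. A quick sympy illustration on a matrix of my own choosing:)

```python
# Illustration (my own example): the 2x2 Jordan block at 2 in J is exactly what
# forces the repeated factor (t - 2)^2 in the minimal polynomial of A.
from sympy import Matrix

A = Matrix([[2, 1, 0],
            [0, 2, 0],
            [0, 0, 1]])
P, J = A.jordan_form()
print(J)   # contains a 2x2 Jordan block at 2 and a 1x1 block at 1, so m_A(t) = (t-1)(t-2)^2
```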