Existence of T-invariant complement of T-invariant subspace when T is diagonalisable

Let $V$ be a complex linear space of dimension $n$ and let $T \in End(V)$ be diagonalisable. Prove that each $T$-invariant subspace $W$ of $V$ has a complementary $T$-invariant subspace $W'$ such that $V= W \oplus W'$.

Note: Let $\{e_1,\dots,e_n\}$ be a basis of eigenvectors of $T$, and let $V_{\lambda_1},\dots,V_{\lambda_s}$ be the eigenspaces for the distinct eigenvalues. It suffices to show that every $T$-invariant subspace $W$ is the direct sum of its intersections with the eigenspaces, $W=(W\cap V_{\lambda_1})\oplus\cdots\oplus(W\cap V_{\lambda_s})$; then it is easy to find $W'$ (inside each eigenspace take a complement of $W\cap V_{\lambda_i}$ and glue these complements together). But how does one prove this decomposition of $W$?


Since $T$ is diagonalisable, the minimal polynomial is of the form \begin{equation} p=(x-c_1)(x-c_2)\cdots (x-c_k), \end{equation} where $c_1,c_2,\ldots,c_k$ are the distinct eigenvalues of $T$.
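As a quick numerical illustration (a minimal sketch; the matrix $P$ and the eigenvalues below are invented for the example, not taken from the problem), one can check that this squarefree product of linear factors already annihilates a diagonalisable matrix:

```python
import numpy as np

# Made-up diagonalisable matrix: eigenvalues 1, 1, 2, so c_1 = 1, c_2 = 2.
P = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])                      # columns form an eigenbasis
T = P @ np.diag([1., 1., 2.]) @ np.linalg.inv(P)

I = np.eye(3)
p_of_T = (T - 1.0 * I) @ (T - 2.0 * I)            # p(T) = (T - c_1 I)(T - c_2 I)
print(np.allclose(p_of_T, 0))                     # True: p(T) = 0 despite the repeated eigenvalue
```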

By the primary decomposition theorem, \begin{equation} V=W_1 \oplus W_2 \oplus \cdots \oplus W_k, \end{equation} where $W_i$ is the eigenspace corresponding to $c_i$, $1\leq i \leq k$.

From Hoffman & Kunze, Page 226, Exercise 10, one should be able to see that \begin{equation} W=(W\cap W_1) \oplus (W\cap W_2) \oplus \cdots \oplus (W\cap W_k). \end{equation} Clearly, $W\cap W_i$ is $T$-invariant, $1\leq i \leq k$.
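Computationally, the intersections $W\cap W_i$ can be found from a nullspace; here is a small sketch of that technique (the helper names and the basis matrices below are my own invention, chosen only to illustrate the idea):

```python
import numpy as np

def null_space(M, tol=1e-10):
    """Orthonormal basis (columns) of the nullspace of M, via the SVD."""
    _, s, vt = np.linalg.svd(M)
    rank = int(np.sum(s > tol))
    return vt[rank:].T

def colspace(M, tol=1e-10):
    """Orthonormal basis (columns) of the column space of M."""
    if M.shape[1] == 0:
        return M
    u, s, _ = np.linalg.svd(M, full_matrices=False)
    return u[:, s > tol]

def intersect(B1, B2):
    """Basis of col(B1) ∩ col(B2): a common vector satisfies B1 @ x = B2 @ y,
    i.e. [B1 | -B2] @ [x; y] = 0, so read x off the nullspace."""
    N = null_space(np.hstack([B1, -B2]))
    return colspace(B1 @ N[:B1.shape[1], :])

# Example: W = span{(1,1,0), (0,0,1)} and an eigenspace W_1 = span{e_1, e_2}.
B_W = np.array([[1., 0.], [1., 0.], [0., 1.]])
B_1 = np.array([[1., 0.], [0., 1.], [0., 0.]])
print(intersect(B_W, B_1))   # one column, proportional to (1, 1, 0)^T
```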

Let $\{\alpha_1,\alpha_2,\ldots,\alpha_{r_i} \}$ be an ordered basis for $W\cap W_i$. Since $W\cap W_i$ is a subspace of the eigenspace $W_i$, $\{\alpha_1,\alpha_2,\ldots,\alpha_{r_i} \}$ can be extended to $\{\alpha_1,\alpha_2,\ldots,\alpha_{r_i},\alpha_{r_i+1},\ldots,\alpha_{s_i} \}$, a basis for $W_i$. Let $V_i$ be the subspace spanned by $\{\alpha_{r_i+1},\ldots,\alpha_{s_i} \}$. Then $W_i=(W\cap W_i)\oplus V_i$.
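The basis-extension step can likewise be done mechanically, by appending the vectors of a basis of $W_i$ one at a time and keeping those that raise the rank. A sketch (the function `extend_basis` and the sample matrices are made up for illustration):

```python
import numpy as np

def rank(M, tol=1e-10):
    return 0 if M.shape[1] == 0 else np.linalg.matrix_rank(M, tol=tol)

def extend_basis(inner, ambient):
    """Extend the columns of `inner` (a basis of W ∩ W_i) by columns of
    `ambient` (a basis of W_i); return the added columns, which span a
    complement V_i with W_i = (W ∩ W_i) ⊕ V_i."""
    current, added = inner, []
    for j in range(ambient.shape[1]):
        cand = ambient[:, [j]]
        if rank(np.hstack([current, cand])) > rank(current):
            current = np.hstack([current, cand])
            added.append(cand)
    return np.hstack(added) if added else np.zeros((ambient.shape[0], 0))

# W ∩ W_1 = span{(1,1,0)} inside W_1 = span{e_1, e_2}; the complement V_1 is 1-dimensional.
inner   = np.array([[1.], [1.], [0.]])
ambient = np.array([[1., 0.], [0., 1.], [0., 0.]])
print(extend_basis(inner, ambient))   # the single column (1, 0, 0)^T
```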

Hence \begin{equation} V=(W\cap W_1)\oplus V_1 \oplus (W\cap W_2)\oplus V_2 \oplus \cdots \oplus (W\cap W_k)\oplus V_k. \end{equation} Each $V_i$ is $T$-invariant, since $V_i$ lies inside the eigenspace $W_i$, on which $T$ acts as multiplication by $c_i$. Therefore $W'=V_1\oplus V_2 \oplus \cdots \oplus V_k$ is a $T$-invariant complementary subspace of $W$ in $V$.
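For one concrete (invented) example, take $T=\operatorname{diag}(1,1,2)$ and $W=\operatorname{span}\{(1,1,0),(0,0,1)\}$; the recipe above gives $W'=\operatorname{span}\{(1,0,0)\}$, and the conclusion can be verified directly:

```python
import numpy as np

T = np.diag([1., 1., 2.])                           # eigenspaces: span{e1, e2} and span{e3}

B_W  = np.array([[1., 0.], [1., 0.], [0., 1.]])     # W  = span{(1,1,0), (0,0,1)}, T-invariant
B_Wp = np.array([[1.], [0.], [0.]])                 # W' = V_1 = span{(1,0,0)}

# V = W ⊕ W': the combined columns form a basis of the whole space.
print(np.linalg.matrix_rank(np.hstack([B_W, B_Wp])) == 3)    # True

# W' is T-invariant: the image of its basis vector stays inside col(B_Wp).
img = T @ B_Wp
coeff, *_ = np.linalg.lstsq(B_Wp, img, rcond=None)
print(np.allclose(B_Wp @ coeff, img))                        # True
```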


Based on the hint $W=(W \cap V_{\lambda_1}) \oplus\cdots\oplus(W \cap V_{\lambda_s})$, where $\{\lambda_1,\dots,\lambda_s\}$ is the set of distinct eigenvalues, one way to show it is as follows:

We can prove the following theorem: if $v_1 + v_2 + \cdots + v_k \in W$, where the $v_i$ are eigenvectors of $A$ with distinct eigenvalues, then each $v_i$ lies in $W$.

Proof: Proceed by induction on $k$. If $k = 1$ there is nothing to prove. Otherwise, let $w = v_1 + \cdots + v_k$, and let $\lambda_i$ be the eigenvalue corresponding to $v_i$. Since $W$ is $A$-invariant, $Aw \in W$, and therefore:

$$Aw - \lambda_1w = (\lambda_2 - \lambda_1)v_2 + \cdots + (\lambda_k - \lambda_1)v_k \in W$$

By the induction hypothesis, $(\lambda_i - \lambda_1)v_i \in W$ for $2 \leq i \leq k$, and since the eigenvalues are distinct, $\lambda_i - \lambda_1 \neq 0$, so $v_i \in W$ for $2 \leq i \leq k$. Subtracting these from $w$ gives $v_1 \in W$ as well. $\square$
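The computation in the induction step is easy to see numerically: hitting $w$ with $A-\lambda_1 I$ kills the $v_1$-component and rescales the others by nonzero factors. A toy check (the diagonal matrix $A$ and the eigenvectors here are invented):

```python
import numpy as np

A   = np.diag([1., 2., 3.])          # made-up diagonalisable matrix
lam = np.array([1., 2., 3.])         # its distinct eigenvalues
v1, v2, v3 = np.eye(3)               # eigenvectors for lam[0], lam[1], lam[2]

w = v1 + v2 + v3                     # a sum of eigenvectors with distinct eigenvalues
u = A @ w - lam[0] * w               # stays in W whenever w does, since W is A-invariant

# The v1-component is gone; the others are scaled by the nonzero factors (lam_i - lam_1).
print(np.allclose(u, (lam[1] - lam[0]) * v2 + (lam[2] - lam[0]) * v3))   # True
```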

Now, since $A$ is diagonalisable, each $w \in W$ can be written as a finite sum of nonzero eigenvectors of $A$ with distinct eigenvalues, and by the theorem these eigenvectors lie in $W$. Then we have $W = \bigoplus_{\lambda}(W \cap V_{\lambda})$ as desired, where $V_{\lambda} = \{v \in V\mid Av = \lambda v\}$ and the sum runs over the eigenvalues of $A$.


I will suppose there are $k$ distinct eigenvalues $\lambda_1,\ldots,\lambda_k$ (where $k$ may be less than the dimension $n$).

Since $T$ is diagonalisable, its minimal polynomial $\mu_T$ is split with simple roots; indeed $\mu_T=(X-\lambda_1)\cdots(X-\lambda_k)$. Since the restriction $T|_W$ of $T$ to $W$ certainly satisfies $\mu_T(T|_W)=0$, this restriction is also diagonalisable, with its eigenvalues among $\{\lambda_1,\ldots,\lambda_k\}$, and each eigenspace of $T|_W$ for some $\lambda_i$ is a subspace of the eigenspace of $T$ for $\lambda_i$. It now suffices to choose, in each eigenspace of $T$, a complementary subspace to the eigenspace of $T|_W$, or the whole eigenspace (a complement of $\{0\}$) in case $\lambda_i$ does not occur as an eigenvalue of $T|_W$. Now take $W'$ to be the (direct) sum of those complementary subspaces; since each of them lies inside a single eigenspace of $T$, the subspace $W'$ is $T$-invariant, and by construction $V=W\oplus W'$.
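The restriction argument can also be seen in coordinates: pick a basis of $W$, solve for the matrix of $T|_W$ in that basis, and check that its eigenvalues are among those of $T$. A small sketch (the matrices $P$, $T$ and the choice of $W$ below are invented for illustration):

```python
import numpy as np

# Made-up diagonalisable T: the columns of P are eigenvectors for eigenvalues 1, 1, 2.
P = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])
T = P @ np.diag([1., 1., 2.]) @ np.linalg.inv(P)

# A T-invariant subspace W spanned by the first and third eigenvectors.
B = P[:, [0, 2]]

# Matrix M of the restriction T|_W in the basis B: solve B @ M = T @ B.
M, *_ = np.linalg.lstsq(B, T @ B, rcond=None)
print(np.round(M, 6))            # [[1. 0.] [0. 2.]]: T|_W is diagonalisable
print(np.linalg.eigvals(M))      # its eigenvalues lie among {1, 2}
```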


Let $\{e_1,\dots,e_n\}$ be a basis of eigenvectors of $T$. Without loss of generality assume $W\ne\{0\}$ and $W\ne V$ (those cases are trivial). Each $e_i$ either lies in $W$ or it does not; reorder the basis so that $e_1,\dots,e_k\in W$ and $e_{k+1},\dots,e_n\notin W$ for some $k\ge1$, and define $W'=\operatorname{span}\{e_{k+1},\dots,e_n\}$. If $w\in W$, write $w=a_1e_1+\ldots+a_ne_n$; then $a_{k+1}e_{k+1}+\ldots+a_ne_n=w-a_1e_1-\ldots-a_ke_k\in W$, which forces $a_{k+1}=\ldots=a_n=0$, so $W=\operatorname{span}\{e_1,\dots,e_k\}$. This proves that $V= W \oplus W'$, since $\operatorname{span}\{e_1,\dots,e_k\} \cap \operatorname{span}\{e_{k+1},\dots,e_n\}=\{0\}$. Both subspaces are invariant under $T$, because if $e_i\in W$ (respectively $e_i\in W'$) then $T(e_i)=\lambda_ie_i\in W$ (respectively $\lambda_ie_i\in W'$).