Confusion in proof of Theorem 8.33 in Axler's Linear Algebra Done Right

Suppose $V$ is a complex vector space and $T\in L(V)$ is invertible. Then $T$ has a square root. Let $\dim V=n$.

The proof goes along these lines: Let $\lambda_i$'s ($1\le i\le m$) be the distinct eigenvalues of $T$. Let $G(\lambda_i,T)$ denote the generalized eigenspace corresponding to $\lambda_i$. It can be shown that $G(\lambda_i,T)=\operatorname{null} (T-\lambda_iI)^n$ and that $N_i:=(T-\lambda_iI)|_{G(\lambda_i,T)}$ is a nilpotent operator on $G(\lambda_i,T)$.


The proof then shows that there exists an $R_i\in L(G(\lambda_i,T))$ such that $R_i^2=T|_{G(\lambda_i,T)}$. Since $V$ can be decomposed as a direct sum of generalized eigenspaces, we have $V=G(\lambda_1,T)\oplus G(\lambda_2,T)\oplus\cdots\oplus G(\lambda_m,T)$. So for every $v\in V$, there exist unique $u_i\in G(\lambda_i,T)$ such that $v=\sum_{i=1}^m u_i$.
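For context, here is a sketch of why such an $R_i$ exists (this is the standard binomial-series argument; $\sqrt{\lambda_i}$ below denotes any fixed complex square root of $\lambda_i$):

```latex
% Since T is invertible, lambda_i != 0, so we may factor it out:
\[
  T|_{G(\lambda_i,T)} \;=\; \lambda_i I + N_i
    \;=\; \lambda_i\Bigl(I + \tfrac{N_i}{\lambda_i}\Bigr),
\]
% with N_i nilpotent. Substituting the nilpotent operator N_i/lambda_i into
% the binomial series for sqrt(1+x) yields a FINITE sum, since high powers
% of N_i vanish:
\[
  R_i \;=\; \sqrt{\lambda_i}\,\Bigl(I
      + \tfrac{1}{2}\,\tfrac{N_i}{\lambda_i}
      - \tfrac{1}{8}\,\tfrac{N_i^2}{\lambda_i^2} + \cdots\Bigr),
  \qquad R_i^2 = T|_{G(\lambda_i,T)}.
\]
```

Squaring the finite sum and comparing coefficients of powers of $N_i$ recovers $\lambda_i(I+N_i/\lambda_i)$, which is exactly $T|_{G(\lambda_i,T)}$.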

Now define $Rv:=R_1u_1+\cdots+R_mu_m$. It should be verified that this $R$ is a square root of $T$. The proof ends here.
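Here is a numerical sanity check of this construction (a hypothetical example, not from the book): a $3\times 3$ matrix with a single eigenvalue $\lambda=4$, so there is one generalized eigenspace and $R=R_1$ is built from the truncated binomial series.

```python
import numpy as np

# Hypothetical example: T acts on C^3 with a single eigenvalue lambda = 4,
# so V = G(4, T) and T = 4*I + N with N nilpotent (N^3 = 0).
lam = 4.0
N = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])
T = lam * np.eye(3) + N

# Square root via the truncated binomial series for sqrt(1 + x):
# R = sqrt(lam) * (I + M/2 - M^2/8) with M = N/lam; the series stops
# because M^3 = 0.
M = N / lam
R = np.sqrt(lam) * (np.eye(3) + M / 2 - (M @ M) / 8)

print(np.allclose(R @ R, T))  # True
```

The check $R^2=T$ holds exactly here because squaring $I + M/2 - M^2/8$ gives $I + M$ once the $M^2$ terms cancel and $M^3=0$ kills the rest.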


I'm trying to verify that $R$ is indeed a square root of $T$. I tried as follows:

Since $V$ is a direct sum of generalized eigenspaces, there exists a basis of $V$ consisting of generalized eigenvectors of $T$. Let one such basis be the $e_i$'s, $1\le i \le n$. Each $e_j$ belongs to some generalized eigenspace; abusing notation slightly, write $G(\lambda_j,T)$ for the one containing $e_j$. So $Re_j=R_je_j$ (by definition of $R$).

It's known that $R_j^2=T|_{G(\lambda_j,T)}$, so $\color{blue}{R_jRe_j}=R_j^2e_j=Te_j$.

My concern is the blue colored part: I think that the blue colored part makes sense iff $Re_j\in G(\lambda_j,T)$, but that doesn't have to be true in general (that is, if a subspace $U$ is $T$-invariant, there is no reason to believe that $U$ is $\sqrt T$-invariant too). Assuming that $Re_j$ is indeed in $G(\lambda_j,T)$, it follows that $R^2e_j=Te_j$, and if this holds for every $1\le j\le n$, then $R$ is indeed a square root of $T$ and we are done.


So my question/confusion is the following:

In this case, is it true that $Re_j\in G(\lambda_j,T)$? I think yes, else the equation involving the blue colored part won't make sense.

What am I missing? Please help. Thanks.


Solution 1:

I understand with help from @runway44 that the following holds for every $1\le j\le n$:

$R^2e_j=RRe_j=R(Re_j)=R(R_je_j)\overbrace{=}^{R_je_j\in G(\lambda_j,T)}R_j(R_je_j)=R_j^2e_j=T|_{G(\lambda_j,T)}e_j=Te_j$

(The middle equality holds because $R_j\in L(G(\lambda_j,T))$, so $R_je_j$ is again a vector in $G(\lambda_j,T)$, and $R$ agrees with $R_j$ on that subspace.)

Hence it follows that: $R^2=T$.
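The same computation, written for an arbitrary $v=\sum_{i=1}^m u_i$ rather than for basis vectors, makes the role of the invariance explicit:

```latex
% Rv decomposes with i-th component R_i u_i, which stays in G(lambda_i, T)
% because R_i is an operator on that subspace:
\[
  Rv = \sum_{i=1}^m R_i u_i, \qquad
  R_i u_i \in G(\lambda_i,T)
  \ \text{since } R_i \in L\bigl(G(\lambda_i,T)\bigr).
\]
% Applying R again therefore acts componentwise by R_i:
\[
  R^2 v = R\Bigl(\sum_{i=1}^m R_i u_i\Bigr)
        = \sum_{i=1}^m R_i (R_i u_i)
        = \sum_{i=1}^m T u_i
        = Tv.
\]
```

So the key fact is exactly the one questioned above: each $G(\lambda_i,T)$ is $R$-invariant by construction, since $R$ restricted to it is $R_i$.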