Demonstration: If all nonzero vectors of $V$ are eigenvectors of $T$, then there is one $\lambda$ such that $T(v) = \lambda v$ for all $v \in V$.

Let $T: V \rightarrow V$ be a linear operator on a vector space $V$ over a field $K$.

I need to demonstrate that if all nonzero vectors of $V$ are eigenvectors of $T$, then there is one specific $\lambda \in K$ such that $T(v) = \lambda v$, for all $v \in V$.

I understand that, if all nonzero vectors of $V$ are eigenvectors of $T$, then $T$ must be a scaling transformation: it only stretches or shrinks vectors, without changing their directions.

So the statement says that if this happens, then there is a single $\lambda$ such that $T(v) = \lambda v$. In other words, if such a transformation exists, it scales all vectors by the same scalar $\lambda$.
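For instance (a made-up $2$-dimensional illustration, with the factors $2$ and $3$ chosen only for the example): if $T$ scaled two basis vectors by different factors, their sum would fail to be an eigenvector, contradicting the hypothesis: $$ T(e_1) = 2e_1, \quad T(e_2) = 3e_2 \;\Longrightarrow\; T(e_1 + e_2) = 2e_1 + 3e_2, $$ and $2e_1 + 3e_2$ is not a scalar multiple of $e_1 + e_2$.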

Applying the transformation to our standard basis vectors, we have: $$ T(e_1) = \lambda_1 e_1 \\ T(e_2) = \lambda_2 e_2 \\ \vdots \\ T(e_n) = \lambda_n e_n $$

I understand I need to prove that $\lambda_1 = \lambda_2 = \dots = \lambda_n$, but I can't see how!

EDIT

$$ v = c_1e_1 + c_2e_2 + \dots + c_ne_n \\ T(v) = c_1T(e_1) + \dots + c_nT(e_n) = \lambda_1c_1e_1 + \lambda_2c_2e_2 + \dots + \lambda_nc_ne_n $$ and, since $v$ is itself an eigenvector, $T(v) = \mu v$ for some $\mu \in K$.

Since each coordinate $c_i$ of $v$ gets multiplied by $\lambda_i$, all of the $\lambda_i$ must equal $\mu$. I'm not sure how to make this rigorous. Is this idea correct?

EDIT 2 Expanding the left-hand side of the first EDIT, we have: $$ \mu v = \lambda_1c_1e_1 + \dots + \lambda_nc_ne_n \\ \mu(c_1e_1 + \dots + c_ne_n) = \lambda_1c_1e_1 + \dots + \lambda_nc_ne_n \\ \mu c_1e_1 + \dots + \mu c_ne_n = \lambda_1c_1e_1 + \dots + \lambda_nc_ne_n $$

And since the $e_i$ are linearly independent, matching coefficients gives $(\mu - \lambda_i)c_i = 0$ for each $i$, so $\mu = \lambda_i$ whenever $c_i \neq 0$; choosing $v$ with every $c_i \neq 0$ (say $c_1 = \dots = c_n = 1$) yields $\mu = \lambda_1 = \lambda_2 = \dots = \lambda_n$. Is this proof correct?


Since all nonzero $v\in V$ are eigenvectors, we can choose $e_i$, the $i$th unit vector. Then by assumption we have $T e_i = \lambda_i e_i$ for some $\lambda_i$. It follows that the matrix of $T$ is diagonal, with entries $\lambda_1,\dots,\lambda_n$ on the diagonal.

Now choose $v=e_1+\dots+e_n$; again for some $\lambda$ we have $Tv=\lambda v$, so we have $$T v = T(e_1+\dots+e_n) = \lambda_1 e_1 +\dots + \lambda_n e_n = \lambda (e_1+\dots+e_n).$$ Since the $e_i$ are linearly independent, it follows that $\lambda = \lambda_1 = \dots = \lambda_n$. Hence $Tx = \lambda x$ for all $x$.
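To spell out the final step (a routine expansion, using the same basis $e_1,\dots,e_n$ as above): for any $x = c_1e_1 + \dots + c_ne_n$, $$ Tx = c_1Te_1 + \dots + c_nTe_n = \lambda c_1e_1 + \dots + \lambda c_ne_n = \lambda x. $$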


Hint: Assume that $Tu=\lambda u$ and $Tv=\mu v$ for some nonzero vectors $u$ and $v$ and some $\lambda$ and $\mu$.

  • Show that $\{u,v\}$ is a linearly independent family.
  • Show that $\{u+v,au+bv\}$ is a linearly independent family, for every $a\ne b$ (a short verification follows this list).
  • Show that $\{T(u+v),u+v\}$ is not linearly independent.
  • Conclude that $\lambda=\mu$.
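For the second bullet, a quick check using only the independence of $\{u,v\}$: if $\alpha(u+v)+\beta(au+bv)=0$, then $$ (\alpha + a\beta)u + (\alpha + b\beta)v = 0 \;\Longrightarrow\; \alpha + a\beta = 0 = \alpha + b\beta \;\Longrightarrow\; (a-b)\beta = 0, $$ and since $a\ne b$ this forces $\beta=0$ and then $\alpha=0$.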

Assume for a contradiction that $v,w$ are eigenvectors for eigenvalues $\lambda\neq\mu$ respectively (in particular they are nonzero), and that $T(v+w)=\nu(v+w)$ for some $\nu\in K$. Since any nonzero scalar multiple of an eigenvector is an eigenvector for the same eigenvalue, $v$ and $w$ cannot be linearly dependent. Then by this linear independence, $\nu v+\nu w=T(v+w)=T(v)+T(w)=\lambda v+\mu w$ implies $(\nu,\nu)=(\lambda,\mu)$, which contradicts $\lambda\neq\mu$.

So if all nonzero vectors are eigenvectors, then all of them must be so for the same eigenvalue$~\lambda$, and one has $T=\lambda I$. (Pedantically, if $\dim V=0$ there are no eigenvectors at all, and one is free to choose$~\lambda$; "specific" in the question is not justified in this case.)