Left and Right inverses of linear operators

Let $X$ and $U$ be vector spaces over a field $F$, and let $T : X \to U$.

(a) If there exists an operator $S : U \to X$ such that $S(T(x)) = x$ for all $x \in X$, then $S$ is called a left inverse of $T$.

(b) If there exists an operator $S : U \to X$ such that $T(S(u)) = u$ for all $u \in U$, then $S$ is called a right inverse of $T$.

I'm trying to prove the following theorem.

Let $X$ and $U$ be vector spaces over a field $F$, and let $T: X\to U$ be linear.

(a) There exists a left inverse $S$ of $T$ iff $T$ is injective.

(b) There exists a right inverse $S$ of $T$ iff $T$ is surjective.

So far my study of linear algebra has been largely restricted to finite-dimensional vector spaces, but this problem, I think, applies to general vector spaces. How can I solve it without resorting to a basis?


Here is the proof for (a).

$(\Rightarrow)$ Suppose there exists a left inverse $S$ of $T$. To see that $T$ is injective, let $x,y\in X$ with $T(x)=T(y)$. Then $$x=S(T(x))=S(T(y))=y.$$ Hence $T$ is injective. (Note that this direction uses no linear algebra at all; it holds for arbitrary functions.)

$(\Leftarrow)$ Suppose that $T$ is injective. Let $\{x_i:i\in I\}$ be a basis for $X$. Then $\gamma=\{T(x_i):i\in I\}$ is a linearly independent subset of $U$ (check this; injectivity is needed here), so there exists a basis $\beta$ of $U$ such that $\gamma\subseteq\beta$. Now let $S:U\to X$ be the linear map defined on $\beta$ by $$ S(u)= \begin{cases} v & \text{if } u\in\gamma\text{ with }u=T(v) \\ 0 & \text{if } u\notin\gamma \end{cases} $$ (This is well defined: since $T$ is injective, each $u\in\gamma$ equals $T(v)$ for exactly one basis vector $v$.) Then for $x=\sum_i\lambda_i x_i\in X$ we have $$ S(T(x))=S\left(\sum_i\lambda_i T(x_i)\right)=\sum_i\lambda_i S(T(x_i))=\sum_i\lambda_i x_i=x, $$ so $S$ is a left inverse of $T$. $\Box$
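To make the construction concrete in the finite-dimensional case, here is a small NumPy sketch (the matrix `T` below is an arbitrary example of my own, not from the problem): for an injective $T$, the Moore-Penrose pseudoinverse $S=(T^tT)^{-1}T^t$ is one explicit left inverse.

```python
import numpy as np

# Finite-dimensional illustration: T : R^2 -> R^3 with independent
# columns, hence injective. (Arbitrary example matrix.)
T = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0]])

# For injective T, the pseudoinverse S = (T^t T)^{-1} T^t satisfies S T = I.
S = np.linalg.pinv(T)

assert np.allclose(S @ T, np.eye(2))   # S is a left inverse of T
# Note: T @ S is a projection onto the column space of T, not the
# identity on R^3 -- T is not surjective, so no two-sided inverse exists.
```

This mirrors the proof: $S$ inverts $T$ on the image of $T$ and does something harmless (here, least-squares projection rather than the proof's choice of $0$) off the image; left inverses are generally not unique.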

The key result here is that if $\gamma$ is a linearly independent subset of a vector space $V$, then there exists a basis $\beta$ of $V$ such that $\gamma\subseteq\beta$. This is standard in linear algebra courses; in the infinite-dimensional case its proof relies on Zorn's lemma, so some use of the axiom of choice is built into this argument.

The proof for (b) is very similar.
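For part (b), the same kind of finite-dimensional sketch applies (again with an arbitrary example matrix of my own): when $T$ is surjective, the pseudoinverse $S=T^t(TT^t)^{-1}$ is one explicit right inverse.

```python
import numpy as np

# Finite-dimensional illustration for (b): T : R^3 -> R^2 with rank 2,
# hence surjective. (Arbitrary example matrix.)
T = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])

# For surjective T, the pseudoinverse S = T^t (T T^t)^{-1} satisfies T S = I.
S = np.linalg.pinv(T)

assert np.allclose(T @ S, np.eye(2))   # S is a right inverse of T
# Note: S @ T is not the identity on R^3 -- T is not injective,
# so S cannot also be a left inverse.
```

Here $S$ picks, for each $u\in U$, one preimage under $T$ (the minimum-norm one); any other consistent choice of preimages would give a different right inverse.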


Hint: If a left inverse $S$ exists, then $$x_1\ne x_2\implies S(T(x_1))\ne S(T(x_2))\implies T(x_1)\ne T(x_2).$$ (Why does each step hold?)