$\dim(V) = n$, $\dim(W) = m$ $\implies$ $\dim(L(V,W)) = nm$

I am reading Hoffman & Kunze's chapter on linear transformations, with a view towards understanding dual spaces. (I primarily want to read Calculus on Manifolds; in the first chapter of that book, Spivak marks one exercise concerned with dual spaces as "soon-to-be-important". I never studied dual spaces in my linear algebra course, and I want to be sure I follow his development.)

H&K state the theorem that the space $L(V, W)$ of linear transformations from an $n$-dimensional space $V$ to an $m$-dimensional space $W$ has dimension $nm$. I am having a hard time understanding their proof; they basically just juggle $\Sigma$-notation. I can somewhat follow the steps, which loosely go as follows: let $\mathcal{B} = \{\alpha_1, \dots, \alpha_n\}$ be a basis for $V$ and $\mathcal{B}' = \{\beta_1, \dots, \beta_m\}$ a basis for $W$. For $1 \leq p \leq m$, $1 \leq q \leq n$, define

$$E^{(p,q)} : V \to W, \qquad c_1\alpha_1 + \dots + c_q\alpha_q + \dots + c_n\alpha_n \mapsto c_q\beta_p;$$

that is, let

$$E^{(p,q)}(\alpha_i) = \begin{cases} 0, & i \neq q \\ \beta_p, & i = q \end{cases}$$

and, for $\gamma = c_1\alpha_1 + \dots + c_n\alpha_n$, put $E^{(p,q)}(\gamma) = \sum_{i=1}^n c_i E^{(p,q)}(\alpha_i)$, extending $E^{(p,q)}$ linearly to all of $V$.
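To make this concrete for myself (this small case is my own illustration, not H&K's): for $n = m = 2$, the four maps $E^{(p,q)}$ have matrices, relative to $\mathcal{B}$ and $\mathcal{B}'$,

$$E^{(1,1)} = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \quad E^{(1,2)} = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \quad E^{(2,1)} = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}, \quad E^{(2,2)} = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix};$$

each $E^{(p,q)}$ reads off the $q$-th coordinate of its input and deposits it on $\beta_p$.

Now fix some arbitrary linear transformation $T: V \to W$; since $\mathcal{B}'$ is a basis, each $T(\alpha_i)$ expands uniquely as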

$$T(\alpha_i) = \sum_{p=1}^m A_{(p,i)} \beta_p = A_{(1,i)}\beta_1 + A_{(2,i)}\beta_2 + \dots + A_{(m,i)}\beta_m$$

for some weights $A_{(1,i)}, A_{(2,i)}, \dots, A_{(m,i)}$, one list for each $1 \leq i \leq n$. Now consider

$$ T'(\gamma) := \sum_{p=1}^m\sum_{q=1}^nA_{(p,q)}E^{(p,q)}(\gamma).$$

On the one hand,

\begin{align} T'(\gamma) = \sum_{p=1}^m\sum_{q=1}^nA_{(p,q)}E^{(p,q)}(\gamma) &= \sum_{p=1}^m\sum_{q=1}^nA_{(p,q)}E^{(p,q)}(c_1\alpha_1 + \dots + c_q\alpha_q + \dots + c_n\alpha_n)\\ & = \sum_{p=1}^m\sum_{q=1}^nA_{(p,q)}c_q \beta_p\\ & = \sum_{p=1}^m \Big(\sum_{q=1}^nA_{(p,q)}c_q\Big) \beta_p. \end{align}

On the other hand,

$$ T(\gamma) = \sum_{i=1}^n c_iT(\alpha_i) = \sum_{i=1}^n c_i\sum_{p=1}^m A_{(p,i)}\beta_p = \sum_{p=1}^m\Big(\sum_{i=1}^nc_iA_{(p,i)}\Big)\beta_p,$$

so $T' = T$ (the two expressions agree after renaming the summation index $i$ to $q$), which shows that the $E^{(p,q)}$ span $L(V,W)$. Then you show that all the $E^{(p,q)}$ are linearly independent, and you are "done": there are $mn$ of them, so $\dim L(V,W) = mn$.
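(The independence step, at least, I can verify mechanically; this spelling-out is mine rather than H&K's. If $\sum_{p=1}^m\sum_{q=1}^n A_{(p,q)}E^{(p,q)} = 0$ as a map, then applying both sides to $\alpha_j$ kills every term with $q \neq j$ and leaves

$$\sum_{p=1}^m A_{(p,j)}\beta_p = 0,$$

so $A_{(p,j)} = 0$ for all $p$ by the independence of the $\beta_p$.)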


The problem is that I have no idea what is really going on here. When I concentrate, I can verify each line of $\Sigma$-notation, but I still don't see the proof as a whole. So I have to ask:

  • What is the intuition for why the theorem is true?
  • Can anybody make the given proof more intuitive? (For example, what are the $E^{whatever}$ really doing? How did H&K know that we'd have $T' = T$, i.e., how did they think of $T'$?)

Maybe matrices make it easier. Fix two bases $\{e_i\}$ for $V$ and $\{f_j\}$ for $W$. For every $T:V\longrightarrow W$, write $A_T$ for the matrix of $T$ with respect to these bases. Then $T\longmapsto A_T$ is a well-defined linear (check) map from $L(V,W)$ to $M_{m\times n}(K)$.
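Concretely, the "(check)" here amounts to the two identities

$$A_{S+T} = A_S + A_T, \qquad A_{\lambda T} = \lambda A_T \quad (\lambda \in K),$$

both of which follow by comparing columns: the $i$-th column of $A_T$ is the coordinate vector of $T(e_i)$ in $\{f_1,\ldots,f_m\}$.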

As a linear transformation is uniquely determined by its action on a basis, this map is a bijection. As pointed out by Alex Becker, with this observation you already have your dimension in a slightly informal way: $T$ is determined by $\{T(e_1),\ldots,T(e_n)\}$, and each $T(e_i)$ is determined by its $m$ coefficients in $\{f_1,\ldots,f_m\}$. This makes $n\cdot m$ coefficients: your dimension.

Now, by the isomorphism above: $$ \dim L(V,W)=\dim M_{m\times n}(K). $$ It is not hard to see that the matrix units $E_{i,j}$, whose $(k,l)$ entry is $\delta_{i,k}\delta_{j,l}$ (i.e. $1$ in the $(i,j)$ position, $0$ elsewhere), for $1\leq i\leq m$ and $1\leq j\leq n$, constitute a basis of $M_{m\times n}(K)$. There are $mn$ of them. That's your dimension.
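To see the spanning half concretely (this worked example is mine, not part of the claim above): any $A \in M_{m\times n}(K)$ expands as $A = \sum_{i=1}^m\sum_{j=1}^n A_{i,j}E_{i,j}$; e.g. for $m = n = 2$,

$$\begin{pmatrix} a & b \\ c & d \end{pmatrix} = a\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} + b\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} + c\begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix} + d\begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix},$$

and the coefficients are forced, which gives independence. Note also that under $T\longmapsto A_T$ (identifying $\alpha_i$ with $e_i$ and $\beta_j$ with $f_j$), H&K's $E^{(p,q)}$ corresponds exactly to the matrix unit $E_{p,q}$: that is what the $E^{whatever}$ are "really doing".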