Proving isomorphism between linear maps and matrices
Theorem: Let $V$ and $W$ be finite-dimensional vector spaces over the same field $F$, with dimensions $n$ and $m$ respectively. Suppose also that $\beta$ and $\gamma$ are ordered bases for $V$ and $W$, respectively. Then the function $\Psi : \mathcal{L}(V, W) \rightarrow M_{m \times n}(F)$, defined by $\Psi(T) = [T]_{\beta}^{\gamma}$ for an arbitrary $T \in \mathcal{L}(V,W)$, is an isomorphism.
Attempt at proof: We need to show that $\Psi$ is bijective, and hence an isomorphism. This means that for every $m \times n$ matrix $A$ we need to find a unique linear map $T: V \rightarrow W$ such that $\Psi(T)=A.$ So let $\beta = \left\{v_1, v_2, \ldots, v_n\right\}$ and $\gamma = \left\{w_1, w_2, \ldots, w_m\right\}$ be the ordered bases for $V$ and $W$. We already know that there exists a unique linear map $T: V \rightarrow W$ such that for $1 \leq j \leq n$ \begin{align*} T(v_j) = \sum_{i=1}^{m} a_{ij} w_i, \end{align*} where $a_{ij}$ denotes the $(i,j)$-entry of $A$. But this means that $[T]_{\beta}^{\gamma} = A$, so $\Psi(T) = A$. Hence $\Psi$ is an isomorphism.
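For instance (just a small example of my own to make the construction concrete): take $F = \mathbb{R}$, $V = \mathbb{R}^2$, $W = \mathbb{R}^3$ with the standard bases, and $$A = \begin{pmatrix} 1 & 0 \\ 2 & 1 \\ 0 & 3 \end{pmatrix}.$$ Then the unique $T$ determined by $A$ satisfies $T(v_1) = 1\cdot w_1 + 2\cdot w_2 + 0\cdot w_3 = (1,2,0)^T$ and $T(v_2) = 0\cdot w_1 + 1\cdot w_2 + 3\cdot w_3 = (0,1,3)^T$, and indeed $[T]_\beta^\gamma = A$.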
Can someone check if my proof is sound and valid? If not, where did I go wrong? Thanks in advance.
Solution 1:
Seems good to me. There is only one thing I want to point out (but perhaps you know this): with this you have only proved that $\Psi$ is surjective. However, once you know that $\Psi$ is linear, and since $\mathcal{L}(V,W)$ and $M_{m \times n}(F)$ are finite-dimensional vector spaces of the same dimension $mn$, we have: $\Psi$ is an isomorphism $\iff \Psi$ is injective $\iff \Psi$ is surjective.
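(For completeness, here is a sketch of how one could also check injectivity directly; this is my own addition and is not needed once surjectivity and the dimension count are in place: if $[T]_\beta^\gamma = 0$, then $$T(v_j) = \sum_{i=1}^m 0 \cdot w_i = 0 \quad \text{for every } 1 \leq j \leq n,$$ so $T$ vanishes on a basis of $V$ and hence $T = 0$. Thus $\ker \Psi = \{0\}$ and $\Psi$ is injective.)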
To prove linearity, we must show that for every $\lambda \in F$ and all linear maps $T,S: V \to W$: $$[T+\lambda S]_\beta^\gamma = [T]_\beta^\gamma+\lambda[S]_\beta^\gamma.$$ Write $c_{ij} = \left([T+\lambda S]_\beta^\gamma\right)_{ij}$, $a_{ij} = ([T]_\beta^\gamma)_{ij}$ and $b_{ij} = ([S]_\beta^\gamma)_{ij}$, and verify that $c_{ij}=a_{ij}+\lambda b_{ij}$, by taking an arbitrary column vector ${\bf x} = [x_1,\cdots,x_n]_\beta^T$ and using the definitions of $[T+\lambda S]_\beta^\gamma$, $[T]_\beta^\gamma$ and $[S]_\beta^\gamma$ to compute $[T+\lambda S]_\beta^\gamma{\bf x}$, and similarly for the other two.
In fact, it suffices to check this on the vectors of the basis $\beta$ instead of a general vector $\bf x$. In the notation above, we know that: $$T({\bf v}_j) = \sum_{i=1}^m a_{ij}{\bf w}_i, \quad S({\bf v}_j) = \sum_{i=1}^m b_{ij}{\bf w}_i, \quad (T+\lambda S)({\bf v}_j) = \sum_{i=1}^m c_{ij}{\bf w}_i.$$ Starting from the last identity: $$\begin{align}\sum_{i=1}^m c_{ij}{\bf w}_i &= (T+\lambda S)({\bf v}_j) \\ &= T{\bf v}_j + \lambda S{\bf v}_j \\ &= \sum_{i=1}^ma_{ij}{\bf w}_i + \lambda \sum_{i=1}^mb_{ij}{\bf w}_i \\ &= \sum_{i=1}^ma_{ij}{\bf w}_i + \sum_{i=1}^m\lambda b_{ij}{\bf w}_i \\&= \sum_{i=1}^m(a_{ij}+\lambda b_{ij}){\bf w}_i \end{align}$$
Since $\{ {\bf w}_i \}_{i=1}^m$ is linearly independent, $c_{ij} = a_{ij}+\lambda b_{ij}$ for all $i$ and $j$. Hence $[T+\lambda S]_\beta^\gamma = [T]_\beta^\gamma+\lambda[S]_\beta^\gamma$, so $\Psi$ is linear.
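As a quick sanity check of this identity (my own small example, with $F=\mathbb{R}$, $V=W=\mathbb{R}^2$ and $\beta=\gamma$ the standard basis): let $T$ be the identity, let $S$ be the linear map with $S(e_1)=0$ and $S(e_2)=e_1$, and take $\lambda = 2$. Then $$(T+2S)(e_1) = e_1, \qquad (T+2S)(e_2) = 2e_1 + e_2,$$ so $$[T+2S]_\beta^\gamma = \begin{pmatrix}1 & 2\\ 0 & 1\end{pmatrix} = \begin{pmatrix}1 & 0\\ 0 & 1\end{pmatrix} + 2\begin{pmatrix}0 & 1\\ 0 & 0\end{pmatrix} = [T]_\beta^\gamma + 2[S]_\beta^\gamma,$$ as expected.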