The set of all linear maps T:V->W is a vector space
I have to show that the set $L$ of all linear maps $T: V \to W$ is a vector space with respect to the addition $$(T_1 + T_2)(\vec{v}) = T_1(\vec{v}) + T_2(\vec{v})$$
and scalar multiplication $$(xT)(\vec{v}) = xT(\vec{v})$$
where $T_1, T_2, T \in L$, $\vec{v} \in V$, and $x \in \mathbb{R}$.
I'm asked to detail all 10 properties required for $L$ to be a vector space, as follows:
$\forall \vec{u}, \vec{v}, \vec{w} \in V$ and $\forall x, y \in \mathbb{R}$:
i) Closure: $$(T_1 + T_2)(\vec{v} + \vec{w}) = T_1(\vec{v} + \vec{w}) + T_2(\vec{v} + \vec{w}) \in L$$
ii) Associativity: $$(T_1 + T_2)(\vec{u} + \vec{v} + \vec{w}) =$$ $$((T_1 + T_2)(\vec{u}) + (T_1 + T_2)(\vec{v})) + (T_1 + T_2)(\vec{w}) =$$ $$ (T_1 + T_2)(\vec{u}) + ((T_1 + T_2)(\vec{v}) + (T_1 + T_2)(\vec{w}))$$
iii) Commutativity: $$(T_1 + T_2)(\vec{u} + \vec{v}) = (T_1 + T_2)(\vec{u}) + (T_1 + T_2)(\vec{v}) = (T_1 + T_2)(\vec{v}) + (T_1 + T_2)(\vec{u})$$
iv) Identity: $$(T_1 + T_2)(\vec{0}) = T_1(\vec{0}) + T_2(\vec{0}) = 0 \cdot T_1(\vec{0}) + 0 \cdot T_2(\vec{0}) = \vec{0} + \vec{0} = \vec{0} \in L$$
v) Inverse: $$(T_1+T_2)(-\vec{v}) = T_1(-\vec{v}) + T_2(-\vec{v}) = -T_1(\vec{v}) - T_2(\vec{v}) = -(T_1(\vec{v}) + T_2(\vec{v})) =$$ $$-(T_1 + T_2)(\vec{v}) \in L$$
For scalar multiplication
vi) Closure: $$(xT)(\vec{v}) = xT(\vec{v}) \in L$$
vii) Compatibility: $$ (xT)(\vec{u} + \vec{v}) = xT(\vec{u} + \vec{v}) = x(T(\vec{u}) + T(\vec{v})) =$$ $$ xT(\vec{u}) + xT(\vec{v})$$
viii) Compatibility 2: $$((xy)T)(\vec{v}) = x(yT(\vec{v}))$$
ix) Distributivity: $$((x + y)T)(\vec{v}) = (x + y)T(\vec{v}) = xT(\vec{v}) + yT(\vec{v})$$
x) Identity: $$(1 \cdot T)(\vec{v}) = 1 \cdot T(\vec{v}) = T(\vec{v})$$
Have I done enough to show $L$ is a vector space? Must I expand on the definitions? For instance, in (ii) I didn't fully apply the transformation because it would have been very time-consuming.
What is the most efficient way to show this?
Solution 1:
You're not going to get much in the way of "efficient" here. Usually the point of these proofs is to demonstrate that you can spell out all the little steps in a cohesive manner. You need to learn to walk before you learn to run.
There are significant problems with your proof. Specifically, you're confusing the sum of two linear functions with summing their arguments (i.e. the vectors you substitute into them).
Let's start by explicitly defining the sum and scalar product of linear transformations. Suppose $T_1, T_2 : V \rightarrow W$ are linear, and $\lambda \in \mathbb{F}$ (the common scalar field for $V$ and $W$, usually $\mathbb{R}$ or $\mathbb{C}$ in undergraduate courses). Then,
\begin{align*} T_1 + T_2 &: V \rightarrow W : v \mapsto T_1(v) + T_2(v) \\ \lambda T_1 &: V \rightarrow W : v \mapsto \lambda T_1(v). \end{align*}
Notated more simply, we have for all $v \in V$,
\begin{align*} (T_1 + T_2)(v) &= T_1(v) + T_2(v) \\ (\lambda T_1)(v) &= \lambda T_1(v). \end{align*}
What you have to realise here is that the $+$ between $T_1$ and $T_2$ is a different operation to the $+$ between $T_1(v)$ and $T_2(v)$. The first plus is the addition operation that we're defining; the one that takes two linear maps $T_1, T_2$ and produces a third linear map $T_1 + T_2$. The second plus is the addition in $W$, between two vectors in $W$: $T_1(v)$ and $T_2(v)$. They are different operations on different vector spaces.
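To make this concrete, here is a toy example (purely illustrative): take $V = W = \mathbb{R}^2$ with $T_1(v) = 2v$ and $T_2(v) = 3v$. Then $T_1 + T_2$ is the new map $v \mapsto 2v + 3v = 5v$. The $+$ in "$T_1 + T_2$" builds one map out of two maps, while the $+$ in "$2v + 3v$" adds two vectors in $\mathbb{R}^2$.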
So, let's try proving, say, distributivity of scalar multiplication over addition. Suppose $T_1, T_2,$ and $\lambda$ are as above. We wish to show that $\lambda(T_1 + T_2) = \lambda T_1 + \lambda T_2$. In order to prove this, we must show that both functions, on the left and right, do the same thing to every vector $v$. So, we must show that, for all $v \in V$,
$$(\lambda(T_1 + T_2))(v) = (\lambda T_1 + \lambda T_2)(v)$$
Using only the definition of the sum of linear functions as described above, we have,
\begin{align*} (\lambda(T_1 + T_2))(v) &= \lambda((T_1 + T_2)(v)) & \text{(Definition of scalar multiplication on $L$)} \\ &= \lambda(T_1(v) + T_2(v)) & \text{(Definition of addition on $L$)} \\ &= \lambda T_1(v) + \lambda T_2(v) & \text{(Distributivity in $W$)} \\ (\lambda T_1 + \lambda T_2)(v) &= (\lambda T_1)(v) + (\lambda T_2)(v) & \text{(Definition of addition on $L$)} \\ &= \lambda T_1(v) + \lambda T_2(v) & \text{(Definition of scalar multiplication on $L$)} \end{align*} Therefore, $(\lambda(T_1 + T_2))(v) = (\lambda T_1 + \lambda T_2)(v)$ as needed, so $\lambda(T_1 + T_2) = \lambda T_1 + \lambda T_2$.
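The same pattern handles most of the other axioms. For instance, here is a sketch of commutativity of addition on $L$: for all $v \in V$, \begin{align*} (T_1 + T_2)(v) &= T_1(v) + T_2(v) & \text{(Definition of addition on $L$)} \\ &= T_2(v) + T_1(v) & \text{(Commutativity in $W$)} \\ &= (T_2 + T_1)(v) & \text{(Definition of addition on $L$)} \end{align*} so $T_1 + T_2 = T_2 + T_1$.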
When showing closure of addition, you'll need to show that $T_1 + T_2$ is a linear function from $V$ to $W$. That is, you'll need to use the definition of $T_1 + T_2$ to show it's linear, specifically for all $v, w \in V$ and $\lambda \in \mathbb{F}$,
\begin{align*} (T_1 + T_2)(v + w) &= (T_1 + T_2)(v) + (T_1 + T_2)(w) \\ (T_1 + T_2)(\lambda v) &= \lambda(T_1 + T_2)(v) \end{align*}
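Here is a sketch of how the first of these might go, using only the definition of addition on $L$, the linearity of $T_1$ and $T_2$, and commutativity and associativity in $W$: \begin{align*} (T_1 + T_2)(v + w) &= T_1(v + w) + T_2(v + w) & \text{(Definition of addition on $L$)} \\ &= (T_1(v) + T_1(w)) + (T_2(v) + T_2(w)) & \text{($T_1$, $T_2$ linear)} \\ &= (T_1(v) + T_2(v)) + (T_1(w) + T_2(w)) & \text{(Commutativity and associativity in $W$)} \\ &= (T_1 + T_2)(v) + (T_1 + T_2)(w) & \text{(Definition of addition on $L$)} \end{align*} The second property follows in the same way from $T_i(\lambda v) = \lambda T_i(v)$.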
Similarly for scalar multiplication. You should only need the definitions of the operations on $L$ and the basic properties guaranteed by $V$ and $W$ being vector spaces.
Also think about the additive identity and additive inverses. The additive identity must be a linear transformation from $V$ to $W$. What does it do to an arbitrary vector $v \in V$? These are things you need to make clear!
Good luck!
EDIT: There are still issues with the proof. Let's take a look at your proof of associativity, for example. Associativity of addition states that $(u + v) + w = u + (v + w)$, where $u, v, w$ are elements of the proposed vector space. The proposed vector space you're dealing with is $L$. You should be showing this is true for three linear transformations, $T_1, T_2, T_3$ from $V$ to $W$. You need to prove that $$(T_1 + T_2) + T_3 = T_1 + (T_2 + T_3).$$ In order to do this, you'll need to show, for an arbitrary $v \in V$, that $$((T_1 + T_2) + T_3)(v) = (T_1 + (T_2 + T_3))(v)$$ You should be able to show this only with the definition of addition on $L$ (as I described above) and associativity on $W$. I advise writing out reasons for the steps, much like I did with the distributivity. Let me get you started with the first step: $$((T_1 + T_2) + T_3)(v) = (T_1 + T_2)(v) + T_3(v) \ \ldots \ \text{Definition of addition on }L$$ With all of these axioms that require showing an equality of vectors (everything except closure and possibly additive identity and inverses), you'll only need one vector $v \in V$ to show equality (as you just need to show that when each transformation is applied to a given vector, it returns the same result). As soon as you introduce $u$ or $w$, you're proving the wrong thing.
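If it helps to see the destination, here is a sketch of how the full chain might go: \begin{align*} ((T_1 + T_2) + T_3)(v) &= (T_1 + T_2)(v) + T_3(v) & \text{(Definition of addition on $L$)} \\ &= (T_1(v) + T_2(v)) + T_3(v) & \text{(Definition of addition on $L$)} \\ &= T_1(v) + (T_2(v) + T_3(v)) & \text{(Associativity in $W$)} \\ &= T_1(v) + (T_2 + T_3)(v) & \text{(Definition of addition on $L$)} \\ &= (T_1 + (T_2 + T_3))(v) & \text{(Definition of addition on $L$)} \end{align*} Since $v$ was arbitrary, $(T_1 + T_2) + T_3 = T_1 + (T_2 + T_3)$.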
I'll get to closure in a moment, but first let's look at the additive identity. Remember, you want to find an element of $L$, a linear transformation from $V$ to $W$, that adds to other transformations in $L$ without changing them:
Let $\mathbf{0} : V \rightarrow W : v \mapsto 0_W$, where $0_W \in W$ is the additive identity of $W$. So $\mathbf{0}$ is a constant function, taking every $v \in V$ and returning $0_W$. We should probably verify that this function I've made up is really in $L$. It obviously maps from $V$ to $W$, but is it linear? Let's show that now.
Suppose $u, v \in V$ and $\lambda \in \mathbb{F}$. Then $$\mathbf{0}(u + v)= 0_W = 0_W + 0_W =\mathbf{0}(u) + \mathbf{0}(v)$$ $$\mathbf{0}(\lambda v) = 0_W = \lambda 0_W = \lambda \mathbf{0}(v).$$ Therefore, by definition, $\mathbf{0}$ is linear, and thus belongs to $L$.
Let's show $\mathbf{0}$ is indeed the additive identity. So, we must show $T + \mathbf{0} = T$. Again, we show this is true by applying both sides to an arbitrary vector $v \in V$, and verifying we get the same thing. We have, \begin{align*} (T + \mathbf{0})(v) &= T(v) + \mathbf{0}(v) & \text{(Definition of addition on $L$)} \\ &= T(v) + 0_W & \text{(Definition of $\mathbf{0}$)} \\ &= T(v) & \text{(Additive identity of $W$)} \end{align*} Thus, as required, $T + \mathbf{0} = T$. I'm hoping this is starting to click. We are only really interested in the addition and scalar multiplication operations on $L$, between linear transformations. We do need addition and scalar multiplication on $W$, but only to show the properties we want of addition and scalar multiplication on $L$.
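Additive inverses work the same way. As a sketch: given $T \in L$, a natural candidate is $-T : V \rightarrow W : v \mapsto -T(v)$ (equivalently, $(-1)T$). One would first check, just as with $\mathbf{0}$, that $-T$ is linear, and then verify \begin{align*} (T + (-T))(v) &= T(v) + (-T)(v) & \text{(Definition of addition on $L$)} \\ &= T(v) + (-T(v)) & \text{(Definition of $-T$)} \\ &= 0_W & \text{(Additive inverses in $W$)} \\ &= \mathbf{0}(v) & \text{(Definition of $\mathbf{0}$)} \end{align*} so that $T + (-T) = \mathbf{0}$.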
Finally, I think I'll go through closure, as it's a bit of an exception. I'll do closure under scalar multiplication. Basically, given any $T \in L$ and $\lambda \in \mathbb{F}$, we need to show that $\lambda T$ is linear. Suppose $u, v \in V$ and $\mu \in \mathbb{F}$. We have, \begin{align*} (\lambda T)(u + v) &= \lambda T(u + v) & \text{(Definition of scalar multiplication on $L$)} \\ &= \lambda (T(u) + T(v)) & \text{($T$ is linear)} \\ &= \lambda T(u) + \lambda T(v) & \text{(Distributivity in $W$)} \\ &= (\lambda T)(u) + (\lambda T)(v) & \text{(Definition of scalar multiplication on $L$)} \\ (\lambda T)(\mu v) &= \lambda T(\mu v) & \text{(Definition of scalar multiplication on $L$)} \\ &= \lambda (\mu T(v)) & \text{($T$ is linear)} \\ &= (\lambda \mu) T(v) & \text{(Associativity of scalar multiplication on $W$)} \\ &= (\mu \lambda) T(v) & \text{(Commutativity of multiplication on $\mathbb{F}$)} \\ &= \mu (\lambda T(v)) & \text{(Associativity of scalar multiplication on $W$)} \\ &= \mu (\lambda T)(v) & \text{(Definition of scalar multiplication on $L$)} \end{align*}
These two properties mean that $\lambda T$ is linear, hence $\lambda T \in L$ as required.
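The remaining scalar axioms (distributivity over scalar addition, compatibility of scalar multiplication, and the multiplicative identity) all follow the same recipe. As a sketch, for compatibility: for all $v \in V$, \begin{align*} ((\lambda \mu) T)(v) &= (\lambda \mu) T(v) & \text{(Definition of scalar multiplication on $L$)} \\ &= \lambda (\mu T(v)) & \text{(Associativity of scalar multiplication on $W$)} \\ &= \lambda ((\mu T)(v)) & \text{(Definition of scalar multiplication on $L$)} \\ &= (\lambda (\mu T))(v) & \text{(Definition of scalar multiplication on $L$)} \end{align*} so $(\lambda \mu) T = \lambda (\mu T)$.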
I think my answer is long enough as it is. If you want to make further attempts, I'm happy to check them, but I advise making a new question, linking it back to this one, and sending me a link to it via a comment.