Motivation to understand the double dual space

I am helping my brother with Linear Algebra. I am not able to motivate him to understand what the double dual space is. Is there a nice way of explaining the concept? Thanks for your advice, examples, and explanations.


Solution 1:

If $V$ is a finite dimensional vector space over, say, $\mathbb{R}$, the dual of $V$ is the set of linear maps from $V$ to $\mathbb{R}$. This is a vector space because it makes sense to add functions, $(\phi + \psi)(v) = \phi(v) + \psi(v)$, and to multiply them by scalars, $(\lambda\phi)(v) = \lambda(\phi(v))$, and these two operations satisfy all the usual axioms.
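For concreteness, when $V = \mathbb{R}^n$ a linear functional is just "dot with a fixed coefficient vector", and the addition and scaling above are the usual pointwise operations. A minimal NumPy sketch (the coefficient vectors are arbitrary choices, just for illustration):

```python
import numpy as np

# On R^3, a linear functional is "dot with a fixed coefficient vector".
phi = lambda v: np.dot([1.0, 2.0, 3.0], v)   # phi(v) = v1 + 2*v2 + 3*v3
psi = lambda v: np.dot([0.0, -1.0, 5.0], v)

v = np.array([2.0, 1.0, -1.0])
lam = 4.0

# Adding functionals and scaling them pointwise gives the dual space its
# vector space structure: (phi + psi)(v) = phi(v) + psi(v), (lam*phi)(v) = lam*phi(v).
print(phi(v) + psi(v))   # value of the sum functional at v
print(lam * phi(v))      # value of the scaled functional at v
```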

If $V$ has dimension $n$, then the dual of $V$, which is often written $V^\vee$ or $V^*$, also has dimension $n$. Proof: pick a basis for $V$, say $e_1, \ldots, e_n$. Then for each $i$ there is a unique linear function $\phi_i$ such that $\phi_i(e_i) = 1$ and $\phi_i(e_j) = 0$ whenever $i \neq j$. It's a good exercise to see that these maps $\phi_i$ are linearly independent and span $V^*$.
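In coordinates this construction is very concrete: if you write the basis vectors $e_1, \ldots, e_n$ as the columns of a matrix $B$ (with respect to the standard basis of $\mathbb{R}^n$), then the coefficient vectors of the dual basis functionals $\phi_i$ are the rows of $B^{-1}$, because $B^{-1}B = I$ says exactly $\phi_i(e_j) = \delta_{ij}$. A minimal sketch of that (the particular basis is an arbitrary choice):

```python
import numpy as np

# Columns of B are a basis e_1, e_2, e_3 of R^3 (an arbitrary invertible choice).
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

# Rows of B^{-1} are the dual basis: the i-th row, dotted with e_j, gives delta_ij.
dual = np.linalg.inv(B)

print(dual @ B)   # identity matrix, i.e. phi_i(e_j) = 1 if i == j else 0
```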

So given a basis for $V$ we have a way to get a basis for $V^*$. It's true that $V$ and $V^*$ are isomorphic, but the isomorphism depends on the choice of basis (check this by seeing what happens if you change the basis).
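To see the basis-dependence concretely: rescaling a basis vector rescales the corresponding dual functional the *other* way, so the map $e_i \mapsto \phi_i$ built from one basis disagrees with the one built from another. A small sketch continuing the matrix picture above (again an arbitrary basis):

```python
import numpy as np

B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])   # columns: one basis of R^3
B2 = B.copy()
B2[:, 0] *= 2.0                   # same space, but the first basis vector is doubled

dual, dual2 = np.linalg.inv(B), np.linalg.inv(B2)

# The first dual functional is halved, so the isomorphism e_i -> phi_i obtained
# from the second basis is a different map V -> V* than the one from the first.
print(dual[0], dual2[0])          # dual2[0] == dual[0] / 2
```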

Now let's talk about the double dual, $V^{**}$. First, what does it mean? Well, it means what it says. After all, $V^*$ is a vector space, so it makes sense to take its dual. An element of $V^{**}$ is a function that eats elements of $V^*$, i.e. a function that eats functions that eat elements of $V$. This can be a little hard to grasp the first few times you see it. I will use capital Greek letters for elements of $V^{**}$.

Now, here is the trippy thing. Let $v \in V$. I am going to build an element $\Phi_v$ of $V^{**}$. An element of $V^{**}$ should be a function that eats functions that eat vectors in $V$ and returns a number. So we are going to set $$ \Phi_v(f) = f(v). $$

You should check that the association $v \mapsto \Phi_v$ is linear (so $\Phi_{\lambda v} = \lambda\Phi_v$ and $\Phi_{v + w} = \Phi_v + \Phi_w$) and is an isomorphism (one-to-one and onto)! This isomorphism didn't depend on choosing a basis, so there's a sense in which $V$ and $V^{**}$ have more in common than $V$ and $V^*$ do.
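In coordinates this check is very concrete: a functional $f$ is determined by its coefficient vector, $\Phi_v(f) = f(v)$ is just that coefficient vector dotted with $v$, and linearity of $v \mapsto \Phi_v$ is linearity of the dot product in the $v$ slot. A minimal numerical sketch (the particular vectors are arbitrary choices):

```python
import numpy as np

def Phi(v):
    """The element of V** attached to v: it eats a functional f and returns f(v)."""
    return lambda f: np.dot(f, v)   # a functional f is stored as its coefficient vector

v, w = np.array([1.0, 2.0, 0.0]), np.array([0.0, 1.0, 3.0])
f = np.array([2.0, -1.0, 1.0])      # the functional x -> 2*x1 - x2 + x3
lam = 5.0

# Phi_{v+w}(f) == Phi_v(f) + Phi_w(f)  and  Phi_{lam v}(f) == lam * Phi_v(f)
print(Phi(v + w)(f), Phi(v)(f) + Phi(w)(f))
print(Phi(lam * v)(f), lam * Phi(v)(f))
```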

In fancier language, $V$ and $V^*$ are isomorphic, but not naturally isomorphic (you have to make a choice of basis); $V$ and $V^{**}$ are naturally isomorphic.

Final remark: someone will surely have already said this by the time I've edited and submitted this post, but when $V$ is infinite dimensional, it is no longer true in general that $V = V^{**}$: the map $v \mapsto \Phi_v$ is still injective, but not necessarily surjective.

Solution 2:

Actually it's quite simple: If you have a vector space, any vector space, you can define linear functions on that space. The set of all those functions is the dual space of the vector space. The important point here is that it doesn't matter what this original vector space is. You have a vector space $V$, you have a corresponding dual $V^*$.

OK, now you have linear functions. If you add two linear functions, you again get a linear function. Also, if you multiply a linear function by a scalar, you again get a linear function. Indeed, you can check that the linear functions fulfill all the vector space axioms this way. In short, the dual space is a vector space in its own right.

Now, since $V^*$ is a vector space, it comes with everything a vector space comes with. And as we saw at the beginning, one thing every vector space comes with is a dual space, the space of all linear functions on it. Therefore the dual space $V^*$ also has a corresponding dual space, $V^{**}$, which is called the double dual space (because "dual space of the dual space" is a bit long).

So we have the double dual space, but we also want to know what sort of functions are in it. Well, such a function takes a vector from $V^*$, that is, a linear function on $V$, and maps it to a scalar (that is, to a member of the field the vector space is based on). Now, if you have a linear function on $V$, you already know a way to get a scalar from it: just apply it to a vector from $V$. Indeed, it is not hard to show that if you choose an arbitrary fixed element $v\in V$, then the function $F_v\colon\phi\mapsto\phi(v)$ is a linear function on $V^*$, and thus a member of the double dual $V^{**}$. That way we have not only identified certain members of $V^{**}$ but also a natural mapping from $V$ to $V^{**}$, namely $F\colon v\mapsto F_v$.

It is not hard to prove that this mapping is linear and injective, so the functions in $V^{**}$ corresponding to vectors in $V$ form a subspace of $V^{**}$ of the same dimension as $V$. Indeed, if $V$ is finite dimensional, that subspace is all of $V^{**}$: since $\dim(V^*)=\dim(V)$, we also have $\dim(V^{**})=\dim(V^*)=\dim(V)$; on the other hand, since $F$ is injective, $\dim(F(V))=\dim(V)$; and for finite dimensional vector spaces, the only subspace of the same dimension as the full space is the full space itself.

However, if $V$ is infinite dimensional, $V^{**}$ is larger than $V$. In other words, there are functions in $V^{**}$ which are not of the form $F_v$ with $v\in V$.
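A concrete way to see the injectivity step: evaluating $F_v$ on a dual basis of $V^*$ returns exactly the coordinates of $v$, so $F_v$ already determines $v$, and in particular $F_v = 0$ forces $v = 0$. A minimal sketch of that check (the basis is an arbitrary choice):

```python
import numpy as np

B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])      # columns: a basis of R^3 (arbitrary choice)
dual = np.linalg.inv(B)               # rows: the dual basis phi_1, phi_2, phi_3

v = B @ np.array([3.0, -1.0, 2.0])    # the vector with coordinates (3, -1, 2) in this basis

# F_v evaluated on the dual basis returns exactly those coordinates,
# so F_v determines v, i.e. the map F: v -> F_v is injective.
print([float(np.dot(phi, v)) for phi in dual])   # [3.0, -1.0, 2.0]
```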

Note that since $V^{**}$ is again a vector space, it also has a dual space, which again has a dual space, and so on. So in principle you have an infinite series of duals (although they are all different only when $V$ is infinite dimensional).

Solution 3:

This is an old post and the existing answers are good, but I feel like I can add another perspective. It looks fairly abstract at first, but if you're dealing with double duals, you've just got to accept some abstraction. (By the way, I vividly remember solving the exercise in my undergrad linear algebra class showing the natural isomorphism $V \simeq V^{**}$. It felt like I had just warped my mind to the highest level of abstraction possible for human beings. The dual of a dual, woah ...)

Let $V$ be our vector space (over a field $K$, say). Let's assume we already understand the dual $V^*$. Then we will, after a short consideration, accept that the map

$$V^* \times V \rightarrow K$$ $$(l,v) \mapsto l(v)$$

is bilinear. We call it a bilinear form because its values are just scalars.
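The "short consideration" is just unwinding the definitions: linearity in the $l$ slot is exactly how addition and scaling of functionals are defined, and linearity in the $v$ slot is the linearity of each individual $l$:

$$(l_1 + \lambda l_2)(v) = l_1(v) + \lambda\, l_2(v), \qquad l(v_1 + \lambda v_2) = l(v_1) + \lambda\, l(v_2).$$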

Now imagine we have any other vector space $W$ (over the same field) with a bilinear form

$$f:W \times V \rightarrow K.$$

We want to know how far this $W$ and $f$ are from the "ideal" form $V^* \times V \rightarrow K$ above. Note that for any such $W$ and $f$ we get a map, let's call it "c" for "comparison",

$$c: W \rightarrow V^*$$ $$w \mapsto f(w, \cdot)$$

(i.e. the image of $w$ is the linear form in $V^*$ which sends a given $v$ to $f(w,v)$). So how do $W$ and $V^*$ "compare"?

  • Easy exercise: $c$ is injective $\Leftrightarrow$ for all $0\neq w \in W$, there is $v\in V$ such that $f(w,v) \neq 0$
  • Trickier exercise: $c$ is surjective $\Leftrightarrow$ for all $0\neq v \in V$, there is $w\in W$ such that $f(w,v) \neq 0$.

Note that the "trickier" part needs that $\dim(V) < \infty$.
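If you like matrices, both exercises become rank statements: choose bases of $W$ and $V$ and write the form as the Gram matrix $A$ with $A_{ij} = f(w_i, v_j)$. Then $c$ is injective iff the rows of $A$ are linearly independent, and surjective iff they span all of $K^{\dim V}$, i.e. iff $\operatorname{rank} A = \dim V$. So the pairing is perfect exactly when $A$ is square and invertible, a criterion which is visibly symmetric in the two slots. A minimal NumPy sketch of that picture (the particular matrix is an arbitrary choice):

```python
import numpy as np

# Gram matrix of a bilinear form f(w, v) = w^T A v on R^3 x R^3 (arbitrary example).
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

f = lambda w, v: w @ A @ v
c = lambda w: w @ A          # c(w) = f(w, .), written as a coefficient vector in V*

# Full rank <=> c is injective and surjective <=> the pairing is perfect.
# The same invertibility criterion applies with the two slots swapped.
print(np.linalg.matrix_rank(A))          # 3, so this pairing is perfect
print(c(np.array([1.0, -1.0, 2.0])))     # the functional v -> f(w, v) for w = (1, -1, 2)
```

In particular, taking $W = V^*$ with the evaluation pairing from the beginning, $A$ is the identity matrix in a basis and its dual basis.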

So the vector space $W$ and the bilinear form $f$ are "ideal" (more technical term: "perfect pairing") if and only if the bilinear form is non-degenerate (which just summarises the criteria on the RHS of the above exercises).

But now look at that non-degeneracy criterion: it is symmetric in the first and second component. In vague terms, $W$ "is" the dual of $V$ iff the form is non-degenerate; but if the form is non-degenerate, then so is the form you get by flipping the two components, which means that $V$ is also "the dual of" $W$. Taking $W = V^*$ with the evaluation pairing from above, this says $V = (V^*)^* = V^{**}$.

This way of thinking of duality in terms of non-degenerate bilinear forms, as abstract as it seems on first encounter, becomes incredibly helpful in some applications. E.g. in representation theory it's very neat to translate back and forth between self-duality of certain representations and the existence of certain $G$-invariant forms on them. And a generalisation of this kind of thinking basically leads to the adjunction between tensor products and Hom, and all that wonderful general algebra stuff; I recently used it, e.g., here to motivate the definition of certain Lie algebra actions.