Difference Between Tensoring and Wedging.
Let $V$ be a vector space and $\omega\in \otimes^k V$. There are (at least) two ways of thinking about $\omega\otimes \omega$.
1) We may think of $\otimes^k V$ as a vector space $W$, and $\omega\otimes \omega$ as a member of $W\otimes W$.
2) We may think of $\omega\otimes \omega$ as a member of $\otimes^{2k}V$.
The two interpretations are "the same" because $W\otimes W$ is naturally isomorphic to $\otimes^{2k} V$.
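(Concretely, the isomorphism I have in mind is the unravelling map $$ (v_1\otimes \cdots \otimes v_k)\otimes (w_1\otimes \cdots \otimes w_k)\;\longmapsto\; v_1\otimes \cdots \otimes v_k\otimes w_1\otimes \cdots \otimes w_k, $$ extended linearly, so both interpretations produce the same element of $\otimes^{2k}V$.)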
However, the situation is a bit different when talking about "wedging".
Let $\eta\in \Lambda^k V$. We want to think about $\eta\wedge \eta$.
1) Let $W=\Lambda^k V$ and think of $\eta\wedge \eta$ as a member of $\Lambda^2 W$. Then $\eta\wedge \eta=0$ by super-commutativity of the wedge-product.
2) Think of $\eta\wedge\eta$ as a member of $\Lambda^{2k}V$. Then $\eta\wedge \eta$ need not be $0$, as the example below shows.
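For instance (taking $\dim V=4$ with basis $e_1,e_2,e_3,e_4$ purely for illustration), if $\eta=e_1\wedge e_2+e_3\wedge e_4\in\Lambda^2 V$, then in $\Lambda^4 V$ $$ \eta\wedge\eta=e_1\wedge e_2\wedge e_3\wedge e_4+e_3\wedge e_4\wedge e_1\wedge e_2=2\,e_1\wedge e_2\wedge e_3\wedge e_4\neq 0 $$ (in characteristic $\neq 2$), since the two square terms $e_1\wedge e_2\wedge e_1\wedge e_2$ and $e_3\wedge e_4\wedge e_3\wedge e_4$ vanish.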
Perhaps this confusion would not arise if we wrote $\wedge_V$ rather than $\wedge$, for when wedging we must remember the base space. Moreover, there is no such thing as the wedge product of two vector spaces, though we can talk about the tensor product of two vector spaces.
Admittedly, my mind is not completely clear here. Can somebody shed some more light on the different behaviours of tensoring and wedging?
Solution 1:
This is not really a direct answer but is just too long for a comment.
One curious fact about the wedge construction is that $\bigwedge^n V$ can be (functorially) realized either as a subspace of $\bigotimes^n V$ or as a quotient. (These realizations are canonically isomorphic when the characteristic of the underlying field is $0$ or greater than $n$, but in small characteristic they can fail to be isomorphic.)
Although the quotient construction tends to be more natural, it's often useful to think about the subspace construction. The little wedge symbol means different things, depending on your construction.
In the quotient construction, $\bigwedge^n V$ is the quotient of $\bigotimes^n V$ by the subspace generated by the elementary tensors $v_1 \otimes \ldots \otimes v_n$ in which some vector is repeated, and the symbol $v_1 \wedge \ldots \wedge v_n$ means "the image of $v_1 \otimes \ldots \otimes v_n$ under the quotient map."
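For example (with $n=2$), the tensor $(v+w)\otimes(v+w)$ has a repeated vector, so its image in the quotient vanishes, and expanding gives $$ 0=(v+w)\wedge(v+w)=v\wedge w+w\wedge v, \qquad\text{i.e.}\qquad v\wedge w=-\,w\wedge v, $$ so the antisymmetry relations are built into the quotient.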
In the subspace construction, on the other hand, $\bigwedge^n V$ is the subspace of $\bigotimes^n V$ on which $S_n$ acts via the sign character, and the symbol $v_1 \wedge \ldots \wedge v_n$ means either (depending on your convention and on whether $n! = 0$ in your field) $$ \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma)\, v_{\sigma(1)}\otimes \ldots \otimes v_{\sigma(n)}$$ or $$ \frac{1}{n!} \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma)\, v_{\sigma(1)}\otimes \ldots \otimes v_{\sigma(n)}. $$ (The second convention has the advantage that it is compatible with the notation under the natural map from the subspace interpretation to the quotient interpretation, and the disadvantage that it is only available when $n!$ is invertible in the field.)
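For example, with $n=2$ the two conventions read $$ v_1\wedge v_2 = v_1\otimes v_2 - v_2\otimes v_1 \qquad\text{versus}\qquad v_1\wedge v_2 = \tfrac{1}{2}\bigl(v_1\otimes v_2 - v_2\otimes v_1\bigr), $$ and in either case the transposition in $S_2$ acts on the right-hand side by $-1$, as required.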
Now let's think about $\bigwedge^2 \bigwedge^n V$ versus $\bigwedge^{2n} V$ in terms of the subspace interpretation. The former is the subspace of $\bigotimes^2 \bigwedge^n V$ on which $S_2$ acts by the sign character, which is the subspace of $\bigotimes^2 \bigotimes^n V$ on which the swap $S_2$ and each of the two copies of $S_n$ act via the sign character. Under the natural "unravelling map" $$ \bigotimes^2(\bigotimes^n V) \to \bigotimes^{2n} V $$ we get the subspace of $\bigotimes^{2n} V$ on which $(S_n \times S_n) \rtimes S_2 < S_{2n}$ acts via the product of the sign characters. But this is a weaker, and different, demand than demanding that all of $S_{2n}$ act via the sign character, and so this subspace is bigger.
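A dimension count (in characteristic $0$, with $n=2$ and $\dim V=4$ chosen just for illustration) makes the gap visible: $\dim\Lambda^2 V=6$, so $$ \dim\Lambda^2\bigl(\Lambda^2 V\bigr)=\binom{6}{2}=15, \qquad\text{while}\qquad \dim\Lambda^4 V=\binom{4}{4}=1. $$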
(EDIT: see comments for more details.) In terms of representation theory, your observation could be restated as follows: write $G = (S_n \times S_n) \rtimes S_2$. Then there is a natural embedding $G \hookrightarrow S_{2n}$. Now for a field $k$ there is a unique character $G \to k^\times$ which restricts to the sign character on each $S_n$ and to the sign character on $S_2$. There is also a character $G \to k^\times$ obtained by composing the embedding $G \hookrightarrow S_{2n}$ with the sign character of $S_{2n}$. These two characters need not be the same.
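Explicitly (writing $\chi_1$ for the first character and $\chi_2$ for the second, just to fix notation): both characters restrict to the sign character on $S_n\times S_n$, but under $G\hookrightarrow S_{2n}$ the nontrivial element of $S_2$ becomes the product of $n$ transpositions $(1\ n{+}1)(2\ n{+}2)\cdots(n\ 2n)$, so $$ \chi_1(\text{swap})=-1, \qquad \chi_2(\text{swap})=\operatorname{sgn}_{S_{2n}}\bigl((1\ n{+}1)\cdots(n\ 2n)\bigr)=(-1)^n, $$ and the two characters differ exactly when $n$ is even.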