Why is $\pi_r(L)$ a linear transformation into $\Lambda^r(V)$

Solution 1:

Concluding that a multilinear form is alternating from the fact that permuting its arguments by a permutation$~\sigma$ multiplies its values by $\operatorname{sgn} \sigma$ is not valid in contexts where the scalar $2$ is not necessarily invertible (for instance, when discussing vector spaces over a field that might have characteristic$~2$). In characteristic$~2$ that condition amounts to being symmetric, which is there strictly weaker than being alternating; the difference already arises for bilinear forms in dimension$~1$, where an alternating bilinear form is necessarily$~0$ but a symmetric one need not be.
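The dimension-$1$ example can be checked concretely. The sketch below (illustrative, not from the original answer) works over $\mathbb{F}_2$, represented as integers mod $2$, with the bilinear form $L(x, y) = xy$:

```python
P = 2  # characteristic 2: arithmetic is done mod 2, so -1 == 1

def L(x, y):
    # the bilinear form L(x, y) = x * y on a 1-dimensional space over F_2
    return (x * y) % P

# L satisfies the sign condition: swapping arguments multiplies by sgn = -1,
# which is trivially true here because L is symmetric and -1 == 1 mod 2.
for x in range(P):
    for y in range(P):
        assert L(x, y) == (-L(y, x)) % P

# Yet L is NOT alternating: it fails to vanish on a repeated argument.
assert L(1, 1) == 1
```

So over $\mathbb{F}_2$ the sign condition holds for every symmetric form, while alternating forms in dimension $1$ are forced to be zero.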

Solution 2:

For the sake of completeness, I would like to add a proof of this lemma that does not assume that $1 + 1 \neq 0$ in the ring of scalars $K$.

Lemma. $\pi_r$ is a linear transformation from $M^r(V)$ into $\Lambda^r(V)$. If $L$ is in $\Lambda^r(V)$ then $\pi_r L = r! L$.

Proof. It is clear that $\pi_r$ is a linear transformation from $M^r(V)$ into $M^r(V)$. Let $(\alpha_1,\dots,\alpha_r) \in V^r$ and suppose that $\alpha_i = \alpha_j$ for some $1 \leq i < j \leq r$. We will show that $\pi_r L(\alpha_1,\dots,\alpha_r) = 0$, proving that $\pi_r L$ is alternating.

We pair the permutations of $\{ 1,\dots,r \}$ in such a way that if $\sigma$ and $\tau$ are paired, then $$(\operatorname{sgn}{\sigma})L(\alpha_{\sigma 1},\dots,\alpha_{\sigma r}) + (\operatorname{sgn}{\tau})L(\alpha_{\tau 1},\dots,\alpha_{\tau r}) = 0.$$ The pairing is as follows: pair $\sigma$ with $\tau = (i,j) \circ \sigma$. This is well-defined because by this rule $\tau$ is paired with $$(i,j) \circ \tau = (i,j) \circ (i,j) \circ \sigma = \sigma,$$ and $\sigma \neq \tau$ because $\operatorname{sgn}{\sigma} = -\operatorname{sgn}{\tau}$ (the signs here are the integers $\pm 1$, so this argument is valid in every characteristic).
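The pairing just defined can be verified concretely: $\sigma \mapsto (i,j)\circ\sigma$ is a fixed-point-free involution that flips signs. A small check (illustrative names; permutations as $0$-based tuples, composed so that $(p \circ q)(k) = p(q(k))$ as in the proof):

```python
from itertools import permutations

def compose(p, q):
    # (p o q)(k) = p(q(k)), permutations given as tuples of 0-based indices
    return tuple(p[q[k]] for k in range(len(q)))

def sgn(p):
    # sign of a permutation, computed by counting inversions
    s = 1
    for a in range(len(p)):
        for b in range(a + 1, len(p)):
            if p[a] > p[b]:
                s = -s
    return s

r, i, j = 3, 0, 2  # sample values: the transposition (i, j) on {0, 1, 2}
t = tuple(j if k == i else i if k == j else k for k in range(r))

for p in permutations(range(r)):
    q = compose(t, p)          # the partner (i, j) o sigma
    assert q != p              # no permutation is paired with itself
    assert compose(t, q) == p  # the pairing is an involution
    assert sgn(q) == -sgn(p)   # partners have opposite signs
```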

If $\sigma^{-1} i < \sigma^{-1} j$, we get
$$ \begin{align} (\operatorname{sgn}{\tau})L(\alpha_{\tau 1},\dots,\alpha_{\tau r}) &= -(\operatorname{sgn}{\sigma})L(\alpha_{(i,j) \circ \sigma 1}, \dots, \alpha_{(i,j) i}, \dots, \alpha_{(i,j) j}, \dots, \alpha_{(i,j) \circ \sigma r})\\ &= -(\operatorname{sgn}{\sigma}) L(\alpha_{\sigma 1}, \dots, \alpha_{j}, \dots, \alpha_{i}, \dots, \alpha_{\sigma r}) \\ &= -(\operatorname{sgn}{\sigma}) L(\alpha_{\sigma 1}, \dots, \alpha_{i}, \dots, \alpha_{j}, \dots, \alpha_{\sigma r}) &&(\because \alpha_i = \alpha_j)\\ &= -(\operatorname{sgn}{\sigma})L(\alpha_{\sigma 1},\dots,\alpha_{\sigma r}) \end{align} $$

If $\sigma^{-1} j < \sigma^{-1} i$, we get
$$ \begin{align} (\operatorname{sgn}{\tau})L(\alpha_{\tau 1},\dots,\alpha_{\tau r}) &= -(\operatorname{sgn}{\sigma})L(\alpha_{(i,j) \circ \sigma 1}, \dots, \alpha_{(i,j) j}, \dots, \alpha_{(i,j) i}, \dots, \alpha_{(i,j) \circ \sigma r})\\ &= -(\operatorname{sgn}{\sigma}) L(\alpha_{\sigma 1}, \dots, \alpha_{i}, \dots, \alpha_{j}, \dots, \alpha_{\sigma r}) \\ &= -(\operatorname{sgn}{\sigma}) L(\alpha_{\sigma 1}, \dots, \alpha_{j}, \dots, \alpha_{i}, \dots, \alpha_{\sigma r}) &&(\because \alpha_i = \alpha_j)\\ &= -(\operatorname{sgn}{\sigma})L(\alpha_{\sigma 1},\dots,\alpha_{\sigma r}) \end{align} $$

In either case, $$(\operatorname{sgn}{\sigma})L(\alpha_{\sigma 1},\dots,\alpha_{\sigma r}) + (\operatorname{sgn}{\tau})L(\alpha_{\tau 1},\dots,\alpha_{\tau r}) = 0.$$

Therefore, $$ \pi_r L(\alpha_1,\dots,\alpha_r) = \sum_{\sigma} (\operatorname{sgn}{\sigma})L(\alpha_{\sigma 1},\dots,\alpha_{\sigma r}) = 0. $$ So, $\pi_r$ is a linear transformation from $M^r(V)$ into $\Lambda^r(V)$.
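The conclusion of the proof can also be observed numerically. The sketch below (illustrative; `sgn`, `pi`, and the sample bilinear form `L` are my own names, not from the answer) builds $\pi_r L$ as the signed sum over all permutations and checks that the result vanishes on repeated arguments even when $L$ itself is neither symmetric nor alternating:

```python
from itertools import permutations

def sgn(p):
    # sign of a permutation, computed by counting inversions
    s = 1
    for a in range(len(p)):
        for b in range(a + 1, len(p)):
            if p[a] > p[b]:
                s = -s
    return s

def pi(L, r):
    # (pi_r L)(v_1, ..., v_r) = sum over sigma of sgn(sigma) * L(v_sigma(1), ..., v_sigma(r))
    def piL(*vs):
        return sum(sgn(p) * L(*(vs[k] for k in p))
                   for p in permutations(range(r)))
    return piL

# A sample bilinear form on Z^2 that is neither symmetric nor alternating.
def L(u, v):
    return u[0]*v[0] + 2*u[0]*v[1] + 3*u[1]*v[0] + 4*u[1]*v[1]

piL = pi(L, 2)
u, v = (1, -2), (3, 5)
assert piL(u, u) == 0 and piL(v, v) == 0  # pi_2 L vanishes on repeated arguments
assert piL(u, v) == -piL(v, u)            # and is consequently antisymmetric
```

Integer vectors are used so the cancellation in the proof shows up as an exact zero.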

If $L$ is in $\Lambda^r(V)$, then $L(\alpha_{\sigma 1},\dots,\alpha_{\sigma r}) = (\operatorname{sgn} \sigma) L(\alpha_1,\dots,\alpha_r)$ for each $\sigma$, so each of the $r!$ terms of the sum equals $(\operatorname{sgn} \sigma)^2 L(\alpha_1,\dots,\alpha_r) = L(\alpha_1,\dots,\alpha_r)$; hence $\pi_r L = r! L$.
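The final claim can be checked on a standard alternating form: the $2 \times 2$ determinant. A minimal sketch (the names `det2` and `pi2` are illustrative):

```python
import math

def det2(u, v):
    # the 2x2 determinant, an alternating bilinear form on K^2
    return u[0]*v[1] - u[1]*v[0]

def pi2(L):
    # pi_2 L written out over the two permutations of {1, 2}
    return lambda u, v: L(u, v) - L(v, u)

u, v = (1, 2), (3, -1)
assert pi2(det2)(u, v) == math.factorial(2) * det2(u, v)  # pi_2 L = 2! L
```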