Action of a matrix on the exterior algebra

Solution 1:

Let me expand my comment into a complete answer.

Let $V$ be a finite-dimensional vector space, with dimension $n$ and basis $\left\{b_1,\ldots,b_n\right\}$. The $k^{\text{th}}$ exterior power $\Lambda^k(V)$ has dimension $n\choose k$, and the basis for $V$ induces a standard basis on $\Lambda^k(V)$ given by the collection of all (wedge) products of the form $b_{i_1}\wedge\cdots\wedge b_{i_k}$, where the indices are strictly increasing, i.e. $1\leq i_1<i_2<\cdots<i_k\leq n$.
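If it helps to see the bookkeeping concretely, here is a tiny Python sketch (purely illustrative; the choice $n=4$, $k=2$ is arbitrary) that lists these increasing index tuples and checks that there are $\binom{n}{k}$ of them:

```python
from itertools import combinations
from math import comb

n, k = 4, 2   # for example, dim V = 4 and the second exterior power

# Each strictly increasing k-tuple (i_1 < ... < i_k) labels one basis
# element b_{i_1} ^ ... ^ b_{i_k} of Lambda^k(V).
basis = list(combinations(range(1, n + 1), k))
print(basis)                      # [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
assert len(basis) == comb(n, k)   # dimension is n choose k
```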

Now here is the crucial point. A linear endomorphism at the "bottom" of the exterior algebra, $T:V\to V$, induces endomorphisms at all the higher degrees of the algebra, from $k=1$ up to $k=n$. In particular, $T$ induces a linear map $\Lambda^k(T):\Lambda^k(V)\to\Lambda^k(V)$. Which map? It suffices to specify how it acts on basis elements; it does so by sending $b_{i_1}\wedge\cdots\wedge b_{i_k}$ to $T(b_{i_1})\wedge\cdots\wedge T(b_{i_k})$.

You can write down the ${n\choose k}\times{n\choose k}$ matrix of $\Lambda^k(T)$ with respect to the induced basis on $\Lambda^k(V)$ by expanding the images $T(b_{i_1})\wedge\cdots\wedge T(b_{i_k})$. The coordinates of these images with respect to the basis elements $b_{i_1}\wedge\cdots\wedge b_{i_k}$ are precisely the familiar $k\times k$ minors of the matrix of $T$ with respect to the basis $\left\{b_1,\ldots,b_n\right\}$. In other words, the matrix of $\Lambda^k(T)$ encodes the minors of $T$. I won't bother to show this in detail because the notation is horrendous and unenlightening in general, but I will give a concrete example below -- it is instructive to do such an example once in full detail and then never do it again.
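If you prefer to let a computer do the horrendous bookkeeping, here is a short Python/NumPy sketch of that claim (my own illustration; the helper name `exterior_power_matrix` and the sample matrix are made up, and the induced basis is taken in plain lexicographic order):

```python
from itertools import combinations
import numpy as np

def exterior_power_matrix(A, k):
    """Matrix of Lambda^k(T) in the induced (lexicographically ordered)
    basis, where A is the matrix of T.  The (R, C) entry is the k x k
    minor of A built from rows R and columns C, both increasing tuples."""
    n = A.shape[0]
    idx = list(combinations(range(n), k))
    M = np.empty((len(idx), len(idx)))
    for r, rows in enumerate(idx):
        for c, cols in enumerate(idx):
            # coefficient of b_rows in T(b_{cols[0]}) ^ ... ^ T(b_{cols[k-1]})
            M[r, c] = np.linalg.det(A[np.ix_(rows, cols)])
    return M

A = np.array([[1., 2., 0.],
              [3., 1., 4.],
              [0., 5., 2.]])
print(exterior_power_matrix(A, 2))   # the 3 x 3 matrix of 2 x 2 minors of A
```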

First, though, note that this construction shows there is nothing special about the third exterior power. Indeed, the most familiar induced map occurs at the top degree, $\Lambda^n(V)$, which is ${n\choose n}=1$ dimensional. So the induced map $\Lambda^n(T)$ is just multiplication by a scalar; this scalar is precisely the determinant of $T$.
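In code (continuing the same kind of illustration, with an arbitrary matrix), the top-degree statement is just the Leibniz formula: expanding $T(e_1)\wedge\cdots\wedge T(e_n)$ and reordering each term back to $e_1\wedge\cdots\wedge e_n$ picks up the sign of a permutation, and the single resulting coefficient is $\det T$:

```python
from itertools import permutations
import numpy as np

A = np.array([[1., 2., 0.],
              [3., 1., 4.],
              [0., 5., 2.]])
n = A.shape[0]

def sign(p):
    # parity of a permutation, counted by inversions
    inv = sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))
    return -1 if inv % 2 else 1

# Expand T(e_1) ^ ... ^ T(e_n): reordering e_{p(1)} ^ ... ^ e_{p(n)}
# back to e_1 ^ ... ^ e_n contributes the sign of p, so the lone
# coefficient is the Leibniz formula for det(A).
coeff = sum(sign(p) * np.prod([A[p[j], j] for j in range(n)])
            for p in permutations(range(n)))
print(coeff, np.linalg.det(A))   # the same number twice (up to rounding)
```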

Now for a concrete example. Take $V=\mathbb{R}^3$ with basis $\left\{e_1, e_2, e_3\right\}$. Let $T$ be a linear transformation $\mathbb{R}^3\to\mathbb{R}^3$ whose matrix with respect to this basis is

$$\begin{pmatrix}a^{11}&a^{12}&a^{13}\\a^{21}&a^{22}&a^{23}\\a^{31}&a^{32}&a^{33}\end{pmatrix}$$

This means $T(e_1)=a^{11}e_1+a^{21}e_2+a^{31}e_3$, etc.

Let's look at the map $\Lambda^2(T)$ that $T$ induces on the second exterior power $\Lambda^2(\mathbb{R}^3)$, which is ${3\choose 2}=3$ dimensional as well. (As a result, the matrix that represents $\Lambda^2(T)$ will also be $3\times 3$. This is a coincidence; the matrices of $T$ and of $\Lambda^k(T)$ have the same size only when $k=1$ or $k=n-1$.) The induced basis elements of $\Lambda^2(\mathbb{R}^3)$ are $e_1\wedge e_2$, $e_1\wedge e_3$, and $e_2\wedge e_3$. In order to recover the familiar matrix of minors, I'll order this basis so that the first basis element is $e_2\wedge e_3$, the second is $e_1\wedge e_3$, and the third is $e_1\wedge e_2$.

Now let's compute the action of $\Lambda^2(T)$ on these basis elements in this order. The key to this algebra is the alternating nature of the wedge product. Here's the computation for the first basis element; we're effectively wedging the second and third columns of the matrix for $T$:

$$ \begin{eqnarray} \Lambda^2(T)(e_2\wedge e_3)&=&T(e_2)\wedge T(e_3)\\ &=&(a^{12}e_1+a^{22}e_2+a^{32}e_3)\wedge(a^{13}e_1+a^{23}e_2+a^{33}e_3)\\ &=&\left(\color{red}{a^{22}a^{33}-a^{32}a^{23}}\right)e_2\wedge e_3 +\left(\color{blue}{a^{12}a^{33}-a^{32}a^{13}}\right)e_1\wedge e_3 +\left(\color{orange}{a^{12}a^{23}-a^{22}a^{13}}\right)e_1\wedge e_2 \end{eqnarray} $$ From the second line to the third line, I'm using the fact that $e_i\wedge e_i=0$ and $e_i\wedge e_j=-e_j\wedge e_i$.

This calculation gives us the first column of the $3\times 3$ matrix representation of $\Lambda^2(T)$. You can verify at a glance that the red number, the $(1,1)$ entry of the matrix of $\Lambda^2(T)$, is precisely the $(1,1)$ minor of the matrix of $T$, i.e. the $2\times 2$ subdeterminant of $T$ we get when we delete the 1st row and 1st column:

$$\begin{pmatrix}a^{11}&a^{12}&a^{13}\\a^{21}&\color{red}{a^{22}}&\color{red}{a^{23}}\\a^{31}&\color{red}{a^{32}}&\color{red}{a^{33}}\end{pmatrix}$$

Similarly, the $(2,1)$ entry of the matrix of $\Lambda^2(T)$ (in blue) is precisely the $(2,1)$ minor, the subdeterminant when we delete the second row and first column:

$$\begin{pmatrix}a^{11}&\color{blue}{a^{12}}&\color{blue}{a^{13}}\\a^{21}&a^{22}&a^{23}\\a^{31}&\color{blue}{a^{32}}&\color{blue}{a^{33}}\end{pmatrix}$$

I trust you're now convinced that the $(3,1)$ entry (in orange) is precisely the $(3,1)$ minor:

$$\begin{pmatrix}a^{11}&\color{orange}{a^{12}}&\color{orange}{a^{13}}\\a^{21}&\color{orange}{a^{22}}&\color{orange}{a^{23}}\\a^{31}&a^{32}&a^{33}\end{pmatrix}$$
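If you would rather not redo the algebra by hand, here is a small SymPy check of the whole first column at once (my own verification sketch, not part of the original argument; the symbol `aij` stands for $a^{ij}$):

```python
import sympy as sp

# The symbol aij stands for the entry a^{ij} of the matrix of T.
a = sp.Matrix(3, 3, lambda i, j: sp.Symbol(f"a{i+1}{j+1}"))

# Columns 2 and 3 of the matrix are the coordinates of T(e_2) and T(e_3).
u, v = a[:, 1], a[:, 2]

def wedge_coeff(i, j):
    # coefficient of e_i ^ e_j (with i < j) in u ^ v, from the alternating rules
    return u[i - 1] * v[j - 1] - u[j - 1] * v[i - 1]

def minor(r):
    # determinant of the submatrix with row r and column 1 deleted (sympy is 0-indexed)
    return a.minor(r - 1, 0)

assert sp.expand(wedge_coeff(2, 3) - minor(1)) == 0   # red entry    = (1,1) minor
assert sp.expand(wedge_coeff(1, 3) - minor(2)) == 0   # blue entry   = (2,1) minor
assert sp.expand(wedge_coeff(1, 2) - minor(3)) == 0   # orange entry = (3,1) minor
print("first column of Lambda^2(T) matches the first column of minors of T")
```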

Let me end with one final remark. None of these constructions requires that $T$ be an endomorphism: a linear map $T: V\to W$ between two different vector spaces induces a map $\Lambda^k(T):\Lambda^k(V)\to\Lambda^k(W)$ at every degree in exactly the same way. Nor is the exterior algebra special; the analogous construction produces induced homomorphisms at every degree of the tensor algebra as well.
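To illustrate the non-endomorphism case (again just a sketch of mine, with an arbitrary $2\times 4$ matrix), the recipe with minors goes through unchanged: the $k\times k$ minors of an $m\times n$ matrix assemble into the $\binom{m}{k}\times\binom{n}{k}$ matrix of the induced map $\Lambda^k(V)\to\Lambda^k(W)$.

```python
from itertools import combinations
from math import comb
import numpy as np

# An arbitrary 2 x 4 matrix, representing some T : R^4 -> R^2.
A = np.array([[1., 0., 2., 1.],
              [3., 1., 0., 2.]])
m, n, k = A.shape[0], A.shape[1], 2

rows_idx = list(combinations(range(m), k))   # induced basis of Lambda^2(R^2)
cols_idx = list(combinations(range(n), k))   # induced basis of Lambda^2(R^4)

# Matrix of Lambda^2(T): still nothing but the 2 x 2 minors of A.
L2 = np.array([[np.linalg.det(A[np.ix_(r, c)]) for c in cols_idx]
               for r in rows_idx])
print(L2.shape)                    # (1, 6)
print((comb(m, k), comb(n, k)))    # (1, 6)
```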


References

Bourbaki, Algebra I: Chapters 1-3, Proposition 10, page 529.

Mac Lane and Birkhoff, Algebra, pages 563-564.