Properties of 4 by 4 Matrices

My impulse is to prove it by induction (on the number of factors in $W$). Clearly the result holds for any single-factor product. So, assuming that the result holds for $W$, show that $\det(AW+WA)\equiv \det(AW-WA)\mod 4$, and similarly for $B$, $C$, and $D$. Doing the algebra by hand, I showed that it is true at least for $A$. I assume the others would work out similarly; at the very least, the calculation for $D$ would be symmetric.


Added 2014-07-14: The answer below is a rewrite of the former text, which was essentially just an approach to a possible solution. It was based on the idea of interpreting the matrices $A_2$ and $B_2$ as transition matrices of an automaton. The language generated by this automaton could then be analysed to find a solution.

Contrary to my first thoughts, this approach is more cumbersome than proving the proposition by induction. So, with less insight into the problem, but also with less effort to provide a solution, the answer below is based upon induction.


This is an answer referring to Grumpy Parsnip's note, so we focus on the $2\times2$ matrices

\begin{align*} A_2=\left( \begin{array}{cc} x_1 & x_2\\ 0 & 1 \end{array} \right) \qquad B_2=\left( \begin{array}{cc} 1 & 0\\ x_3 & x_4 \end{array} \right) \end{align*}

$A_2$ and $B_2$ are the upper-left sub-matrices of the matrices $A$ and $B$ of the original question. We assume integer entries and consider finite products of the above matrices $$W=\prod_{j=1}^{n}X_j\qquad X_j\in\{A_2,B_2\}$$ We define $Re(W) := \prod_{j=1}^{n}X_{n+1-j}$, the reversed product, and show:

The following statement is valid:

\begin{align*} \det\left(W+Re(W)\right)+\det\left(W-Re(W)\right) \equiv 0\pmod 4\tag{1} \end{align*}

Observe that in (1) we add the two determinants; this corresponds to the factor $(-1)^{m-1}$ for $(2m\times2m)$ matrices in Grumpy Parsnip's note, which is $+1$ in our case $m=1$.
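Before the formal proof, here is a small brute-force check of $(1)$, a sketch in plain Python (the helper names `mul2`, `det2`, `prod`, the random entry range and the word-length bound are my own choices, not part of the argument): it forms all words of length up to $6$ over $\{A_2,B_2\}$ with random integer values for $x_1,\dots,x_4$ and verifies the congruence.

```python
from itertools import product as words
from random import randint

def mul2(P, Q):
    """Product of two 2x2 integer matrices given as nested tuples."""
    return ((P[0][0]*Q[0][0] + P[0][1]*Q[1][0], P[0][0]*Q[0][1] + P[0][1]*Q[1][1]),
            (P[1][0]*Q[0][0] + P[1][1]*Q[1][0], P[1][0]*Q[0][1] + P[1][1]*Q[1][1]))

def det2(P):
    """Determinant of a 2x2 matrix."""
    return P[0][0]*P[1][1] - P[0][1]*P[1][0]

def prod(factors):
    """Left-to-right product of a sequence of 2x2 matrices."""
    W = ((1, 0), (0, 1))
    for X in factors:
        W = mul2(W, X)
    return W

for _ in range(20):
    x1, x2, x3, x4 = (randint(-9, 9) for _ in range(4))
    A2 = ((x1, x2), (0, 1))
    B2 = ((1, 0), (x3, x4))
    for n in range(1, 7):
        for w in words((A2, B2), repeat=n):
            W, R = prod(w), prod(w[::-1])   # R plays the role of Re(W)
            S = tuple(tuple(W[i][j] + R[i][j] for j in range(2)) for i in range(2))
            D = tuple(tuple(W[i][j] - R[i][j] for j in range(2)) for i in range(2))
            assert (det2(S) + det2(D)) % 4 == 0   # statement (1)
```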


Note: To illustrate the connection between the sub-matrices $A_2,B_2$ and the matrices $A,B,C,D$ of the original question, see the picture of the corresponding automata below. The entry $x_{i,j}$ of a matrix is the label of the edge from node $i$ to node $j$ of the automaton. A matrix entry $1$, the neutral element with respect to multiplication, is denoted by $\varepsilon$, the neutral element for the concatenation of words of the formal language generated by walking along the edges of the automaton.

The automata clearly indicate the structural connection between the $(2\times 2)$ matrices $A_2,B_2$ and the $(4\times 4)$ matrices $A,B,C,D$, making Grumpy Parsnip's remark about generalising to $(2m\times 2m)$ matrices plausible, apart from the factor $(-1)^{m-1}$, which was presumably the result of a separate calculation.


[automata of transition matrices $A_2, B_2$ and $A,B,C,D$]


The following proof is done by induction on the number $n$ of factors of $W$.

Induction base step $(n=1)$

In the case $n=1$ we have to check two alternatives, $W=A_2$ and $W=B_2$.

Case $W=A_2$:

\begin{align*} \det&\left(A_2+Re(A_2)\right)+ \det\left(A_2-Re(A_2)\right)\\ &= \det(2\cdot A_2)+\det(0\cdot A_2)\\ &=2^2\det(A_2)\equiv 0\pmod 4 \end{align*}

Observe that $Re(W)=Re(A_2)=A_2$.

Since the case $W=B_2$ works in exactly the same way as $W=A_2$, the base step is proved.

Next, the

Induction hypothesis

We assume that the statement $(1)$ is valid for each $W=\prod_{j=1}^{n}X_j$ with $X_j\in\{A_2,B_2\}$.

So, let

\begin{align*} W=\left( \begin{array}{cc} w_{1,1} & w_{1,2}\\ w_{2,1} & w_{2,2} \end{array} \right) \qquad Re(W)=\left( \begin{array}{cc} w_{1,1}^\ast & w_{1,2}^\ast\\ w_{2,1}^\ast & w_{2,2}^\ast \end{array} \right) \end{align*}

Then we get

\begin{align*} \det&\left(W+Re(W)\right)+\det\left(W-Re(W)\right)\\ &=\det\left( \begin{array}{cc} w_{1,1}+w_{1,1}^\ast & w_{1,2}+w_{1,2}^\ast\\ w_{2,1}+w_{2,1}^\ast & w_{2,2}+w_{2,2}^\ast \end{array} \right) +\det\left( \begin{array}{cc} w_{1,1}-w_{1,1}^\ast & w_{1,2}-w_{1,2}^\ast\\ w_{2,1}-w_{2,1}^\ast & w_{2,2}-w_{2,2}^\ast \end{array} \right)\\ &=(w_{1,1}+w_{1,1}^\ast)(w_{2,2}+w_{2,2}^\ast)-(w_{2,1}+w_{2,1}^\ast)(w_{1,2}+w_{1,2}^\ast)\\ &+(w_{1,1}-w_{1,1}^\ast)(w_{2,2}-w_{2,2}^\ast)-(w_{2,1}-w_{2,1}^\ast)(w_{1,2}-w_{1,2}^\ast)\\ &=2w_{1,1} w_{2,2}+2w_{1,1}^\ast w_{2,2}^\ast-2w_{1,2}w_{2,1}-2w_{1,2}^\ast w_{2,1}^\ast\\ &\equiv 0\pmod 4\tag{2} \end{align*}
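The expansion in $(2)$ can also be confirmed symbolically; the following sketch assumes the `sympy` library is available and uses my own symbol names (an `s` suffix stands for the starred entries).

```python
from sympy import Matrix, symbols, expand

# Generic entries of W and Re(W); an 's' suffix stands for the starred entries.
w11, w12, w21, w22, w11s, w12s, w21s, w22s = symbols(
    'w11 w12 w21 w22 w11s w12s w21s w22s')

W = Matrix([[w11, w12], [w21, w22]])
R = Matrix([[w11s, w12s], [w21s, w22s]])   # plays the role of Re(W)

lhs = (W + R).det() + (W - R).det()
rhs = 2*w11*w22 + 2*w11s*w22s - 2*w12*w21 - 2*w12s*w21s

assert expand(lhs - rhs) == 0   # the expansion in (2)
```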

And now we show the

Induction step $(n \rightarrow n+1)$

We have to show that $(1)$ is valid for $WA_2$ and $WB_2$. It is sufficient to consider right multiplication by $A_2$ resp. $B_2$, since every product of $n+1$ factors can be written as such a product with $W$ consisting of $n$ factors; note that $Re(WA_2)=A_2\,Re(W)$ and $Re(WB_2)=B_2\,Re(W)$.

Case $WA_2$:

\begin{align*} WA_2&=\left( \begin{array}{cc} w_{1,1} & w_{1,2}\\ w_{2,1} & w_{2,2} \end{array} \right)\left( \begin{array}{cc} x_1 & x_2\\ 0 & 1 \end{array} \right)\\ &=\left( \begin{array}{cc} w_{1,1}x_1 & w_{1,1}x_2+w_{1,2}\\ w_{2,1}x_1 & w_{2,1}x_2+w_{2,2} \end{array} \right)\\ A_2Re(W)&=\left( \begin{array}{cc} x_1 & x_2\\ 0 & 1 \end{array} \right)\left( \begin{array}{cc} w_{1,1}^\ast & w_{1,2}^\ast\\ w_{2,1}^\ast & w_{2,2}^\ast \end{array} \right)\\ &=\left( \begin{array}{cc} w_{1,1}^\ast x_1+w_{2,1}^\ast x_2 & w_{1,2}^\ast x_1+w_{2,2}^\ast x_2\\ w_{2,1}^\ast & w_{2,2}^\ast \end{array} \right) \end{align*}

So, we get

\begin{align*} \det&\left(WA_2+A_2Re(W)\right)+\det\left(WA_2-A_2Re(W)\right)\\ &=\det\left( \begin{array}{cc} w_{1,1}x_1+w_{1,1}^\ast x_1+w_{2,1}^\ast x_2 & w_{1,1}x_2+w_{1,2}+w_{1,2}^\ast x_1+w_{2,2}^\ast x_2\\ w_{2,1}x_1+w_{2,1}^\ast & w_{2,1}x_2+w_{2,2}+w_{2,2}^\ast \end{array} \right)\\ &+\det\left( \begin{array}{cc} w_{1,1}x_1-w_{1,1}^\ast x_1-w_{2,1}^\ast x_2 & w_{1,1}x_2+w_{1,2}-w_{1,2}^\ast x_1-w_{2,2}^\ast x_2\\ w_{2,1}x_1-w_{2,1}^\ast & w_{2,1}x_2+w_{2,2}-w_{2,2}^\ast \end{array} \right)\\ &=(w_{1,1}x_1+w_{1,1}^\ast x_1+w_{2,1}^\ast x_2)(w_{2,1}x_2+w_{2,2}+w_{2,2}^\ast)\\ &-(w_{2,1}x_1+w_{2,1}^\ast)(w_{1,1}x_2+w_{1,2}+w_{1,2}^\ast x_1+w_{2,2}^\ast x_2)\\ &+(w_{1,1}x_1-w_{1,1}^\ast x_1-w_{2,1}^\ast x_2)(w_{2,1}x_2+w_{2,2}-w_{2,2}^\ast)\\ &-(w_{2,1}x_1-w_{2,1}^\ast)(w_{1,1}x_2+w_{1,2}-w_{1,2}^\ast x_1-w_{2,2}^\ast x_2)\\ &=(2w_{1,1} w_{2,2}+2w_{1,1}^\ast w_{2,2}^\ast-2w_{1,2}w_{2,1}-2w_{1,2}^\ast w_{2,1}^\ast)x_1\\ &\equiv 0\pmod 4 \end{align*}

according to the induction hypothesis $(2)$.
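As a sanity check of this computation, the same symbolic approach (again assuming `sympy`; the symbol names are my own) confirms that the determinant sum for $WA_2$ equals $x_1$ times the quantity in $(2)$.

```python
from sympy import Matrix, symbols, expand

w11, w12, w21, w22, w11s, w12s, w21s, w22s, x1, x2 = symbols(
    'w11 w12 w21 w22 w11s w12s w21s w22s x1 x2')

W  = Matrix([[w11, w12], [w21, w22]])
R  = Matrix([[w11s, w12s], [w21s, w22s]])   # plays the role of Re(W)
A2 = Matrix([[x1, x2], [0, 1]])

lhs = (W*A2 + A2*R).det() + (W*A2 - A2*R).det()
hyp = 2*w11*w22 + 2*w11s*w22s - 2*w12*w21 - 2*w12s*w21s   # the quantity in (2)

assert expand(lhs - x1*hyp) == 0
```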

Case $WB_2$:

\begin{align*} WB_2&=\left( \begin{array}{cc} w_{1,1} & w_{1,2}\\ w_{2,1} & w_{2,2} \end{array} \right)\left( \begin{array}{cc} 1 & 0\\ x_3 & x_4 \end{array} \right)\\ &=\left( \begin{array}{cc} w_{1,1}+w_{1,2}x_3 & w_{1,2}x_4\\ w_{2,1}+w_{2,2}x_3 & w_{2,2}x_4 \end{array} \right)\\ B_2Re(W)&=\left( \begin{array}{cc} 1 & 0\\ x_3 & x_4 \end{array} \right)\left( \begin{array}{cc} w_{1,1}^\ast & w_{1,2}^\ast\\ w_{2,1}^\ast & w_{2,2}^\ast \end{array} \right)\\ &=\left( \begin{array}{cc} w_{1,1}^\ast & w_{1,2}^\ast\\ w_{1,1}^\ast x_3+w_{2,1}^\ast x_4 & w_{1,2}^\ast x_3+w_{2,2}^\ast x_4 \end{array} \right) \end{align*}

So, we get

\begin{align*} \det&\left(WB_2+B_2Re(W)\right)+\det\left(WB_2-B_2Re(W)\right)\\ &=\det\left( \begin{array}{cc} w_{1,1}+w_{1,2}x_3+w_{1,1}^\ast & w_{1,2}x_4+w_{1,2}^\ast\\ w_{2,1}+w_{2,2}x_3+w_{1,1}^\ast x_3+w_{2,1}^\ast x_4 & w_{2,2}x_4+w_{1,2}^\ast x_3+w_{2,2}^\ast x_4 \end{array} \right)\\ &+\det\left( \begin{array}{cc} w_{1,1}+w_{1,2}x_3-w_{1,1}^\ast & w_{1,2}x_4-w_{1,2}^\ast\\ w_{2,1}+w_{2,2}x_3-w_{1,1}^\ast x_3-w_{2,1}^\ast x_4 & w_{2,2}x_4-w_{1,2}^\ast x_3-w_{2,2}^\ast x_4 \end{array} \right)\\ &=(w_{1,1}+w_{1,2}x_3+w_{1,1}^\ast)(w_{2,2}x_4+w_{1,2}^\ast x_3+w_{2,2}^\ast x_4)\\ &-(w_{2,1}+w_{2,2}x_3+w_{1,1}^\ast x_3+w_{2,1}^\ast x_4)(w_{1,2}x_4+w_{1,2}^\ast)\\ &+(w_{1,1}+w_{1,2}x_3-w_{1,1}^\ast)(w_{2,2}x_4-w_{1,2}^\ast x_3-w_{2,2}^\ast x_4)\\ &-(w_{2,1}+w_{2,2}x_3-w_{1,1}^\ast x_3-w_{2,1}^\ast x_4)(w_{1,2}x_4-w_{1,2}^\ast)\\ &=(2w_{1,1} w_{2,2}+2w_{1,1}^\ast w_{2,2}^\ast-2w_{1,2}w_{2,1}-2w_{1,2}^\ast w_{2,1}^\ast)x_4\\ &\equiv 0\pmod 4 \end{align*}

according to the induction hypothesis $(2)$, which completes the proof by induction.
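The analogous symbolic check for the $WB_2$ case (same assumptions as above) confirms the factor $x_4$.

```python
from sympy import Matrix, symbols, expand

w11, w12, w21, w22, w11s, w12s, w21s, w22s, x3, x4 = symbols(
    'w11 w12 w21 w22 w11s w12s w21s w22s x3 x4')

W  = Matrix([[w11, w12], [w21, w22]])
R  = Matrix([[w11s, w12s], [w21s, w22s]])   # plays the role of Re(W)
B2 = Matrix([[1, 0], [x3, x4]])

lhs = (W*B2 + B2*R).det() + (W*B2 - B2*R).det()
hyp = 2*w11*w22 + 2*w11s*w22s - 2*w12*w21 - 2*w12s*w21s   # the quantity in (2)

assert expand(lhs - x4*hyp) == 0
```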


Note: A proof by induction for the matrices of the original question could be done in the same way (with considerably more effort :-)).