Determinant of a block upper triangular matrix [duplicate]

How does one prove the following equality for a block matrix?

$$\det\begin{bmatrix}A&C\\ 0&B\end{bmatrix}=\det(A)\det(B)$$

I tried a proof by induction but got stuck. Is there a simpler method? Thanks for any help.


Solution 1:

Other answers suggest quite elementary proofs, and I upvoted one of them. However, I want to propose a technically easier, but less elementary proof.

If you're familiar with it, you can use QR decomposition. Let

$$A = Q_A R_A, \quad B = Q_B R_B$$

be QR decompositions of $A$ and $B$. Then

\begin{align*} \det \begin{bmatrix} A & C \\ 0 & B \end{bmatrix} &= \det \begin{bmatrix} Q_A R_A & Q_A Q_A^T C \\ 0 & Q_B R_B \end{bmatrix} = \det \left( \begin{bmatrix} Q_A \\ & Q_B \end{bmatrix} \begin{bmatrix} R_A & Q_A^T C \\ 0 & R_B \end{bmatrix} \right) \\ &= \det \begin{bmatrix} Q_A \\ & Q_B \end{bmatrix} \det \begin{bmatrix} R_A & Q_A^T C \\ 0 & R_B \end{bmatrix} = \det Q \det R, \end{align*}

where

$$Q := \begin{bmatrix} Q_A \\ & Q_B \end{bmatrix}, \quad R := \begin{bmatrix} R_A & Q_A^T C \\ 0 & R_B \end{bmatrix}.$$

Notice that $R$ is upper triangular, so its determinant is the product of its diagonal elements; in particular it does not depend on the off-diagonal block $Q_A^T C$, so

$$\det R = \det \begin{bmatrix} R_A & 0 \\ 0 & R_B \end{bmatrix}.$$

Combining what we have,

\begin{align*} \det \begin{bmatrix} A & C \\ 0 & B \end{bmatrix} &= \det Q \det R = \det \begin{bmatrix} Q_A \\ & Q_B \end{bmatrix} \det \begin{bmatrix} R_A \\ & R_B \end{bmatrix} \\ &= \det Q_A \det Q_B \det R_A \det R_B = \det (Q_AR_A) \det (Q_B R_B) \\ &= \det A \det B. \end{align*}

Notice that this is far from an elementary proof: it uses the QR decomposition, the formula for the determinant of a block diagonal matrix, the formula for the determinant of a triangular matrix, and block multiplication of matrices.
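If you want a quick numerical sanity check of the factorization used above, here is a short NumPy sketch (the block sizes $k$ and $m$ are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
k, m = 3, 4  # sizes of A (k x k) and B (m x m); arbitrary choices

A = rng.standard_normal((k, k))
B = rng.standard_normal((m, m))
C = rng.standard_normal((k, m))

# Assemble the block upper triangular matrix M = [[A, C], [0, B]].
M = np.block([[A, C], [np.zeros((m, k)), B]])

# QR decompositions of the diagonal blocks, as in the proof.
Q_A, R_A = np.linalg.qr(A)
Q_B, R_B = np.linalg.qr(B)

# Build the block orthogonal factor Q and the triangular factor R.
Q = np.block([[Q_A, np.zeros((k, m))], [np.zeros((m, k)), Q_B]])
R = np.block([[R_A, Q_A.T @ C], [np.zeros((m, k)), R_B]])

# M factors as Q R, and det M = det A * det B.
assert np.allclose(M, Q @ R)
assert np.isclose(np.linalg.det(M), np.linalg.det(A) * np.linalg.det(B))
```

Of course this only checks one random instance; the proof above is what establishes the identity in general.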

Solution 2:

Since I have criticised some other answers, let me be fair and give my take on this. The result to use is just the Leibniz formula defining the determinant (for once, use the definition!): $$ \def\sg{\operatorname{sgn}} \det(M) = \sum_{\sigma \in S_n} \sg(\sigma) \prod_{i = 1}^n M_{i,\sigma(i)}. $$ Now if $M$ is the matrix of the question, and its block $A$ has size $k\times k$, then by the block form $M_{i,j}=0$ whenever $j\leq k<i$ (lower left hand block). So we can drop all permutations from the sum for which $\sigma(i)\leq k$ for some $i\in\{k+1,\ldots,n\}$. But that means that for $\sigma$ to survive, the $n-k$ values $\sigma(i)$ for $i\in T=\{k+1,\ldots,n\}$ must all be in the $(n-k)$-element set $T$, and they must of course also be distinct: they form a permutation of the elements of$~T$. So $\sigma$ permutes the elements of $T$ among each other, and then necessarily also the elements of the complement $\{1,\ldots,k\}$ of$~T$. These permutations of subsets are independent, so the surviving permutations are naturally in bijection with the Cartesian product of the permutations of $\{1,\ldots,k\}$ and those of$~T$. Also the sign of the combination of two such permutations is the product of the signs of the individual permutations. All in all one gets $$ \begin{align} \det(M)&=\sum_{\pi \in S_k} \sum_{\tau\in S_T} \sg(\pi)\sg(\tau) \left(\prod_{i = 1}^k M_{i,\pi(i)}\right)\prod_{i \in T}M_{i,\tau(i)} \\&=\left(\sum_{\pi \in S_k}\sg(\pi)\prod_{i = 1}^k M_{i,\pi(i)}\right)\left(\sum_{\tau\in S_T}\sg(\tau)\prod_{i \in T}M_{i,\tau(i)}\right) \\&=\det (A)\det(B). \end{align} $$
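The Leibniz-formula argument can also be checked by brute force: the sketch below computes determinants directly from the permutation sum (exponential time, so only for tiny matrices) and confirms that the sum over surviving permutations factors as claimed.

```python
import itertools
import math
import numpy as np

def leibniz_det(M):
    """Determinant straight from the Leibniz formula (exponential time)."""
    n = M.shape[0]
    total = 0.0
    for sigma in itertools.permutations(range(n)):
        # sgn(sigma) via the number of inversions of the permutation.
        inversions = sum(1 for i in range(n) for j in range(i + 1, n)
                         if sigma[i] > sigma[j])
        total += (-1) ** inversions * math.prod(M[i, sigma[i]] for i in range(n))
    return total

rng = np.random.default_rng(1)
k, m = 2, 3  # block sizes, kept small since the sum has n! terms
A = rng.standard_normal((k, k))
B = rng.standard_normal((m, m))
C = rng.standard_normal((k, m))
M = np.block([[A, C], [np.zeros((m, k)), B]])

# The surviving permutations factor, so det M = det A * det B.
assert np.isclose(leibniz_det(M), leibniz_det(A) * leibniz_det(B))
```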

Solution 3:

Hint: Use Laplace (cofactor) expansion along the first column, together with induction on the size of $A$: the zero block below $A$ means only the first $k$ rows contribute to the expansion.
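To illustrate the hint, here is a sketch of a recursive Laplace expansion along the first column (names like `laplace_det` are my own, for illustration only); note how the zero block makes the terms for rows below $A$ vanish, which is exactly what drives the induction:

```python
import numpy as np

def laplace_det(M):
    """Determinant by Laplace (cofactor) expansion along the first column."""
    n = M.shape[0]
    if n == 1:
        return M[0, 0]
    total = 0.0
    for i in range(n):
        if M[i, 0] == 0:  # entries in the lower-left zero block contribute nothing
            continue
        # Minor: delete row i and column 0, then recurse.
        minor = np.delete(np.delete(M, i, axis=0), 0, axis=1)
        total += (-1) ** i * M[i, 0] * laplace_det(minor)
    return total

rng = np.random.default_rng(2)
k, m = 2, 2
A = rng.standard_normal((k, k))
B = rng.standard_normal((m, m))
C = rng.standard_normal((k, m))
M = np.block([[A, C], [np.zeros((m, k)), B]])

assert np.isclose(laplace_det(M), laplace_det(A) * laplace_det(B))
```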