Determinant of a block lower triangular matrix

Solution 1:

If $A$ is singular, its rows are linearly dependent; hence the rows of the entire matrix are linearly dependent, and both sides of the equation vanish.

If $A$ is not singular, we have

$$\pmatrix{I&0\\-CA^{-1}&I}\pmatrix{A&0\\C&D}=\pmatrix{A&0\\0&D}\;.$$

The determinants of the two new matrices are easier to derive from the Laplace expansion than that of the entire matrix: they are $1$ and $\det A \det D$, respectively. Since the determinant is multiplicative, the result follows.

Another way to express this is that if $A$ is not singular you can get rid of $C$ by Gaussian elimination.
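As a quick numerical sanity check of this elimination argument (a sketch assuming NumPy is available; block sizes and entries are chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(0)
k, n = 3, 4

# Random blocks: A is k x k (invertible with probability 1), C is n x k, D is n x n.
A = rng.standard_normal((k, k))
C = rng.standard_normal((n, k))
D = rng.standard_normal((n, n))

# Assemble the block lower triangular matrix M = [[A, 0], [C, D]].
M = np.block([[A, np.zeros((k, n))], [C, D]])

# Left-multiplying by E = [[I, 0], [-C A^{-1}, I]] eliminates the C block.
E = np.block([[np.eye(k), np.zeros((k, n))],
              [-C @ np.linalg.inv(A), np.eye(n)]])
block_diag = E @ M
assert np.allclose(block_diag[k:, :k], 0)  # C has been eliminated

# det(E) = 1, so det(M) = det(A) det(D).
assert np.isclose(np.linalg.det(M), np.linalg.det(A) * np.linalg.det(D))
```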

Solution 2:

As @user153012 is asking for a proof in full detail, here is a brute-force approach using an explicit expression of a determinant of an $n$ by $n$ matrix, say $A = (a[i,j])$, $$\det A = \sum_{\sigma\in S_n}\operatorname{sgn}\sigma \prod_i a[{i,\sigma(i)}],$$ where $S_n$ is the symmetric group on $[n] = \{1,\dots, n\}$ and $\operatorname{sgn}\sigma$ denotes the signature of $\sigma$.

In matrix $$B = \left(\begin{array}{cc} A&0\\ C&D \end{array}\right),$$ we have $$b[i,j] = \begin{cases}a[i,j] & \text{if }i \le k, j \le k;\\ d[i-k, j-k] & \text{if }i > k, j > k; \\ 0 & \text{if }i \le k, j > k; \\ c[i-k,j] & \text{otherwise}.\end{cases}$$ Observe in $$\det B = \sum_{\sigma\in S_{n+k}}\operatorname{sgn}\sigma\prod_i b[i, \sigma(i)],$$ if $\sigma(i) = j$ such that $i\le k$ and $j > k$, then the corresponding summand $\prod_i b[i,\sigma(i)]$ is $0$. Any permutation $\sigma\in S_{n+k}$ for which no such $i$ and $j$ exist can be uniquely "decomposed" into two permutations, $\pi$ and $\tau$, where $\pi\in S_k$ and $\tau\in S_n$ such that $\sigma(i) = \pi(i)$ for $i \le k$ and $\sigma(k+i) = k+\tau(i)$ for $i \le n$. Moreover, we have $\operatorname{sgn}\sigma = \operatorname{sgn}\pi\operatorname{sgn}\tau$. Denote the collection of such permutations by $S_n'$. Therefore, we can write $$\begin{eqnarray}\det B &=& \sum_{\sigma\in S_n'}\operatorname{sgn}\sigma\prod_{i=1}^k b[i,\sigma(i)]\prod_{i=k+1}^{k+n} b[i,\sigma(i)] \\ &=& \sum_{\sigma\in S_n'}\operatorname{sgn}\sigma\prod_{i=1}^k a[i,\sigma(i)]\prod_{i=1}^nd[i,\sigma(i+k)-k] \\ & = & \sum_{\pi\in S_k,\tau\in S_n}\operatorname{sgn}\pi\operatorname{sgn}\tau\prod_{i=1}^k a[i,\pi(i)]\prod_{i=1}^nd[i,\tau(i)] \\ &=& \sum_{\pi\in S_k}\operatorname{sgn}\pi\prod_{i=1}^k a[i,\pi(i)]\sum_{\tau\in S_{n}}\operatorname{sgn}\tau\prod_{i=1}^nd[i,\tau(i)] \\ & = & \det A\det D.\end{eqnarray}$$ QED.

Update As @Marc van Leeuwen mentioned in the comment, a similar formula holds for permanents. The proof is basically the same as the proof for the determinant, except that one drops all the signatures of the permutations.
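The permanent version can be verified with the same brute-force sum, simply omitting the signature (a self-contained sketch; blocks are arbitrary small integer matrices chosen for illustration):

```python
import math
from itertools import permutations

def perm_leibniz(m):
    # Permanent: the Leibniz sum over all permutations, without sgn.
    idx = range(len(m))
    return sum(math.prod(m[i][p[i]] for i in idx) for p in permutations(idx))

A = [[1, 2], [3, 4]]          # top-left block
D = [[5, 6], [7, 8]]          # bottom-right block
# Block lower triangular B: zero top-right block kills all cross terms.
B = [[1, 2, 0, 0],
     [3, 4, 0, 0],
     [9, 1, 5, 6],
     [2, 3, 7, 8]]

assert perm_leibniz(B) == perm_leibniz(A) * perm_leibniz(D)  # 820 == 10 * 82
```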

Solution 3:

This is a fundamental result about determinants, and like most such results it holds for matrices with entries in any commutative (unitary) ring. It is therefore good to have a proof that does not rely on the coefficients being in a field; I will use the Leibniz formula as the definition of the determinant rather than a characterisation as alternating $n$-linear form. (And my apologies to those who find that distasteful; for some purposes using the definition is really best.)

Writing for a permutation $\sigma$ and a matrix $M$ the abbreviation $\def\sg{\operatorname{sg}}M[\sigma]=\sg(\sigma)\prod_iM_{i,\sigma(i)}$, the Leibniz formula says, for any $m\times m$ matrix $M$, $$ \det(M)=\sum_{\sigma\in S_m}M[\sigma]\;. $$ The result is based on the following simple fact about symmetric groups:

The subgroup of $S_{k+n}$ of permutations permuting the first $k$ elements among each other is canonically isomorphic to $S_k\times S_n$, and if $\sigma\in S_{k+n}$ corresponds to $(\pi,\rho)\in S_k\times S_n$ then $\sg(\sigma)=\sg(\pi)\sg(\rho)$.

In fact this is basically just saying that if the first $k$ elements are permuted among each other, then so are the remaining $n$ elements, and the sign of the whole permutation is the product of the signs of its restrictions to those two subsets.

Now if $M=\bigl(\begin{smallmatrix}A&0\\C&D\end{smallmatrix}\bigr)$, note that $M[\sigma]=0$ unless $\sigma$ permutes the first $k$ indices among each other. From this it follows that $$ \det(M)=\sum_{\sigma\in S_{k+n}}M[\sigma] =\sum_{(\pi,\rho)\in S_k\times S_n}A[\pi]D[\rho] =\left(\sum_{\pi\in S_k}A[\pi]\right)\left(\sum_{\rho\in S_n}D[\rho]\right) =\det A\det D. $$


Alternatively, if one is willing to assume the property $\det(MN)=\det(M)\det(N)$ (not really easier than the one to be proved here, but maybe better known), and if one considers the special cases where either $A=I$ or $D=I$ to be clear (because one can obtain the result in these cases by repeated Laplace expansion by rows respectively by columns), then one can employ any decomposition of the form $$ \pmatrix{A&0\\C&D} = \pmatrix{A&0\\C_0&I} \pmatrix{I&0\\C_1&D} \qquad\text{with $C=C_0+C_1$} $$ (for instance with one of $C_0,C_1$ equal to zero) to obtain the desired result.
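The decomposition in the last display can be checked numerically (a sketch assuming NumPy; here the split $C = C_0 + C_1$ is taken with $C_1 = 0$, and the blocks are random):

```python
import numpy as np

rng = np.random.default_rng(1)
k, n = 2, 3
A = rng.standard_normal((k, k))
C = rng.standard_normal((n, k))
D = rng.standard_normal((n, n))

# Any split C = C0 + C1 works; take C0 = C, C1 = 0.
C0, C1 = C, np.zeros((n, k))

L = np.block([[A, np.zeros((k, n))], [C0, np.eye(n)]])   # the case D = I
R = np.block([[np.eye(k), np.zeros((k, n))], [C1, D]])   # the case A = I
M = np.block([[A, np.zeros((k, n))], [C, D]])

# The factorization holds: M = L R.
assert np.allclose(L @ R, M)

# det(L) = det(A) and det(R) = det(D) (the special cases handled by
# Laplace expansion), so multiplicativity gives det(M) = det(A) det(D).
assert np.isclose(np.linalg.det(M), np.linalg.det(A) * np.linalg.det(D))
```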