Upper Triangular Block Matrix Determinant by Induction
We want to prove that $$\det\begin{pmatrix}A & C \\ 0 & B \end{pmatrix}= \det(A)\det(B),$$ where $A \in M_{m\times m}(R)$, $C \in M_{m\times n}(R)$, $B \in M_{n\times n}(R)$, and $R$ is a commutative ring with $1$.
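Before turning to the proof, a quick numerical sanity check of the identity may be reassuring. This is a minimal sketch, assuming NumPy and specializing to $R=\mathbb{R}$; the block sizes $m=3$, $n=2$ and the random integer entries are illustrative choices only.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 2  # illustrative block sizes

A = rng.integers(-5, 6, size=(m, m)).astype(float)
C = rng.integers(-5, 6, size=(m, n)).astype(float)
B = rng.integers(-5, 6, size=(n, n)).astype(float)

# Assemble L = [[A, C], [0, B]] as one (m+n) x (m+n) matrix.
L = np.block([[A, C], [np.zeros((n, m)), B]])

# det(L) should agree with det(A) * det(B) up to floating-point error.
assert np.isclose(np.linalg.det(L), np.linalg.det(A) * np.linalg.det(B))
```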
$\textbf{SOLUTION}:$
Let $L = \begin{pmatrix}A & C \\ 0 & B \end{pmatrix}$; then it is clear that $L \in M_{(m+n)\times(m+n)}(R)$, and the matrix $L$ can be written out in full as follows: $$L = \begin{pmatrix}\begin{bmatrix} a_{11} & a_{12} & \ldots & a_{1m} \\ a_{21} & a_{22} & \ldots & a_{2m} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \ldots & a_{mm} \end{bmatrix} & \begin{bmatrix} c_{11} & c_{12} & \ldots & c_{1n} \\ c_{21} & c_{22} & \ldots & c_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ c_{m1} & c_{m2} & \ldots & c_{mn} \end{bmatrix} \\ \begin{bmatrix} 0 & 0 & \ldots & 0 \\ 0 & 0 & \ldots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \ldots & 0 \end{bmatrix} & \begin{bmatrix} b_{11} & b_{12} & \ldots & b_{1n} \\ b_{21} & b_{22} & \ldots & b_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ b_{n1} & b_{n2} & \ldots & b_{nn} \end{bmatrix} \end{pmatrix}$$ Now, for each fixed $i \in \{1,2,\ldots,m,m+1,\ldots,m+n\}$ ($i$ is the row along which we expand), we have:
$$\det(L) = \sum\limits_{j=1}^m (-1)^{i+j}\alpha_{ij}\det(L_{ij}) + \sum\limits_{j=m+1}^{m+n} (-1)^{i+j}\alpha_{ij}\det(L_{ij})$$ where:
$$\alpha_{ij} = \begin{cases} a_{ij} & i,j = 1,\ldots,m \\ c_{i,\,j-m} & i = 1,\ldots,m,\ j = m+1,\ldots,m+n \\ 0 & i = m+1,\ldots,m+n,\ j = 1,\ldots,m \\ b_{i-m,\,j-m} & i = m+1,\ldots,m+n,\ j = m+1,\ldots,m+n \end{cases}$$ and $$L_{ij} = \begin{cases} A_{ij} & i,j = 1,\ldots,m \\ C_{i,\,j-m} & i = 1,\ldots,m,\ j = m+1,\ldots,m+n \\ 0 & i = m+1,\ldots,m+n,\ j = 1,\ldots,m \\ B_{i-m,\,j-m} & i = m+1,\ldots,m+n,\ j = m+1,\ldots,m+n \end{cases}$$
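To make the expansion formula concrete, here is a short self-contained sketch (my own illustration, not part of the argument) that computes a determinant by Laplace expansion along a row, using exact integer arithmetic:

```python
def minor(M, i, j):
    """Submatrix of M with row i and column j removed (0-indexed)."""
    return [row[:j] + row[j+1:] for k, row in enumerate(M) if k != i]

def det_laplace(M, i=0):
    """det(M) by Laplace expansion along row i; recursive calls expand
    along row 0. Exact for integer entries."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** (i + j) * M[i][j] * det_laplace(minor(M, i, j))
               for j in range(len(M)))

# Example with m = n = 2: det(A) = -2 and det(B) = 5,
# so det(L) should come out as -10.
L = [[1, 2, 5, 6],
     [3, 4, 7, 8],
     [0, 0, 2, 1],
     [0, 0, 3, 4]]
assert det_laplace(L) == -10
```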
$\textbf{FIRST CASE}$
Choose $i > m$, so that $i \in \{m+1,\ldots, m+n\}$. If $n = 0$, the problem is trivial, since $\det(B) = 1$ and therefore $$\det(L) = \det(A)\det(B) = \det(A)\cdot 1 = \det(A).$$ Now let us assume that $$\det(L) = \det(A)\det(B)$$ holds for $n = k$, and choose $i \in \{m+1,\ldots, m+k\}$. Then for every $j < m+1$ we get $$(-1)^{i+j}\alpha_{ij}\det(L_{ij}) = 0,$$ hence $$\det(L) = \sum\limits_{j=m+1}^{m+n} (-1)^{i+j}\alpha_{ij}\det(L_{ij}).$$ But $i \in \{m+1,m+2,\ldots,m+k\}$ and $j \in \{m+1,m+2,\ldots,m+k\}$, therefore $$\det(L) = \sum\limits_{j=m+1}^{m+k} (-1)^{i+j}b_{i-m,\,j-m}\det(L_{ij}).$$ Notice that each minor $L_{ij}$ here has the form $\begin{pmatrix}A & C' \\ 0 & B' \end{pmatrix}$, and $B'$ has size smaller than $B$; therefore, by our induction hypothesis, we have $$\det\begin{pmatrix}A & C' \\ 0 & B' \end{pmatrix} = \det(A)\det(B').$$
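For concreteness, here is a worked instance of this case (my own illustration, with $m=2$ and $n=1$, so $B=(b)$): expanding along the last row $i = m+n = 3$, the first $m$ terms vanish and $$\det\begin{pmatrix}a_{11}&a_{12}&c_{1}\\a_{21}&a_{22}&c_{2}\\0&0&b\end{pmatrix} = (-1)^{3+3}\,b\,\det\begin{pmatrix}a_{11}&a_{12}\\a_{21}&a_{22}\end{pmatrix} = \det(A)\det(B).$$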
$ \textbf{SECOND CASE} $
On the other hand, choose $i \leq m$. If $m = 0$, the problem is trivial, since $\det(A) = 1$ and therefore $$\det(L) = \det(A)\det(B) = 1\cdot\det(B) = \det(B).$$ Now let us assume that $$\det(L) = \det(A)\det(B)$$ holds for $m = k$, and choose $i \in \{1,\ldots,k\}$. Then for all $j > k$ we have $$(-1)^{i+j}\alpha_{ij}\det(L_{ij}) = 0,$$ therefore $$\det(L) = \sum\limits_{j=1}^{k} (-1)^{i+j}\alpha_{ij}\det(L_{ij}).$$ But $i \in \{1,2,\ldots,k\}$ and $j \in \{1,2,\ldots,k\}$, therefore $$\det(L) = \sum\limits_{j=1}^k (-1)^{i+j}a_{ij}\det(A_{ij}).$$
Now, each minor $A_{ij}$ has the form $\begin{pmatrix}A' & C' \\ 0 & B \end{pmatrix}$, and $A'$ has size smaller than $A$; so by the induction hypothesis we have $$\det\begin{pmatrix}A' & C' \\ 0 & B \end{pmatrix} = \det(A')\det(B).$$
I want to know if this is the right proof.
Solution 1:
Set $$ X=\begin{bmatrix}A & C\\0 & B\end{bmatrix} $$ If $A$ is not invertible, then its columns are linearly dependent; hence the first $m$ columns of $X$ are linearly dependent, and $X$ is not invertible either. In this case the relation $\det X=\det A\det B$ is true, both sides being zero. So we can assume $A$ is invertible; if Gaussian elimination on $A$ requires row switches, then collect all row switches in a permutation matrix $P$, so that elimination on $PA$ can be done without row switches and $PA=LU$, where $L$ is lower triangular and $U$ is upper unitriangular. Consider the matrix $$P'=\begin{bmatrix}P & 0 \\ 0 & I_n\end{bmatrix}$$ so that $$ P'X= \begin{bmatrix}P & 0 \\ 0 & I_n\end{bmatrix} \begin{bmatrix}A & C\\0 & B\end{bmatrix}= \begin{bmatrix}PA&PC\\0&B\end{bmatrix}= \begin{bmatrix}LU&PC\\0&B\end{bmatrix}= \begin{bmatrix}L & 0 \\ 0 & I_n\end{bmatrix} \begin{bmatrix}U & L^{-1}PC\\0 & B\end{bmatrix} $$ Now, as $$ \begin{bmatrix}L & 0 \\ 0 & I_n\end{bmatrix} $$ is lower triangular, its determinant is clearly $\det L$. Since $U$ is upper unitriangular, $m$ successive Laplace expansions along the first column give $$ \det\begin{bmatrix}U & L^{-1}PC\\0 & B\end{bmatrix}=\det B $$ Therefore $$ \det P'X=\det P'\det X=\det L\det B $$ On the other hand, $P'$ is a permutation matrix generated by as many row swaps as $P$, so $\det P'=\det P$. Also $\det P\det A=\det(PA)=\det L\det U=\det L$.
Hence $\det P\det X=\det P\det A\det B$, and cancelling the unit $\det P=\pm1$ gives $\det X=\det A\det B$.
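The factorization used above can be checked numerically. The sketch below (my addition, assuming SciPy is available) uses `scipy.linalg.lu`, which normalizes the factors the other way around ($L$ unit lower triangular and $U$ merely upper triangular); this does not affect the bookkeeping.

```python
import numpy as np
from scipy.linalg import lu

rng = np.random.default_rng(1)
m, n = 3, 2
A = rng.standard_normal((m, m))  # generically invertible
C = rng.standard_normal((m, n))
B = rng.standard_normal((n, n))
X = np.block([[A, C], [np.zeros((n, m)), B]])

# SciPy factors A = P @ L @ U, i.e. P.T @ A = L @ U with P.T a permutation.
P, L, U = lu(A)
Pp = np.block([[P.T, np.zeros((m, n))], [np.zeros((n, m)), np.eye(n)]])

# P'X = [[L, 0], [0, I]] @ [[U, L^{-1} (P.T C)], [0, B]]
left = np.block([[L, np.zeros((m, n))], [np.zeros((n, m)), np.eye(n)]])
right = np.block([[U, np.linalg.solve(L, P.T @ C)], [np.zeros((n, m)), B]])
assert np.allclose(Pp @ X, left @ right)
assert np.isclose(np.linalg.det(X), np.linalg.det(A) * np.linalg.det(B))
```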
We can also do it by induction. For $i=1,2,\dots,m$, denote by $A_i$ the matrix obtained from $A$ by removing the first column and the $i$-th row; $C_i$ is the matrix obtained from $C$ by removing the $i$-th row.
The base case of the induction is $m=1$: expanding along the first column, whose only nonzero entry is $a_{11}$, gives $\det X = a_{11}\det B = \det A\det B$. Suppose $m>1$ and develop $\det X$ along the first column: \begin{multline} \det X= (-1)^{1+1}a_{11}\det\begin{bmatrix} A_1 & C_1 \\ 0 & B\end{bmatrix}+ (-1)^{2+1}a_{21}\det\begin{bmatrix} A_2 & C_2 \\ 0 & B\end{bmatrix}\\ +\dots+ (-1)^{m+1}a_{m1}\det\begin{bmatrix} A_m & C_m \\ 0 & B\end{bmatrix} \end{multline} By the induction hypothesis we have, for $i=1,2,\dots,m$, $$ \det\begin{bmatrix} A_i & C_i \\ 0 & B\end{bmatrix}=\det A_i\det B $$ so \begin{align} \det X&= (-1)^{1+1}a_{11}\det A_1\det B+ (-1)^{2+1}a_{21}\det A_2\det B\\ &\qquad\qquad+\dots+ (-1)^{m+1}a_{m1}\det A_m\det B\\ &= \bigl((-1)^{1+1}a_{11}\det A_1+ (-1)^{2+1}a_{21}\det A_2+\dots+ (-1)^{m+1}a_{m1}\det A_m\bigr)\det B\\ &=\det A\det B \end{align}
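Since the statement is posed over an arbitrary commutative ring, it is worth noting that it can also be checked fully symbolically: if the identity holds in $\mathbb{Z}[a_{ij}, b_{ij}, c_{ij}]$, it holds in every commutative ring by specialization. A sketch assuming SymPy (the block sizes $m=n=2$ are my choice):

```python
import sympy as sp

m, n = 2, 2
A = sp.Matrix(m, m, lambda i, j: sp.Symbol(f'a{i}{j}'))
C = sp.Matrix(m, n, lambda i, j: sp.Symbol(f'c{i}{j}'))
B = sp.Matrix(n, n, lambda i, j: sp.Symbol(f'b{i}{j}'))
X = sp.BlockMatrix([[A, C], [sp.zeros(n, m), B]]).as_explicit()

# The difference is the zero polynomial, so the identity holds over
# Z[a_ij, b_ij, c_ij] and hence over every commutative ring.
assert sp.expand(X.det() - A.det() * B.det()) == 0
```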
Solution 2:
If $\det(A)=0,$ then the columns of $A$ are linearly dependent, so the first $m$ columns of $X$ are linearly dependent. In that case, $\det(X)=0=\det(A)\,\det(B).$ So suppose $\det(A)\neq 0.$ Then $A$ is invertible. Notice that $$\begin{bmatrix} A & C \\ 0 & B\end{bmatrix} = \begin{bmatrix} A & 0 \\ 0 & I_n\end{bmatrix} \, \begin{bmatrix} I_m & A^{-1}C \\ 0 & B\end{bmatrix}.$$ Using the fact that the determinant of a product is the product of the determinants, we obtain
$$\det(X)=\det \begin{bmatrix} A & 0 \\ 0 & I_n\end{bmatrix}\, \det \begin{bmatrix} I_m & A^{-1}C \\ 0 & B\end{bmatrix}=\det(A)\,\det(B),$$ where the first factor evaluates to $\det(A)$ by repeated Laplace expansion along the last row, and the second to $\det(B)$ by repeated expansion along the first column.
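This one-line factorization is also easy to verify numerically (again a minimal sketch with NumPy; random real blocks are generically invertible):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 3, 2
A = rng.standard_normal((m, m))  # generically invertible
C = rng.standard_normal((m, n))
B = rng.standard_normal((n, n))
X = np.block([[A, C], [np.zeros((n, m)), B]])

# X = [[A, 0], [0, I]] @ [[I, A^{-1} C], [0, B]]
left = np.block([[A, np.zeros((m, n))], [np.zeros((n, m)), np.eye(n)]])
right = np.block([[np.eye(m), np.linalg.solve(A, C)], [np.zeros((n, m)), B]])
assert np.allclose(X, left @ right)
assert np.isclose(np.linalg.det(X), np.linalg.det(A) * np.linalg.det(B))
```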