Showing that $\det A=\det B\cdot\det C$ when $B$ is the restriction of $A$ to an invariant subspace and $C$ is the quotient transformation

I am a bit unsure about one approach that is mentioned to prove this determinant result.

Here is the quote from pages 100–101 of *Finite-Dimensional Vector Spaces* by Halmos:

Here is another useful fact about determinants. If $\mathcal{M}$ is a subspace invariant under $A$, if $B$ is the transformation $A$ considered on $\mathcal{M}$ only, and if $C$ is the quotient transformation $A/\mathcal{M}$, then

$$\det{A}=\det{B}\cdot\det{C}$$

This multiplicative relation holds if, in particular, $A$ is the direct sum of two transformations $B$ and $C$. The proof can be based directly on the definition of determinants, or, alternatively, on the expansion obtained in the preceding paragraph.

What I am confused about is how you can use the definition of determinants to conclude this result.

In this book, the determinant of a linear transformation $A$ on an $n$-dimensional vector space $V$ is defined as the unique scalar $\delta$ such that $(\delta w)(x_{1},\ldots,x_{n})=w(A(x_{1}),\ldots,A(x_{n}))$ for all alternating $n$-linear forms $w$ on $V$.
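To make the definition concrete, here is a small numerical sanity check (my own illustration, not from the book): for $n=2$, the form $w(x,y)=x_1y_2-x_2y_1$ is alternating and bilinear, and the scalar $\delta$ relating $w(Ax,Ay)$ to $w(x,y)$ is exactly $\det A$.

```python
# Sanity check of the alternating-form definition of det for n = 2.
# w(x, y) = x1*y2 - x2*y1 is (up to scale) the unique alternating
# bilinear form on a 2-dimensional space.

def w(x, y):
    return x[0] * y[1] - x[1] * y[0]

def apply(A, x):
    """Apply a 2x2 matrix (given as a list of rows) to a vector."""
    return [A[0][0] * x[0] + A[0][1] * x[1],
            A[1][0] * x[0] + A[1][1] * x[1]]

A = [[2, 1],
     [5, 3]]  # det A = 2*3 - 1*5 = 1
det_A = A[0][0] * A[1][1] - A[0][1] * A[1][0]

# The defining property w(Ax, Ay) = det(A) * w(x, y) holds for every pair:
for x, y in [([1, 0], [0, 1]), ([3, -2], [7, 4]), ([1, 1], [2, 5])]:
    assert w(apply(A, x), apply(A, y)) == det_A * w(x, y)
```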

It is then shown that, after fixing a coordinate system (or basis) and letting $\alpha_{ij}$ be the entries of the matrix of $A$ in that basis, the determinant of $A$ is given by:

$$\det{A}=\sum_{\pi}(\text{sgn}\,\pi)\alpha_{\pi{(1)},1}\cdots\alpha_{\pi{(n)},n}$$

where the summation runs over all permutations $\pi \in \mathcal{S}_{n}$.
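This coordinate formula can be implemented verbatim. As a quick check (my own illustration), applied to a triangular matrix it returns the product of the diagonal entries, since the identity is the only permutation whose term survives:

```python
from itertools import permutations

def sign(pi):
    """Sign of a permutation (tuple of 0-based indices), via inversion count."""
    inv = sum(1 for i in range(len(pi)) for j in range(i + 1, len(pi))
              if pi[i] > pi[j])
    return -1 if inv % 2 else 1

def det(A):
    """Leibniz expansion: sum over pi of sgn(pi) * a[pi(1),1] * ... * a[pi(n),n]."""
    n = len(A)
    total = 0
    for pi in permutations(range(n)):
        term = sign(pi)
        for j in range(n):
            term *= A[pi[j]][j]   # row pi(j), column j, as in the formula
        total += term
    return total

A = [[2, 7, 1],
     [0, 3, 4],
     [0, 0, 5]]
assert det(A) == 2 * 3 * 5   # upper triangular: product of the diagonal
```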

I have been able to show this result using the coordinate expression, but I am not sure how it would be done directly from the definition. I have tried defining other alternating forms and combining them, but I could not make much progress with that approach.

Are there any suggestions for proving this result directly from the definition?

Edit: I would like to add that part of my confusion may be from the fact that $A$, $B$ and $C$ are all linear transformations on different vector spaces and I am not sure how the definition can be used in this situation.


Solution 1:

Let $d = \dim(\mathcal M)$. Let $v_1, \dots, v_n \in V$ be such that $v_1, \ldots, v_d$ form a basis of $\mathcal M$, and let $\omega$ be a non-zero alternating $n$-linear form on $V$. Then

$$\mathcal M^{d} \ni (w_1, \ldots, w_d) \mapsto \omega(w_1, \ldots, w_d, v_{d+1}, \ldots, v_n)$$

is an alternating $d$-linear form on $\mathcal M$, and

$$V^{\,n-d} \ni (w_{d+1}, \ldots, w_n) \mapsto \omega(v_1, \dots, v_d, w_{d+1}, \ldots, w_n)$$

is an alternating $(n-d)$-linear form that induces one on $V/\mathcal M$: adding an element of $\mathcal M$ to any $w_j$ leaves the value unchanged, because that element is a linear combination of $v_1, \ldots, v_d$, which already appear among the arguments of $\omega$. Therefore

$$\begin{aligned} \det(A) \, \omega(v_1, \ldots, v_n) &= \omega(Av_1, \ldots, Av_n) \\ &= \det(B) \, \omega(v_1, \ldots, v_d, Av_{d+1}, \ldots, Av_n) \\ &= \det(B) \det(A/\mathcal M) \, \omega(v_1, \ldots, v_n). \end{aligned}$$

Here the second equality uses that $\mathcal M$ is invariant under $A$ (so $Av_1, \ldots, Av_d \in \mathcal M$) and applies the definition of $\det(B)$ to the alternating form on $\mathcal M$ of the first kind above, with $Av_{d+1}, \ldots, Av_n$ as the fixed tail; the third applies the definition of $\det(A/\mathcal M)$ to the induced form on the quotient.

In particular, we may take $v_1, \ldots, v_d$ to be a basis of $\mathcal M$ extended to a basis $v_1, \ldots, v_n$ of $V$; then $\omega(v_1, \ldots, v_n) \neq 0$, since a non-zero alternating $n$-form is non-zero on every basis. Dividing by $\omega(v_1, \ldots, v_n)$ gives $$\det(A) = \det(B) \det(A/\mathcal M).$$
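As a concrete illustration of the statement (my own numerical check, not part of the proof): if $\mathcal M$ is the span of the first $d$ standard basis vectors, invariance of $\mathcal M$ means the matrix of $A$ is block upper triangular; the top-left $d \times d$ block is the matrix of $B$, and the bottom-right block is the matrix of $A/\mathcal M$. The theorem then says a block-triangular determinant is the product of the determinants of the diagonal blocks, regardless of the off-diagonal block.

```python
from itertools import permutations

def det(M):
    """Determinant via the Leibniz permutation sum (exact over integers)."""
    n = len(M)
    total = 0
    for pi in permutations(range(n)):
        inv = sum(1 for i in range(n) for j in range(i + 1, n) if pi[i] > pi[j])
        term = -1 if inv % 2 else 1
        for j in range(n):
            term *= M[pi[j]][j]
        total += term
    return total

# A leaves M = span(e1, e2) invariant, so its matrix is block upper triangular:
# top-left block B = A restricted to M, bottom-right block C = A/M.
B = [[1, 2],
     [3, 4]]   # det B = -2
C = [[5, 6],
     [7, 9]]   # det C = 3

A = [[1, 2, 8, 1],    # top-right 2x2 block is arbitrary;
     [3, 4, 0, 2],    # it does not affect the determinant
     [0, 0, 5, 6],
     [0, 0, 7, 9]]

assert det(A) == det(B) * det(C)   # -2 * 3 = -6
```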