Multiplication of block matrices

My textbook says that multiplying block matrices works the same way as regular matrix multiplication (provided the dimensions of the submatrices are compatible). Wikipedia also has an example saying so.

It seems like the proof is purely technical, and yet I'm having trouble putting it into words. What would be a good way to go about it?


Solution 1:

It works the same as regular multiplication, except that matrix multiplication is not commutative in general, so we have to pay attention to the order in which the blocks are multiplied.

That said, I think you can develop the notation and the proof by bootstrapping the $2\times 2$ case. Suppose $A$ is a $2\times 2$ block matrix with $I+J$ rows and $K+L$ columns, so that the block in the upper left corner is $I\times K$, etc. Then if $B$ is a $2\times 2$ block matrix with $K+L$ rows and $M+N$ columns, the block multiplication $AB$ is compatible:

$$ A = \left( \begin{array}{cc} A_{11} & A_{12} \\ A_{21} & A_{22} \end{array} \right) $$

$$ B = \left( \begin{array}{cc} B_{11} & B_{12} \\ B_{21} & B_{22} \end{array} \right) $$

$$ AB = \left( \begin{array}{cc} A_{11}B_{11}+A_{12}B_{21} & A_{11}B_{12}+A_{12}B_{22} \\ A_{21}B_{11}+A_{22}B_{21} & A_{21}B_{12}+A_{22}B_{22} \end{array} \right) $$
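This is not a proof, but here is a quick numerical sanity check of the formula, a minimal sketch assuming NumPy; the block sizes $I, J, K, L, M, N$ are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative block sizes: I, J rows of A; K, L shared dimension; M, N columns of B.
I, J, K, L, M, N = 2, 3, 4, 5, 6, 7

# Random blocks with compatible shapes.
A11, A12 = rng.standard_normal((I, K)), rng.standard_normal((I, L))
A21, A22 = rng.standard_normal((J, K)), rng.standard_normal((J, L))
B11, B12 = rng.standard_normal((K, M)), rng.standard_normal((K, N))
B21, B22 = rng.standard_normal((L, M)), rng.standard_normal((L, N))

# Assemble the full matrices from their blocks.
A = np.block([[A11, A12], [A21, A22]])   # (I+J) x (K+L)
B = np.block([[B11, B12], [B21, B22]])   # (K+L) x (M+N)

# Blockwise product, keeping A-blocks on the left and B-blocks on the right.
AB_blocks = np.block([
    [A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
    [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22],
])

# Agrees with ordinary matrix multiplication.
assert np.allclose(A @ B, AB_blocks)
```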

Cases with more than two blocks per row or column can then be reduced to this simple case, by lumping blocks together and applying the multiplication recursively.
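As a small illustration of the lumping idea (again only a numerical sketch, with arbitrary sizes), split the shared dimension into three blocks, lump two of them so the two-block identity applies, then expand the lumped block again:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 9))
B = rng.standard_normal((9, 4))

# Split the shared dimension 9 into three blocks of widths 2, 3, 4.
A1, A2, A3 = A[:, :2], A[:, 2:5], A[:, 5:]
B1, B2, B3 = B[:2, :], B[2:5, :], B[5:, :]

# Lump the last two blocks into one, reducing to the two-block case.
A23 = np.hstack([A2, A3])
B23 = np.vstack([B2, B3])
assert np.allclose(A @ B, A1 @ B1 + A23 @ B23)

# Expanding the lumped block recursively recovers the three-term sum.
assert np.allclose(A @ B, A1 @ B1 + A2 @ B2 + A3 @ B3)
```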