It depends on how you partition them; not all partitions work. For example, if you partition these two matrices

$$\begin{bmatrix} a & b & c \\ d & e & f \\ g & h & i \end{bmatrix}, \begin{bmatrix} a' & b' & c' \\ d' & e' & f' \\ g' & h' & i' \end{bmatrix} $$

in this way

$$ \left[\begin{array}{c|cc}a&b&c\\ d&e&f\\ \hline g&h&i \end{array}\right], \left[\begin{array}{c|cc}a'&b'&c'\\ d'&e'&f'\\ \hline g'&h'&i' \end{array}\right] $$

and then multiply them, it won't work. But this partitioning will:

$$\left[\begin{array}{c|cc}a&b&c\\ \hline d&e&f\\ g&h&i \end{array}\right] ,\left[\begin{array}{c|cc}a'&b'&c'\\ \hline d'&e'&f'\\ g'&h'&i' \end{array}\right] $$

What's the difference? Well, in the first case, not all of the required submatrix products are defined: for example, the $2\times 1$ block $\begin{bmatrix} a \\ d \end{bmatrix}$ cannot be multiplied by the $2\times 1$ block $\begin{bmatrix} a' \\ d' \end{bmatrix}$.
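To see the dimension bookkeeping concretely, here is a small NumPy check (the numeric matrices and the slicing below are only an illustration, not part of the original argument): the second partitioning reassembles to the ordinary product, while the first fails on its very first block product.

```python
import numpy as np

A = np.arange(9).reshape(3, 3)       # stands in for [[a, b, c], [d, e, f], [g, h, i]]
B = np.arange(9, 18).reshape(3, 3)   # stands in for the primed matrix

# Working partition: rows split (1, 2) and columns split (1, 2) for both matrices.
A11, A12 = A[:1, :1], A[:1, 1:]
A21, A22 = A[1:, :1], A[1:, 1:]
B11, B12 = B[:1, :1], B[:1, 1:]
B21, B22 = B[1:, :1], B[1:, 1:]

C = np.block([[A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
              [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22]])
print(np.array_equal(C, A @ B))      # True: the blockwise result equals A @ B

# Non-working partition: columns of A split (1, 2) but rows of B split (2, 1).
A11 = A[:2, :1]                      # the 2x1 block [[a], [d]]
B11 = B[:2, :1]                      # the 2x1 block [[a'], [d']]
# A11 @ B11 raises a ValueError: a 2x1 block cannot multiply a 2x1 block.
```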

So, what is the general rule? (Taken entirely from the Wikipedia page on block matrices.)

Given an $(m \times p)$ matrix $\mathbf{A}$ with $q$ row partitions and $s$ column partitions $$\begin{bmatrix} \mathbf{A}_{11} & \mathbf{A}_{12} & \cdots &\mathbf{A}_{1s}\\ \mathbf{A}_{21} & \mathbf{A}_{22} & \cdots &\mathbf{A}_{2s}\\ \vdots & \vdots & \ddots &\vdots \\ \mathbf{A}_{q1} & \mathbf{A}_{q2} & \cdots &\mathbf{A}_{qs}\end{bmatrix}$$

and a $(p \times n)$ matrix $\mathbf{B}$ with $s$ row partitions and $r$ column partitions

$$\begin{bmatrix} \mathbf{B}_{11} & \mathbf{B}_{12} & \cdots &\mathbf{B}_{1r}\\ \mathbf{B}_{21} & \mathbf{B}_{22} & \cdots &\mathbf{B}_{2r}\\ \vdots & \vdots & \ddots &\vdots \\ \mathbf{B}_{s1} & \mathbf{B}_{s2} & \cdots &\mathbf{B}_{sr}\end{bmatrix}$$

that are compatible with the partitions of $\mathbf{A}$, the matrix product

$$ \mathbf{C}=\mathbf{A}\mathbf{B} $$

can be formed blockwise, yielding $\mathbf{C}$ as an $(m\times n)$ matrix with $q$ row partitions and $r$ column partitions.
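As a sketch of this rule in code (the `block_multiply` helper and its parameter names are my own and assumed only for illustration, not from the quoted text): for each of the $q \times r$ blocks of $\mathbf{C}$, sum $\mathbf{A}_{ik}\mathbf{B}_{kj}$ over the $s$ shared partitions, then compare with the ordinary product.

```python
import numpy as np

def block_multiply(A, B, row_parts_A, col_parts_A, col_parts_B):
    """Multiply A and B blockwise; col_parts_A doubles as B's row partition."""
    r_off = np.cumsum([0] + row_parts_A)       # row offsets of A's q partitions
    k_off = np.cumsum([0] + col_parts_A)       # shared offsets (A columns = B rows)
    c_off = np.cumsum([0] + col_parts_B)       # column offsets of B's r partitions
    C = np.zeros((A.shape[0], B.shape[1]))
    for i in range(len(row_parts_A)):          # q row partitions
        for j in range(len(col_parts_B)):      # r column partitions
            for k in range(len(col_parts_A)):  # s shared partitions
                A_ik = A[r_off[i]:r_off[i+1], k_off[k]:k_off[k+1]]
                B_kj = B[k_off[k]:k_off[k+1], c_off[j]:c_off[j+1]]
                C[r_off[i]:r_off[i+1], c_off[j]:c_off[j+1]] += A_ik @ B_kj
    return C

A = np.random.rand(5, 4)
B = np.random.rand(4, 6)
C = block_multiply(A, B, row_parts_A=[2, 3], col_parts_A=[1, 3], col_parts_B=[4, 2])
print(np.allclose(C, A @ B))                   # True
```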


A very late answer to a popular question is always suspect, but I came here looking for what a compatible block partitioning is and did not find it:

For $\mathbf{AB}$ to work by blocks the important part is that the partition along the columns of $\mathbf A$ must match the partition along the rows of $\mathbf{B}$. This is analogous to how, when doing $\mathbf{AB}$ without blocks—which is of course just a partitioning into $1\times 1$ blocks—the number of columns in $\mathbf A$ must match the number of rows in $\mathbf B$.

Example: Let $\mathbf{M}_{mn}$ denote any matrix of $m$ rows and $n$ columns irrespective of contents. We know that $\mathbf{M}_{mn}\mathbf{M}_{nq}$ works and yields a matrix $\mathbf{M}_{mq}$. Split $\mathbf A$ by columns into a block of size $a$ and a block of size $b$, and do the same with $\mathbf B$ by rows. Then split $\mathbf A$ however you wish along its rows, same for $\mathbf B$ along its columns. Now we have $$ A = \begin{bmatrix} \mathbf{M}_{ra} & \mathbf{M}_{rb} \\ \mathbf{M}_{sa} & \mathbf{M}_{sb} \end{bmatrix}, B = \begin{bmatrix} \mathbf{M}_{at} & \mathbf{M}_{au} \\ \mathbf{M}_{bt} & \mathbf{M}_{bu} \end{bmatrix}, $$

and $$ AB = \begin{bmatrix} \mathbf{M}_{ra}\mathbf{M}_{at} + \mathbf{M}_{rb}\mathbf{M}_{bt} & \mathbf{M}_{ra}\mathbf{M}_{au} + \mathbf{M}_{rb}\mathbf{M}_{bu} \\ \mathbf{M}_{sa}\mathbf{M}_{at} + \mathbf{M}_{sb}\mathbf{M}_{bt} & \mathbf{M}_{sa}\mathbf{M}_{au} + \mathbf{M}_{sb}\mathbf{M}_{bu} \end{bmatrix} = \begin{bmatrix} \mathbf{M}_{rt} & \mathbf{M}_{ru} \\ \mathbf{M}_{st} & \mathbf{M}_{su} \end{bmatrix}. $$

All multiplications conform, all sums work out, and the resulting matrix is the size you'd expect. There is nothing special about splitting in two so long as you match any column split of $\mathbf A$ with a row split in $\mathbf B$ (try removing a block row from $\mathbf A$ or further splitting a block column of $\mathbf B$).
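A quick numeric check of this bookkeeping (the particular sizes $r, s, a, b, t, u$ below are arbitrary choices for illustration, not fixed by the argument):

```python
import numpy as np

r, s, a, b, t, u = 2, 3, 4, 1, 2, 5
A = np.random.rand(r + s, a + b)          # an M_{(r+s)(a+b)}
B = np.random.rand(a + b, t + u)          # an M_{(a+b)(t+u)}

# Split A's columns into sizes (a, b) and B's rows into the same sizes,
# then split A's rows into (r, s) and B's columns into (t, u) however we like.
A_ra, A_rb = A[:r, :a], A[:r, a:]
A_sa, A_sb = A[r:, :a], A[r:, a:]
B_at, B_au = B[:a, :t], B[:a, t:]
B_bt, B_bu = B[a:, :t], B[a:, t:]

AB = np.block([[A_ra @ B_at + A_rb @ B_bt, A_ra @ B_au + A_rb @ B_bu],
               [A_sa @ B_at + A_sb @ B_bt, A_sa @ B_au + A_sb @ B_bu]])
print(np.allclose(AB, A @ B))             # True: the block formula reproduces AB
```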

The nonworking example from the accepted answer fails because the columns of $\mathbf A$ are split into $(1, 2)$ while the rows of $\mathbf B$ are split into $(2, 1)$.