It works the same as regular multiplication, except that you must keep the factors in order, since matrix multiplication is not generally commutative.
That said, I think you can develop the notation and proof by bootstrapping the 2x2 case. Suppose A is a 2x2 block matrix with I+J rows and K+L columns, so that the block in the upper left corner is IxK, and so on. Then for B a 2x2 block matrix with K+L rows and M+N columns, the partitions are conformable and the product can be computed blockwise:
$ A = \left( \begin{array}{cc} A_{11} & A_{12} \\ A_{21} & A_{22} \end{array} \right) $

$ B = \left( \begin{array}{cc} B_{11} & B_{12} \\ B_{21} & B_{22} \end{array} \right) $

$ AB = \left( \begin{array}{cc} A_{11}B_{11}+A_{12}B_{21} & A_{11}B_{12}+A_{12}B_{22} \\ A_{21}B_{11}+A_{22}B_{21} & A_{21}B_{12}+A_{22}B_{22} \end{array} \right) $
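Note that every block product here is conformable: for instance $A_{11}$ is IxK and $B_{11}$ is KxM, so $A_{11}B_{11}$ is IxM, the same shape as $A_{12}B_{21}$ (IxL times LxM). As a quick sanity check, here is a minimal NumPy sketch (the block sizes are arbitrary choices) verifying that the block formula reproduces the ordinary product:

```python
import numpy as np

# Arbitrary block dimensions; any positive sizes work.
I, J, K, L, M, N = 2, 3, 4, 2, 3, 5

rng = np.random.default_rng(0)
A11, A12 = rng.random((I, K)), rng.random((I, L))
A21, A22 = rng.random((J, K)), rng.random((J, L))
B11, B12 = rng.random((K, M)), rng.random((K, N))
B21, B22 = rng.random((L, M)), rng.random((L, N))

A = np.block([[A11, A12], [A21, A22]])  # (I+J) x (K+L)
B = np.block([[B11, B12], [B21, B22]])  # (K+L) x (M+N)

# Blockwise product, exactly as in the formula above.
AB_blocks = np.block([
    [A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
    [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22],
])

assert np.allclose(A @ B, AB_blocks)
```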
Cases with more than two blocks per row or column can then be reduced to this simple case by lumping blocks together and applying the multiplication recursively.
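To see the lumping step concretely, here is another hypothetical sketch: a product partitioned into three column/row blocks is computed directly, then again after merging the last two blocks on each side into one, reducing the three-block case to the two-block one.

```python
import numpy as np

rng = np.random.default_rng(1)

# A split into three column blocks, B conformably into three row blocks.
C1, C2, C3 = rng.random((4, 2)), rng.random((4, 3)), rng.random((4, 1))
R1, R2, R3 = rng.random((2, 5)), rng.random((3, 5)), rng.random((1, 5))
A = np.block([[C1, C2, C3]])
B = np.block([[R1], [R2], [R3]])

# Direct three-block product: sum over the shared block index.
AB3 = C1 @ R1 + C2 @ R2 + C3 @ R3

# Lumped: merge the last two blocks on each side into single blocks,
# which reduces the computation to the two-block case.
C23 = np.block([[C2, C3]])
R23 = np.block([[R2], [R3]])
AB2 = C1 @ R1 + C23 @ R23

assert np.allclose(A @ B, AB3) and np.allclose(AB3, AB2)
```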