Here is an approach that does not rely on any explicit definition of the determinant, nor on the concept of an inverse. Instead, we can start from the three basic properties that the determinant function should satisfy. These three properties are:
(1) Det(I) = 1
(2) The Det() function is multilinear in each row (column) individually, with all other rows (columns) held fixed
(3) If the matrix M is not full rank, Det(M)=0
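As a quick sanity check (not part of the proof), the three properties can be verified numerically on small examples. The `det` helper below is a made-up, naive cofactor expansion, a throwaway for tiny matrices rather than an efficient implementation:

```python
def det(m):
    # Naive Laplace (cofactor) expansion along the first row; fine for tiny matrices.
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

# Property (1): Det(I) = 1.
assert det([[1, 0], [0, 1]]) == 1

# Property (2): multilinearity in the first row, second row held fixed:
# Det(2u + 3v, w) = 2*Det(u, w) + 3*Det(v, w).
u, v, w = [1, 2], [3, 1], [4, 5]
combined = [[2 * u[j] + 3 * v[j] for j in range(2)], w]
assert det(combined) == 2 * det([u, w]) + 3 * det([v, w])

# Property (3): a rank-deficient matrix (second row = 2 * first row) has Det = 0.
assert det([[1, 2], [2, 4]]) == 0
```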
Artin showed that these three properties alone uniquely determine the form of the determinant function (I don't prove this here). The property (3) I use here is slightly more general than the one Artin used, but it is equally intuitive and allows me to skip a step. First, one can show that
$ Det \begin{pmatrix} A & 0 \\ C & D \end{pmatrix} = Det \begin{pmatrix} A & 0 \\ 0 & D \end{pmatrix} $
To show this, write $M = \begin{pmatrix} A & 0 \\ C & D \end{pmatrix}$ with $A$ of size $k \times k$, and expand the determinant as $ (4)\ Det(M)= Det(M_1)+Det(M_2) $, where $M_1$ is the same as $M$ but with the first k entries of the $k+1^{th}$ row set to 0, and $M_2$ is the same as $M$ but with the last n-k entries of the $k+1^{th}$ row set to 0. Note that the $k+1^{th}$ rows of $M_1$ and $M_2$ sum to the $k+1^{th}$ row of M, and therefore (4) holds according to property (2). Note that $ Det(M_2)=0 $, since $M_2$ is clearly not full rank: its first k+1 rows have nonzero entries in only the first k columns. Therefore we have $Det(M) = Det(M_1)$; that is, zeroing out the first k entries of the $k+1^{th}$ row does not change the determinant. Repeating this process for each of the remaining rows $k+2,\dots,n$ shows that the above claim is true.
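As a numerical sanity check of this first claim, on a concrete $4\times 4$ example with $2\times 2$ blocks, zeroing out the $C$ block indeed leaves the determinant unchanged (again using a throwaway cofactor-expansion `det` helper, with made-up entries):

```python
def det(m):
    # Naive cofactor expansion along the first row; adequate for tiny matrices.
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

# M = [[A, 0], [C, D]] with 2x2 blocks, and M_star = [[A, 0], [0, D]].
M = [[1, 2, 0, 0],
     [3, 4, 0, 0],
     [5, 6, 2, 1],
     [7, 8, 1, 3]]
M_star = [[1, 2, 0, 0],
          [3, 4, 0, 0],
          [0, 0, 2, 1],
          [0, 0, 1, 3]]
assert det(M) == det(M_star)  # zeroing the C block does not change Det
```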
Now, let us denote $ M^*=\begin{pmatrix} A & 0 \\ 0 & D \end{pmatrix}, \quad A^*= \begin{pmatrix} A & 0 \\ 0 & I \end{pmatrix}, \quad D^*= \begin{pmatrix} I & 0 \\ 0 & D \end{pmatrix}. $ Note that $ M^*=A^*D^* $. I claim that $ Det(M^*) = Det(A^*)\cdot Det(D^*). $ (If $Det(A^*)=0$, a similar column-by-column expansion using (2) and (3) shows $Det(A^*D^*)=0$ as well, so the claim holds trivially; assume therefore $Det(A^*)\neq 0$.) To show this we can show that the function $(5)\ F(D^*)=F(d^*_1,\dots,d^*_n)=\frac{Det(A^*D^*)}{Det(A^*)}$, viewed as a function of the columns $d^*_1,\dots,d^*_n$ of $D^*$, also satisfies properties (1)-(3). Clearly if $D^*=I$, then the RHS of (5) reduces to 1, so (1) holds. To show (2), we can write the j-th column of $M^*=A^*D^*$ as $A^*d^*_j$. Since we already know the determinant is multilinear in each column, and $A^*d^*_j$ is a linear function of $d^*_j$, it follows that $F(d^*_1,\dots,d^*_n)$ is also multilinear in each of the columns $d^*_1,\dots,d^*_n$. Finally, to show (3), we can note that if the columns $d^*_1,\dots,d^*_n$ are linearly dependent, then so are the columns $A^*d^*_1,\dots,A^*d^*_n$ of $A^*D^*$, so the numerator on the RHS of (5) would be 0. Therefore, $F(d^*_1,\dots,d^*_n)=Det(d^*_1,\dots,d^*_n)=Det(D^*)$, so $Det(A^*D^*)=Det(A^*)\cdot Det(D^*)$, as desired.
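The key identity of this step, $Det(A^*D^*)=Det(A^*)\cdot Det(D^*)$, can likewise be spot-checked numerically. The `det` and `matmul` helpers below are made-up naive implementations, only suitable for tiny matrices:

```python
def det(m):
    # Naive cofactor expansion along the first row.
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

def matmul(x, y):
    # Plain triple-loop matrix product.
    return [[sum(x[i][k] * y[k][j] for k in range(len(y)))
             for j in range(len(y[0]))] for i in range(len(x))]

# A_star = [[A, 0], [0, I]] and D_star = [[I, 0], [0, D]]
# with A = [[1, 2], [3, 4]] and D = [[2, 1], [1, 3]].
A_star = [[1, 2, 0, 0],
          [3, 4, 0, 0],
          [0, 0, 1, 0],
          [0, 0, 0, 1]]
D_star = [[1, 0, 0, 0],
          [0, 1, 0, 0],
          [0, 0, 2, 1],
          [0, 0, 1, 3]]
M_star = matmul(A_star, D_star)  # equals [[A, 0], [0, D]]
assert det(M_star) == det(A_star) * det(D_star)
```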
In order to finish the proof, one final step is needed. We need to show that $Det(A^*)=Det(A)$ and $Det(D^*)=Det(D)$. We use the same approach as above, by defining a target function which we show to be equal to the determinant. We do this by showing that the function $E(A)=Det(A^*)$ also satisfies properties (1)-(3) as a function of the columns of A, and therefore must be equal to its determinant. The basic ideas are that if A is the identity then so is $A^*$, so property (1) follows; any linear operation on a row (column) of A is also a linear operation on the corresponding row (column) of $A^*$, so (2) holds; and if A is not full rank, then neither is $A^*$, so (3) holds. Therefore, the function E(A), which extends A to $A^*$ by appending n-k standard basis vectors and then takes the determinant of the expanded matrix, is in fact equal to $Det(A)$. The same argument applies to $D^*$ and $D$.
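This embedding step is also easy to check on an example: padding a $2\times 2$ matrix out to $4\times 4$ with standard basis vectors preserves the determinant. The `embed` helper below is a made-up name for that padding:

```python
def det(m):
    # Naive cofactor expansion along the first row.
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

def embed(a, n):
    # Extend the k x k matrix a to the n x n matrix [[a, 0], [0, I]].
    k = len(a)
    return [a[i] + [0] * (n - k) for i in range(k)] + \
           [[1 if j == i else 0 for j in range(n)] for i in range(k, n)]

A = [[1, 2], [3, 4]]
assert det(embed(A, 4)) == det(A)  # appending basis vectors preserves Det
```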
Given all of this, we immediately get the result, since $Det(M)=Det(M^*)=Det(A^*D^*)=Det(A^*)\cdot Det(D^*)=Det(A)\cdot Det(D) $
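Putting everything together, the final identity $Det(M)=Det(A)\cdot Det(D)$ checks out numerically on a block-triangular example with a $3\times 3$ block $A$ and a $2\times 2$ block $D$ (arbitrary made-up entries, same throwaway `det` helper):

```python
def det(m):
    # Naive cofactor expansion along the first row.
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

A = [[2, 0, 1],
     [1, 1, 0],
     [0, 3, 1]]
D = [[1, 2],
     [3, 4]]
C = [[1, 1, 1],
     [2, 0, 2]]
# Assemble M = [[A, 0], [C, D]], a block lower-triangular 5 x 5 matrix.
M = [row + [0, 0] for row in A] + [C[i] + D[i] for i in range(2)]
assert det(M) == det(A) * det(D)
```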