I shall assume that $X+|X|$ is invertible, although a similar argument works under the assumption that $Y+|Y|$ is. To avoid giant equations, I shall write $A,B,C,D$ for the respective blocks of your matrix. The decomposition

$$M = \begin{pmatrix}A & B \\ C & D\end{pmatrix} = \begin{pmatrix}A & 0\\ C & I\end{pmatrix} \begin{pmatrix}I & A^{-1}B\\ 0 & D-CA^{-1}B\end{pmatrix} = ST$$

can be verified by straightforward matrix multiplication (block matrices multiply just like matrices over any other noncommutative ring).

The key facts are that $\det(S) = \det(A)$ and $\det(T) = \det(D-CA^{-1}B)$. I shall only prove the first equality, or rather a stronger statement in which the identity block $I$ need not be the same size as $A$ but $S$ is still square; the second can be proved similarly. The proof is by induction on the size of $A$. If $A$ is $1\times 1$, then $S$ is lower triangular, and the product along its diagonal is $\det(A)$, so the equality holds. Now suppose the statement holds whenever $A$ is $n\times n$ (for any $C$ of compatible size), and let $A$ be $(n+1)\times(n+1)$. Expanding $\det(S)$ along its first row (the Laplace formula) gives

$$\det(S) = \sum_{j=1}^{n+1} (-1)^{j+1}a_{1j}\det(N_{1j}),$$

where $N_{1j}$ is the matrix obtained by deleting the first row and $j^{th}$ column of $S$; the sum stops at $j=n+1$ because the remaining entries of the first row of $S$ are zero. Each $N_{1j}$ has the same block shape with the identity block untouched, so it satisfies the inductive hypothesis, and therefore $\det(N_{1j}) = \det(M_{1j})$, where $M_{1j}$ is the matrix obtained by deleting the first row and $j^{th}$ column of $A$. Applying the Laplace formula once more,

$$\sum_{j=1}^{n+1} (-1)^{j+1}a_{1j}\det(N_{1j}) = \sum_{j=1}^{n+1} (-1)^{j+1}a_{1j}\det(M_{1j}) = \det(A),$$

which completes the induction. Since $\det(M) = \det(S)\det(T)$, we conclude that $\det(M) = \det(A)\det(D-CA^{-1}B)$.
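If it helps to see the identity in action, here is a quick numerical sanity check (a minimal sketch in Python with NumPy; the block sizes, random matrices, and seed are illustrative assumptions, not part of the argument above). It builds random blocks $A,B,C,D$, forms $S$ and $T$ as above, and checks $M = ST$, $\det(S)=\det(A)$, $\det(T)=\det(D-CA^{-1}B)$, and the final formula.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 2                       # sizes of A (n x n) and D (m x m) -- arbitrary choices
A = rng.standard_normal((n, n))   # a random Gaussian A is invertible with probability 1
B = rng.standard_normal((n, m))
C = rng.standard_normal((m, n))
D = rng.standard_normal((m, m))

M = np.block([[A, B], [C, D]])
Ainv_B = np.linalg.solve(A, B)    # A^{-1} B
schur = D - C @ Ainv_B            # the Schur complement D - C A^{-1} B

# The two factors of the decomposition M = S T used above
S = np.block([[A, np.zeros((n, m))], [C, np.eye(m)]])
T = np.block([[np.eye(n), Ainv_B], [np.zeros((m, n)), schur]])

assert np.allclose(S @ T, M)                                # M = S T
assert np.isclose(np.linalg.det(S), np.linalg.det(A))       # det(S) = det(A)
assert np.isclose(np.linalg.det(T), np.linalg.det(schur))   # det(T) = det(D - C A^{-1} B)
assert np.isclose(np.linalg.det(M),
                  np.linalg.det(A) * np.linalg.det(schur))  # det(M) = det(A) det(D - C A^{-1} B)
print(np.linalg.det(M), np.linalg.det(A) * np.linalg.det(schur))
```

Of course this only spot-checks the identity for one random example; the proof above is what establishes it in general.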