
I have a $2\times2$ block matrix $M$ defined as follows:

$\begin{pmatrix}X+|X| & X-|X| \\ Y-|Y| & Y+|Y|\end{pmatrix}$

where $X$ and $Y$ are $n\times n$ matrices and $|X|$ denotes the entrywise modulus of $X$, i.e. the matrix whose entries are the absolute values of the corresponding entries of $X$.

How can I find the determinant of the matrix $M$ in terms of $X$ and $Y$? I am looking for a simplified expression.
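For concreteness, here is how $M$ could be assembled numerically (a small illustrative sketch using numpy, with $|X|$ taken entrywise as above):

```python
import numpy as np

# Illustrative sketch only: build M from random X and Y,
# taking |X| entrywise as described above.
n = 3
rng = np.random.default_rng(0)
X = rng.standard_normal((n, n))
Y = rng.standard_normal((n, n))

M = np.block([[X + np.abs(X), X - np.abs(X)],
              [Y - np.abs(Y), Y + np.abs(Y)]])
print(np.linalg.det(M))
```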

  • Yes, assume the non-singularity of the matrices. (2011-12-23)

2 Answers


You can at least clean it up a bit:

$ \left|\begin{pmatrix}X+|X| & X-|X| \\ Y-|Y| & Y+|Y|\end{pmatrix}\begin{pmatrix}\frac{1}{2}I & I\\ -\frac{1}{2}I & I\end{pmatrix}\right| = \left|\begin{pmatrix}|X| & 2X \\ -|Y| & 2Y\end{pmatrix}\right| $

Since the second factor has determinant $1$, both sides also equal $\det M$. You can also factor the $2$s out of the right column. But this is still no different from finding $\det\begin{pmatrix}A & B\\ C & D\end{pmatrix}$ for arbitrary $A,B,C,D$.
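A quick numerical sanity check of this identity (a sketch only, using numpy with random $X,Y$):

```python
import numpy as np

# Sketch: right-multiplying by P = [[I/2, I], [-I/2, I]] sends M to
# [[|X|, 2X], [-|Y|, 2Y]], and det(P) = 1, so the determinants agree.
n = 3
rng = np.random.default_rng(1)
X = rng.standard_normal((n, n))
Y = rng.standard_normal((n, n))
I = np.eye(n)

M = np.block([[X + np.abs(X), X - np.abs(X)],
              [Y - np.abs(Y), Y + np.abs(Y)]])
P = np.block([[I / 2, I],
              [-I / 2, I]])
R = np.block([[np.abs(X), 2 * X],
              [-np.abs(Y), 2 * Y]])

assert np.allclose(M @ P, R)
assert np.isclose(np.linalg.det(P), 1.0)
assert np.isclose(np.linalg.det(M), np.linalg.det(R))
```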


I shall assume that $X+|X|$ is invertible, although a similar solution exists under the assumption that $Y+|Y|$ is. I shall use $A,B,C,D$ to denote the respective block matrices in your problem, to avoid giant equations.

The decomposition $M = \begin{pmatrix}A & B \\ C & D\end{pmatrix} = \begin{pmatrix}A & 0\\ C & I\end{pmatrix} \begin{pmatrix}I & A^{-1}B\\ 0 & D-CA^{-1}B\end{pmatrix} = ST$ can be verified by simple matrix multiplication (noting that multiplication of block matrices works like multiplication of matrices over any other noncommutative ring). The key facts are that $\det(S) = \det(A)$ and $\det(T) = \det(D-CA^{-1}B)$. I shall only prove the first equality (or rather a stronger statement in which $I$ is not necessarily the same size as $A$, but the block matrix is still square); the second can be proved similarly.

If $A$ is $1\times 1$, the equality follows from the fact that $S$ is triangular and the product along its diagonal is $\det(A)$. Now assume the statement holds whenever $A$ is $n\times n$ (with $C$ and $I$ of any compatible sizes), and let $A$ be $(n+1)\times(n+1)$. Applying the Laplace formula along the first row of $S$ gives $\det(S) = \sum\limits_{j=1}^{n+1} (-1)^{j+1}a_{1j}\det(N_{1j})$, where $N_{1j}$ is the matrix obtained by deleting the first row and $j$th column of $S$ (the sum stops at $j=n+1$ because the remaining entries of the first row of $S$ are zero). These matrices satisfy the inductive hypothesis (the identity block has not been touched), and so $\det(N_{1j}) = \det(M_{1j})$, where $M_{1j}$ is the matrix obtained by deleting the first row and $j$th column of $A$. Using the Laplace formula again gives $\sum\limits_{j=1}^{n+1} (-1)^{j+1}a_{1j}\det(N_{1j}) = \sum\limits_{j=1}^{n+1} (-1)^{j+1}a_{1j}\det(M_{1j}) = \det(A)$, completing the proof.

Since $\det(M) = \det(S)\det(T)$, this gives us $\det(M) = \det(A)\det(D-CA^{-1}B)$.
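A numerical illustration of the resulting formula (a sketch only, using numpy with random $X,Y$, restricted to the case where $A$ is invertible as assumed above):

```python
import numpy as np

# Sketch of det(M) = det(A) * det(D - C A^{-1} B) for the blocks of this
# problem: A = X + |X|, B = X - |X|, C = Y - |Y|, D = Y + |Y|.
# A zeroes out the negative entries of X (and doubles the positive ones),
# so we resample until A is numerically invertible.
n = 3
rng = np.random.default_rng()
while True:
    X = rng.standard_normal((n, n))
    Y = rng.standard_normal((n, n))
    A, B = X + np.abs(X), X - np.abs(X)
    C, D = Y - np.abs(Y), Y + np.abs(Y)
    if np.linalg.matrix_rank(A) == n:
        break

M = np.block([[A, B], [C, D]])
schur = D - C @ np.linalg.solve(A, B)   # D - C A^{-1} B without forming A^{-1}
print(np.linalg.det(M), np.linalg.det(A) * np.linalg.det(schur))  # these agree
```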

  • No problem, that went a little wrong in my comment, I suppose. What I meant was that there is a nice shortcut to your argument. Other than that, it's a nice answer. (2011-12-23)