
I could do this problem by brute force, but I think there must be some elegant theorem for the determinant of a block matrix (here with symmetric blocks) such as:

$$B=\begin{pmatrix}1 & 1 & 4 & 5 \\ 1 & 1 & 5 & 4 \\ 2 & 4 & 1 & 1 \\ 4 & 2 & 1 & 1 \end{pmatrix}=\begin{pmatrix}I_{2,2} & S_{2,2,1} \\ S_{2,2,2} & I_{2,2}\end{pmatrix}$$

Actually, look at this one:

$$B= \begin{pmatrix}1 & 1 & 2 & 4 \\ 1 & 1 & 4 & 2 \\ 2 & 4 & 1 & 1 \\ 4 & 2 & 1 & 1 \end{pmatrix}+ \begin{pmatrix} 0 & 0 & 2 & 1 \\ 0 & 0 & 1 & 2 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}$$

and now I am wondering whether this decomposition could speed up the calculation: is there a determinant formula for a sum of matrices like this?

The problem is boringly solvable with the Gaussian method, but I am interested in a trick for calculating the determinant. My first idea was to subtract row 3 from row 4 and row 2 from row 1 (getting some of the ones out of the way), but there must be some theorem that avoids the monotonous Gaussian elimination and determinant expansion.
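These row operations are at least easy to sanity-check numerically. Here is a minimal pure-Python sketch (the cofactor-expansion `det` helper is mine, not from the problem) confirming that subtracting row 3 from row 4 and row 2 from row 1 leaves the determinant unchanged:

```python
def det(m):
    """Determinant by Laplace expansion along the first row (fine for tiny matrices)."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

B = [[1, 1, 4, 5],
     [1, 1, 5, 4],
     [2, 4, 1, 1],
     [4, 2, 1, 1]]

# Adding a multiple of one row to another preserves the determinant:
# row4 <- row4 - row3, row1 <- row1 - row2.
B2 = [row[:] for row in B]
B2[3] = [a - b for a, b in zip(B2[3], B2[2])]
B2[0] = [a - b for a, b in zip(B2[0], B2[1])]

print(det(B), det(B2))  # prints: 100 100
```

The two values agree because these are elementary row operations of the determinant-preserving kind (no swaps, no scaling).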

Page 741 here.

References gathered from J.D.'s comments, for further research:

  1. http://en.wikipedia.org/wiki/Determinant#Block_matrices

  2. http://rscosan.com/documents/RCTM08_rcostas.pdf

  3. http://mth.kcl.ac.uk/~jrs/gazette/blocks.pdf

  • Matrix $B$ is not symmetric. (2012-03-05)
  • @GerryMyerson: yes, but the matrices inside are. I am trying to break this puzzle into parts and find some elegant way to solve it (without brute force, actually doing everything one by one). (2012-03-05)
  • ...well, I solved this problem a long time ago, knowing that a row swap flips the sign of the determinant, so this is not really homework. I am more interested in a general theorem that deals with this kind of symmetric inner matrices, so do not provide the solution but address the general case. (2012-03-05)
  • Semi-relevant: http://en.wikipedia.org/wiki/Determinant#Block_matrices (2012-03-05)
  • and slide 7 of this: http://www.rscosan.com/documents/RCTM08_rcostas.pdf (2012-03-05)
  • If you know that matrix $B$ isn't symmetric, you shouldn't write "symmetric matrix such as $B$." Writing $B$ as a sum of symmetric matrices won't help unless you have some formula for $\det(X+Y)$ in terms of $\det X$ and $\det Y$. I think there isn't one in general, and I doubt there's one even if you insist on $X$ and $Y$ being symmetric. (2012-03-05)
  • Also this: http://www.mth.kcl.ac.uk/~jrs/gazette/blocks.pdf (2012-03-05)
  • @J.D.: I gathered the references into the question (trying to keep things tidy), but you could easily make an answer with the details from the comments, perhaps clarifying things. (2012-03-05)

1 Answer


Since the matrices $S_{2,2,2}$ and $I_{2,2}$ commute, it follows that $\det(B)=\det(I_{2,2}I_{2,2}-S_{2,2,1}S_{2,2,2})$.
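This identity is easy to check numerically. The pure-Python sketch below (helper names are mine) uses the question's blocks, writing `J` for the all-ones $2\times 2$ block that the question labels $I_{2,2}$; it confirms both that the lower blocks commute and that $\det(B)$ equals the $2\times 2$ determinant from the formula:

```python
def det4(m):
    # Laplace expansion along the first row; adequate for a 4x4 matrix
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det4([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

def mul2(X, Y):  # 2x2 matrix product
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def det2(m):  # 2x2 determinant
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

J  = [[1, 1], [1, 1]]   # the block the question calls I_{2,2}
S1 = [[4, 5], [5, 4]]   # S_{2,2,1}, the upper-right block
S2 = [[2, 4], [4, 2]]   # S_{2,2,2}, the lower-left block

B = [[1, 1, 4, 5],
     [1, 1, 5, 4],
     [2, 4, 1, 1],
     [4, 2, 1, 1]]

assert mul2(S2, J) == mul2(J, S2)  # the lower-left and lower-right blocks commute
JJ_minus_S1S2 = [[a - b for a, b in zip(ra, rb)]
                 for ra, rb in zip(mul2(J, J), mul2(S1, S2))]
print(det4(B), det2(JJ_minus_S1S2))  # prints: 100 100
```

Both sides come out to $100$, so the $4\times 4$ determinant reduces to a single $2\times 2$ determinant, which is the speed-up the question was after.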

  • ...why do the matrices $S_{2,2,2}$ and $I_{2,2}$ need to commute? (2012-03-05)
  • @hhh $\det\begin{pmatrix}A&B\\C&D\end{pmatrix} = \det(D)\det(A-BD^{-1}C)$ $= \det(A-BD^{-1}C)\det(D) = \det(AD - BD^{-1}CD)$. The first equality is a property of determinants, the second holds because determinants are scalars and scalars commute, and the third is multiplicativity of the determinant. If $C$ and $D$ commute then $CD = DC$, so $BD^{-1}CD = BD^{-1}DC = BC$, hence $\det(\cdot) = \det(AD-BC)$. In your case, $A = D = I_{2,2}$, $B = S_{2,2,1}$, and $C = S_{2,2,2}$. (2012-03-05)
  • I wish I could somehow copy-paste this LaTeX into a separate answer to make it more visually pleasing, @mods anyway? Also, I would like to move the references into an answer... (2012-03-05)