
Edit: I'm genuinely not sure why this has gotten so little activity. If somebody knows, please tell me, so I can rework it.


As a note: I am a purist, and really want to see a proof of this, but I've also got course obligations to fulfill, and so not much time. I understand that the answer involves group theory, where my background is weak, so more complete answers would really help me.

Let $A$ be an $n\times n$ matrix of real numbers. Define the $(i, j)$th minor $M_{(i, j)}$ of $A$ to be the determinant of the $(n-1) \times (n-1)$ matrix obtained by removing the $i$th row and $j$th column from $A$ - e.g.,

$$A = \begin{bmatrix}1 & 4 & 2 \\9 & 4 & 6 \\8 & 3 & 1 \end{bmatrix} \implies M_{(1, 1)} = \begin{vmatrix}4 & 6 \\3 & 1 \end{vmatrix}, \text{ }M_{(2,1)} = \begin{vmatrix} 4 & 2 \\ 3 & 1 \end{vmatrix}, \mathrm{etc.}$$
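As a quick numerical check of these two minors, here is a short Python sketch (my own illustration, not part of the question); the helper names `submatrix` and `det2` are just made up for this example:

```python
def submatrix(A, i, j):
    # Delete row i and column j (1-indexed, matching the notation above).
    return [[A[r][c] for c in range(len(A)) if c != j - 1]
            for r in range(len(A)) if r != i - 1]

def det2(B):
    # Determinant of a 2x2 matrix: ad - bc.
    return B[0][0] * B[1][1] - B[0][1] * B[1][0]

A = [[1, 4, 2],
     [9, 4, 6],
     [8, 3, 1]]

M_11 = det2(submatrix(A, 1, 1))  # |4 6; 3 1| = 4*1 - 6*3 = -14
M_21 = det2(submatrix(A, 2, 1))  # |4 2; 3 1| = 4*1 - 2*3 = -2
```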

Define the $(i,j)$th cofactor $C_{(i, j)}$ by $C_{(i, j)} = A_{(i,j)}(-1)^{i + j}M_{(i,j)}$ - i.e., multiply the product $A_{(i,j)}M_{(i,j)}$ of the entry and its minor by the sign in the $(i,j)$th position of the matrix

$$\mathrm{sgn} = \begin{bmatrix} + & - & + & \cdots \\- & + & - & \cdots \\ + & - & + &\cdots \\ \vdots & \vdots & \vdots & \ddots \end{bmatrix};$$

thus,

$$A = \begin{bmatrix}1 & 4 & 2 \\9 & 4 & 6 \\8 & 3 & 1 \end{bmatrix} \implies C_{(1, 1)} = 1 \times \begin{vmatrix}4 & 6 \\3 & 1 \end{vmatrix}, \text{ }C_{(2,1)} = -9 \times \begin{vmatrix} 4 & 2 \\ 3 & 1 \end{vmatrix}.$$

Finally, define the cofactor expansion $D_i(A)$ along the $i$th row of $A$ by $D_i(A) = \sum_{k = 1}^n C_{(i, k)}$. For example, we get

\begin{eqnarray} D_1 \Bigg( \begin{bmatrix}1 & 4 & 2 \\9 & 4 & 6 \\8 & 3 & 1 \end{bmatrix} \Bigg) = C_{(1, 1)} + C_{(1, 2)} + C_{(1, 3)} = \begin{vmatrix}4 & 6 \\3 & 1 \end{vmatrix} - 4 \begin{vmatrix} 9 & 6 \\ 8 & 1 \end{vmatrix} + 2\begin{vmatrix}9 & 4 \\8 & 3 \end{vmatrix} = 132 \\ D_2 \Bigg( \begin{bmatrix}1 & 4 & 2 \\9 & 4 & 6 \\8 & 3 & 1 \end{bmatrix} \Bigg) = C_{(2, 1)} + C_{(2, 2)} + C_{(2, 3)} = -9 \begin{vmatrix}4 & 2 \\3 & 1 \end{vmatrix} + 4 \begin{vmatrix} 1 & 2 \\ 8 & 1 \end{vmatrix} - 6\begin{vmatrix}1 & 4 \\8 & 3 \end{vmatrix} = 132 \\ D_3 \Bigg( \begin{bmatrix}1 & 4 & 2 \\9 & 4 & 6 \\8 & 3 & 1 \end{bmatrix} \Bigg) = C_{(3, 1)} + C_{(3, 2)} + C_{(3, 3)} = 8 \begin{vmatrix}4 & 2 \\4 & 6 \end{vmatrix} - 3 \begin{vmatrix} 1 & 2 \\ 9 & 6 \end{vmatrix} +\begin{vmatrix}1 & 4 \\9 & 4 \end{vmatrix} = 132. \end{eqnarray}

The equality of each of these expansions serves as an example of the theorem I want to see proven:

Theorem: For any row $\rho_i$ of a given matrix $A$ (or any column, with a definition analogous to that for rows), the cofactor expansion $D_i(A)$ yields the same numerical value.
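For anyone who wants a numerical sanity check of the theorem before proving it, here is a short Python sketch (again my own illustration, using the question's convention that $C_{(i,j)}$ includes the entry $A_{(i,j)}$); the function name `cofactor_det` is hypothetical:

```python
def cofactor_det(A, i=1):
    # Cofactor expansion along the ith row (1-indexed), recursing on the
    # minors, with C_(i,j) = A_(i,j) * (-1)^(i+j) * M_(i,j) as defined above.
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(1, n + 1):
        minor = [[A[r][c] for c in range(n) if c != j - 1]
                 for r in range(n) if r != i - 1]
        total += A[i - 1][j - 1] * (-1) ** (i + j) * cofactor_det(minor)
    return total

A = [[1, 4, 2],
     [9, 4, 6],
     [8, 3, 1]]

# Expansion along each of the three rows; the theorem says all agree.
values = [cofactor_det(A, i) for i in (1, 2, 3)]
```

Running this on the example matrix gives $132$ for every row, matching the three expansions computed above (of course, a finite check is no substitute for the proof being asked for).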


Some notes:

(1) (Yes, I do realize that all the $\TeX$ belies my claim to not have much time. What I mean is that I could use some instruction - as a way of saving quite a bit of valuable time - since I've never taken a course in algebra before.)

(2) I admit that I don't really understand the Leibniz formula, so I don't actually have an immediate definition of determinant to fall back on inductively. One way around this is to prove the cofactor theorem by induction on $n$. Since every property of the determinant follows (easily) from the cofactor theorem, the above theorem is all I need to have proof of at the moment.

  • @YACP I looked at [Wikipedia](http://en.wikipedia.org/wiki/Laplace_expansion) and found a related proof involving group theory. As far as I can tell, the tags abstract-algebra and finite-groups do apply. (2012-12-05)
  • Perhaps you can convince yourself that each term in the full expansion of the determinant will be plus or minus $\langle P,A\rangle=\mathrm{tr}(P^TA)$ for every permutation matrix $P$. Then figure out how each negative sign from a cofactor represents a single transposition, so the product of these signs gives the parity of the permutation (matrix) resulting from going all the way down to one of the smallest Russian dolls in the cofactor expansion and marking $1$s down in the matrix where we select our entries. (2012-12-05)
  • @anon If you could argue this further, I would appreciate it. When you use "determinant" in your first line, are you referring to the Leibniz determinant? (2012-12-05)
  • I am referring to the determinant simultaneously as the unique expression resulting from the various cofactor expansions and as the expression given by the Leibniz formula; when thinking along the lines I gave (in need of elaboration at the moment), it is fairly evident they coincide. (2012-12-05)
  • @anon Yeah, let me be clear: I don't understand the Leibniz formula. I'm looking at permutation parity right now. (2012-12-05)

1 Answer