
Theorem

Let $A$ be an $n \times n$ matrix, and let $B$ be the $n \times n$ matrix obtained by performing a row swap operation on $A$. (Column swaps are omitted for brevity.)

Let $b_{ij}$ be an element of $B$ at row $i$ column $j$.

Let $C_{ij} = (-1)^{i+j}M_{ij}$ be the cofactor of $b_{ij}$, where $M_{ij}$ is the corresponding minor. Define the minors and cofactors of $a_{ij}$ and $A$ analogously.

Then

$$\det B = \sum^{n}_{k=1} b_{ik}C_{ik}=-\det A= -\sum^n_{k=1}a_{ik}C_{ik}$$

Here I am taking as given that cofactor expansion along any row of $A$ always produces $\det A$, and likewise for $\det B$ and any row of $B$.
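As a quick sanity check of the claim (a hypothetical $2 \times 2$ instance, not part of the original statement), take

$$A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}, \qquad B = \begin{pmatrix} c & d \\ a & b \end{pmatrix}, \qquad \det A = ad - bc, \qquad \det B = cb - da = -(ad - bc) = -\det A.$$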

Problem

I've been in the process of trying to prove this. Here are my thoughts, so far.

If $i$ is the row that's been interchanged with some other row $l \ne i$, then we can make the connection that:

$$a_{ij} = b_{lj}$$

and

$$a_{lj} = b_{ij}$$

Since $i \ne l$, then $(-1)^{l+j}$ doesn't necessarily equal $(-1)^{i+j}$.

However, if $i$ and $l$ have the same parity (both even or both odd), then $(-1)^{i+j} = (-1)^{l+j}$, so the signs of the cofactors do not change, which means we cannot in general factor a $-1$ out of either row via the cofactor signs alone.

Question

I'm not looking for someone to give me the proof; I'd like to solve it myself. What I'm trying to figure out is whether or not I'm on the right track for finding the relationship, and if not, what a good direction to point me in would be.

  • Comment: The development by minors is not a viable approach for this proof, because the minors that contain the swapped rows will take unrelated values. (2017-02-03)

3 Answers


I would work from a different definition of the determinant: either the view of $\det$ as an alternating multilinear form, $$ \det : V^n \to F, \qquad \det(A) = \det(a_1, \dotsc, a_n) \ \text{ where } A = (a_1, \dotsc, a_n), $$ or the Leibniz formula $$ \DeclareMathOperator{\sgn}{sgn} \det(A) = \sum_{\pi \in S_n} \sgn(\pi) \, a_{1\pi(1)} \dotsb a_{n\pi(n)}. $$
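The Leibniz formula is easy to check numerically. Here is a minimal sketch in plain Python (the helper names `sgn` and `det` and the sample matrix are my own, not from the answer) confirming that swapping two rows negates the determinant:

```python
from itertools import permutations

def sgn(pi):
    # Sign of a permutation (given as a tuple): (-1)^(number of inversions).
    inv = sum(1 for a in range(len(pi))
                for b in range(a + 1, len(pi)) if pi[a] > pi[b])
    return -1 if inv % 2 else 1

def det(A):
    # Leibniz formula: sum over all permutations pi of sgn(pi) * prod_i A[i][pi(i)].
    n = len(A)
    total = 0
    for pi in permutations(range(n)):
        term = sgn(pi)
        for i in range(n):
            term *= A[i][pi[i]]
        total += term
    return total

A = [[2, 1, 3],
     [0, 4, 1],
     [5, 2, 2]]
B = [A[2], A[1], A[0]]  # A with its first and third rows swapped
print(det(A), det(B))   # the two values differ only in sign
```

The same check works for any size, since `sgn` and `det` are written for general $n$.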


To swap the $i$th and $j$th rows of $A$, multiply $A$ on the left by the matrix $M$ defined by $M_{i,j} = 1$, $M_{j,i} = 1$, $M_{k,k} = 1$ for every $k \ne i$ and $k \ne j$, and $0$s elsewhere.

$$MA = B, \qquad \det(MA) = \det M \, \det A, \qquad \det M = -1,$$ so $\det B = -\det A$.
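A small sketch of this argument in plain Python (the helper names `swap_matrix`, `matmul`, and `det`, and the sample matrix, are illustrative assumptions):

```python
def swap_matrix(n, i, j):
    # Elementary matrix M: identity with rows i and j exchanged, i.e.
    # M[i][j] = M[j][i] = 1, M[k][k] = 1 for k != i, j, zeros elsewhere.
    M = [[1 if r == c else 0 for c in range(n)] for r in range(n)]
    M[i][i] = M[j][j] = 0
    M[i][j] = M[j][i] = 1
    return M

def matmul(X, Y):
    n = len(X)
    return [[sum(X[r][k] * Y[k][c] for k in range(n)) for c in range(n)]
            for r in range(n)]

def det(A):
    # Cofactor expansion along the first row (fine for small matrices).
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det([row[:j] + row[j + 1:] for row in A[1:]])
               for j in range(len(A)))

A = [[2, 1, 3], [0, 4, 1], [5, 2, 2]]
M = swap_matrix(3, 0, 2)  # swaps the first and third rows (0-indexed)
B = matmul(M, A)          # B is A with those rows exchanged
print(det(M))             # -1
```

Multiplying on the left permutes rows; multiplying the same $M$ on the right would permute columns instead.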


The determinant is the sum, over all $n!$ permutations $\sigma_k$, of the products $\prod_{i=1}^n a_{i,\sigma_k(i)}$, each weighted by a $\pm1$ signature depending on the parity of the number of inversions of $\sigma_k$.

Swapping two rows amounts to reordering the terms of this sum. By commutativity of the product every term keeps its value, but every signature is negated, because composing a permutation with a transposition changes its number of inversions by an odd amount.

For the sake of illustration,

  • before: $123+,231+,312+,321-,132-,213-$

  • after swapping $2,3$: $132-,321-,213-,231+,123+,312+$
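The illustration above can be reproduced mechanically; a short sketch in plain Python (the `sign` helper is an assumed name) composing each permutation of $\{1,2,3\}$ with the transposition $2 \leftrightarrow 3$ and watching every signature flip:

```python
from itertools import permutations

def sign(pi):
    # +1 for an even number of inversions, -1 for an odd number.
    inv = sum(1 for a in range(len(pi))
                for b in range(a + 1, len(pi)) if pi[a] > pi[b])
    return -1 if inv % 2 else 1

# Swapping the values 2 and 3 in each permutation (composition with a
# transposition) negates every signature, matching the lists above.
swap = {1: 1, 2: 3, 3: 2}
for pi in permutations((1, 2, 3)):
    tau_pi = tuple(swap[v] for v in pi)
    print(pi, sign(pi), "->", tau_pi, sign(tau_pi))
```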