Let $A$ be an invertible $n\times n$ matrix. Define the matrix of minors $\Delta(A)$ of $A$ to be the matrix whose $(i,j)$ entry is the minor of $A$ with the $i$th row and $j$th column omitted, that is, the determinant of the resulting $(n-1)\times(n-1)$ submatrix.
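
For example, in the $2\times 2$ case $$\Delta\begin{pmatrix} a & b\\ c & d\end{pmatrix} = \begin{pmatrix} d & c\\ b & a\end{pmatrix},$$ so for $n=2$, taking the matrix of minors simply rotates the matrix by $180$ degrees.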

Certainly, the most important property of $\Delta(A)$ is that it is instrumental in computing the inverse of $A$: $$ A^{-1}= \frac{1}{\det(A)}\,\sigma\cdot \Delta(A)^T\cdot \sigma, $$ where $\sigma$ is the diagonal matrix whose diagonal entries alternate between $1$ and $-1$.
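
Entrywise, this is the usual cofactor formula: conjugation by $\sigma$ places the sign $(-1)^{i+j}$ on the $(i,j)$ entry, so the formula reads $$ (A^{-1})_{ij} = \frac{(-1)^{i+j}}{\det(A)}\,\Delta(A)_{ji}; $$ in other words, $\sigma\cdot\Delta(A)^T\cdot\sigma$ is exactly the adjugate of $A$.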

From this fact and elementary properties of the inverse, it is easy to prove the following:

  • Taking the matrix of minors is an involution up to scaling; that is, $\Delta(\Delta(A))=\det(A)^{n-2}\cdot A$.
  • Taking the matrix of minors is a group homomorphism; that is, $\Delta(AB)=\Delta(A)\Delta(B)$.
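
Both follow quickly from the inverse formula: rewriting it as $\Delta(A) = \det(A)\cdot\sigma\cdot (A^T)^{-1}\cdot\sigma$ and using $\sigma^2=I$ gives $$ \Delta(AB) = \det(AB)\cdot\sigma\cdot (B^T A^T)^{-1}\cdot\sigma = \Big(\det(A)\,\sigma (A^T)^{-1}\sigma\Big)\Big(\det(B)\,\sigma (B^T)^{-1}\sigma\Big) = \Delta(A)\Delta(B), $$ while applying the same rewriting twice, together with $\det(\Delta(A)) = \det(A)^{n-1}$, gives $\Delta(\Delta(A)) = \det(A)^{n-2}\cdot A$.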

If you actually write out either of these identities in terms of minors, you get a family of non-trivial-looking identities among the minors of an invertible matrix.
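
For instance, written out entrywise, the identity $\Delta(AB)=\Delta(A)\Delta(B)$ is (after reindexing minors by complementary subsets) the Cauchy–Binet identity for minors of size $n-1$: $$ [AB]_{I,J} = \sum_{K} [A]_{I,K}\,[B]_{K,J}, $$ where $[M]_{I,J}$ denotes the minor of $M$ on rows $I$ and columns $J$, and $I$, $J$, $K$ range over $(n-1)$-element subsets of $\{1,\dots,n\}$.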

Is this the easiest way to obtain these identities on minors? It would bother me if it were, for a couple of reasons. First, it seems extremely likely that the second identity above holds for non-invertible matrices as well. Second, if $A$ is totally positive or totally non-negative (that is, every minor of every size is positive or non-negative, respectively), then so is $\Delta(A)$. However, the inverse of a totally positive matrix is almost never totally positive, and so the above proof of the identities strays off the totally positive path.
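
As a concrete illustration, $A = \left(\begin{smallmatrix} 1 & 1\\ 1 & 2\end{smallmatrix}\right)$ is totally positive, and so is $\Delta(A) = \left(\begin{smallmatrix} 2 & 1\\ 1 & 1\end{smallmatrix}\right)$, but $A^{-1} = \left(\begin{smallmatrix} 2 & -1\\ -1 & 1\end{smallmatrix}\right)$ already has negative entries.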

Therefore,

Can the identities on minors implied by the above relations on $\Delta$ be proven in a more direct way?

Ideally, this proof would be 'subtraction-free', to make it more natural in the totally positive setting.

Edit: I should be a little more clear about what sort of thing I am looking for. I am interested in spaces of totally positive matrices and the corresponding algebras generated by minors on them (specifically, they are cluster algebras). An important involution of these spaces and algebras is the matrix of minors defined above. Note that taking inverses or cofactors does not preserve total positivity, and so those operations don't act on this space.

Mainly, I asked this question because I was annoyed at using a map whose properties couldn't be proven without passing to a different, bigger space (not that this doesn't happen all the time in math). I was curious whether there was a proof along algebraic lines, following only from the simplest relations on minors (the 3-term Plücker relations).
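
For reference, the 3-term Plücker relations in their subtraction-free form read $$ \Delta_{S\cup\{a,c\}}\,\Delta_{S\cup\{b,d\}} \;=\; \Delta_{S\cup\{a,b\}}\,\Delta_{S\cup\{c,d\}} \;+\; \Delta_{S\cup\{a,d\}}\,\Delta_{S\cup\{b,c\}} $$ for column indices $a<b<c<d$ not in $S$, where $\Delta_J$ denotes the maximal minor on the columns $J$; every term carries a plus sign, which is what makes these relations natural in the totally positive setting.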

  • Every identity which is polynomial in the entries of the matrices involved and which holds for invertible matrices in fact holds for all matrices. This is a consequence of continuity or, as Bill prefers, of universality. (2011-02-07)
  • What exactly do you mean by 'subtraction-free'? Do you have a subtraction-free way of defining the actual minors? (2011-02-07)
  • While the definition of the minors involves subtraction, the 3-term Plücker relations are most naturally written in a subtraction-free way. This allows you to write all the minors of a matrix as subtraction-free expressions in a relatively small number of minors (say, for instance, the solid minors touching the left edge or the top edge). (2011-02-07)
  • I was hoping for a proof involving simpler minor identities (like Dodgson's identity, recalled below), combined or iterated in a way which preserves positivity. It isn't meant as a specific request, so much as a hope for tools more 'native' to a theory than the existence of inverses. (2011-02-07)
  • I misinterpreted how much you care about positivity, so I deleted my answer. (2011-02-07)
  • It was a fine answer; I'm just sorry I'm not able to convey the sort of thing I am looking for. Mostly, I was wondering if there was a well-known 'simple' proof of these things independent of its relation to inverses and cofactors, and I suspect the answer is no. (2011-02-07)
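
For reference, Dodgson's identity (the Desnanot–Jacobi identity) mentioned in the comments can itself be written in subtraction-free form: $$ \det(A^{1}_{1})\,\det(A^{n}_{n}) \;=\; \det(A)\,\det(A^{1,n}_{1,n}) \;+\; \det(A^{1}_{n})\,\det(A^{n}_{1}), $$ where $A^{i}_{j}$ denotes $A$ with row $i$ and column $j$ deleted, and $A^{1,n}_{1,n}$ denotes $A$ with rows $1,n$ and columns $1,n$ deleted.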
