
I was wondering if the following lemma was easy to prove. I got a little tripped up when I saw the explicit condition that the matrix could have elements in any field not necessarily finite, so I didn't know if a standard proof from linear algebra would still apply.

Let $M$ be a matrix with entries in a (possibly infinite) field $F$.

Suppose that there exists a minor $m_n$ of order $n$ for $M$ that is nonzero and such that all minors of order $n+1$ which contain $m_n$ are zero.

How do you show the rank of the matrix is $n$?

1 Answer

That there exists a non-zero minor of order $n$ implies that the matrix has at least $n$ linearly independent rows/columns (namely the rows/columns of the submatrix corresponding to the minor). The rank is thus at least $n$. If the rank were larger than $n$, there would be at least $n+1$ linearly independent rows, and an order-$(n+1)$ submatrix built from those rows would be invertible and have non-zero determinant, contradicting the assumption that all order-$(n+1)$ minors are zero. This shows that the rank must be $n$.

Edit: As Jyrki pointed out, only the minors of order $n+1$ that contain $m_n$ are assumed to vanish. So let us proceed another way. The whole matrix cannot have full rank, since it contains $m_n$. That means there is at least one linearly dependent row and column not passing through $m_n$, since all rows and columns through $m_n$ are linearly independent. Removing that row and column produces a smaller matrix which again contains $m_n$. We can repeat this procedure until all that remains is $m_n$, proving the rank is $n$.
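The bordered-minor criterion in the question can be checked numerically. Here is a small sketch; the matrix, tolerance, and index choices are illustrative assumptions of mine, not taken from the question:

```python
import numpy as np
from itertools import product

# Illustrative 3x4 example (my own choice): the third row is the sum
# of the first two, so the rank is n = 2.
M = np.array([[1.0, 0.0, 2.0, 3.0],
              [0.0, 1.0, 4.0, 5.0],
              [1.0, 1.0, 6.0, 8.0]])

# m_n: the nonzero order-2 minor in rows {0, 1} and columns {0, 1}.
rows, cols = [0, 1], [0, 1]
m_n = np.linalg.det(M[np.ix_(rows, cols)])

# Every order-3 minor containing m_n is obtained by bordering it with
# one extra row and one extra column.
extra_rows = [r for r in range(M.shape[0]) if r not in rows]
extra_cols = [c for c in range(M.shape[1]) if c not in cols]
bordered = [np.linalg.det(M[np.ix_(rows + [r], cols + [c])])
            for r, c in product(extra_rows, extra_cols)]

print(abs(m_n) > 1e-12)                       # True: m_n is nonzero
print(all(abs(d) < 1e-12 for d in bordered))  # True: bordered minors vanish
print(np.linalg.matrix_rank(M))               # 2, i.e. the rank equals n
```

This only exercises the hypothesis and conclusion on one concrete matrix, of course; it is not a substitute for the proof above.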

  • Well... there is this minor detail that the assumption **only** said that those minors of size $n+1$ that **contain the given minor** of size $n$ vanish. So how do you rule out the possibility that there might be a non-vanishing minor of size $n+1$ that does not contain the given minor? ;-> (2011-10-26)
  • @Jyrki Thanks for pointing that out. I have provided an alternative argument. Please see if this argument is sufficient. (2011-10-26)
  • Much improved, but it sounds like you are assuming that the matrix is square? (2011-10-26)
  • I don't think I ever assumed that. The wording of the last sentence is perhaps a little confusing. We can keep removing columns/rows until we exhaust one or the other. At that point, because $m_n$ contains either all rows or all columns of the matrix, the rank has to be $n$. (2011-10-26)
  • As an example, suppose we have an $M \times N$ matrix with $N > M$. Then we would begin by taking the largest possible submatrix containing $m_n$, of size $M \times M$. This submatrix has a linearly dependent column, which we can remove to produce an $M \times (N-1)$ matrix. This process is repeated until we have just an $M \times M$ matrix, whereupon we can start removing columns and rows until all we're left with is $m_n$. (2011-10-26)
  • +1 Yeah, you're right. I thought that you would just be removing either columns or rows, but you can do it this way. (2011-10-26)
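The reduction procedure described in these comments can be sketched in code. The matrix below, and the order in which rows/columns are dropped, are my own assumptions for illustration; the point is just that each deletion removes a dependent line, so the rank stays at $n$ throughout:

```python
import numpy as np

# Illustrative 3x4 matrix (an assumption, not from the thread): every row
# and column outside the top-left 2x2 block carrying m_n is linearly
# dependent on the rows/columns through m_n.
M = np.array([[1.0, 0.0, 2.0, 3.0],
              [0.0, 1.0, 4.0, 5.0],
              [1.0, 1.0, 6.0, 8.0]])

n = 2  # order of the nonzero minor m_n
A = M.copy()
while A.shape != (n, n):
    # First shave off extra columns, then extra rows, as in the comment
    # above: each deleted line is dependent, so the rank never changes.
    axis = 1 if A.shape[1] > n else 0
    A = np.delete(A, -1, axis=axis)
    assert np.linalg.matrix_rank(A) == n

print(A)  # what remains is the 2x2 submatrix whose determinant is m_n
```

The choice of always deleting the last row/column works here only because this example was built that way; in general one would delete a row/column known to be dependent.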