
The following is a problem from Daniel Norman's Introduction to Linear Algebra. Consider an $m\times n$ matrix $M$ with rank $r$. We wish to show that the matrix $M^\mathrm{T}M$ also has rank $r$. Now, I can show this fact by proving that the nullspace of $M^\mathrm{T}M$ is the same as the nullspace of $M$, whereupon the rank-nullity theorem takes care of the rest, but the hint of the problem proposes something confusing to me.

The hint suggests looking at the matrix $N$ obtained by exchanging two columns of $M$. It then asks "What is the relationship between $N^\mathrm{T}N$ and $M^\mathrm{T}M$?"

I can see that the two matrices are orthogonally similar, more specifically I can see that

$E^\mathrm{T}M^\mathrm{T}ME = N^\mathrm{T}N$ where $E = E^\mathrm{T}$ is an elementary matrix exchanging the columns in question. But I fail to see anything which leads to a resolution of this problem. Can anyone offer some help?
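The relation $E^\mathrm{T}M^\mathrm{T}ME = N^\mathrm{T}N$ can be checked numerically. The following is a minimal sketch using NumPy; the dimensions, seed, and choice of which columns to swap are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 4
M = rng.standard_normal((m, n))

# E: the identity with columns 0 and 2 exchanged, so E = E^T and E @ E = I
E = np.eye(n)
E[:, [0, 2]] = E[:, [2, 0]]

N = M @ E                      # N is M with columns 0 and 2 swapped
lhs = E.T @ M.T @ M @ E
rhs = N.T @ N
assert np.allclose(lhs, rhs)   # E^T (M^T M) E  =  N^T N
```

Since $E^\mathrm{T}=E=E^{-1}$ for a column exchange, the two Gram matrices are indeed orthogonally similar, so they have the same rank.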

1 Answer

I think that the point is that you can do more than one elementary operation. Rank $r$ means that there are exactly $r$ linearly independent columns. So you can find a permutation matrix $E$ (now a product of elementary column-exchange matrices) such that $N=ME$ has its first $r$ columns linearly independent, and the last $n-r$ are linear combinations of the first $r$.
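This column-permutation step can be illustrated concretely. Below is a sketch with a hand-built rank-$2$ matrix whose independent columns sit in positions $0$ and $2$; the permutation matrix $P$ (playing the role of $E$ above) moves them to the front:

```python
import numpy as np

rng = np.random.default_rng(1)
m, r, n = 6, 2, 4
A = rng.standard_normal((m, r))      # r linearly independent columns

# M: columns 0 and 2 are A's columns; columns 1 and 3 are combinations of them
M = np.column_stack([A[:, 0], A[:, 0] + A[:, 1], A[:, 1], 2 * A[:, 0]])
assert np.linalg.matrix_rank(M) == r

# Permutation matrix sending columns (0, 2) to the front: N = M P
perm = [0, 2, 1, 3]
P = np.eye(n)[:, perm]
N = M @ P
assert np.linalg.matrix_rank(N[:, :r]) == r   # first r columns independent
```

The remaining columns of $N$ are then, by construction, linear combinations of the first $r$, which is exactly the form used in the argument below.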

We thus have $N=[N_1\cdots N_n]$, where $N_1,\ldots,N_n$ are the columns of $M$, reordered so that the first $r$ are linearly independent and the last $n-r$ are linear combinations of the first $r$. So
$$N=\Big[N_1\cdots N_r\ \ \sum_{k=1}^r\alpha_{1,k}N_k\cdots\sum_{k=1}^r\alpha_{n-r,k}N_k\Big],$$
and the first $r$ rows of $N^TN$ are
$$\begin{bmatrix} N_1^TN_1&\cdots&N_1^TN_r&\sum_{k=1}^r\alpha_{1,k}N_1^TN_k&\cdots&\sum_{k=1}^r\alpha_{n-r,k}N_1^TN_k\\ \vdots& &\vdots&\vdots& &\vdots\\ N_r^TN_1&\cdots&N_r^TN_r&\sum_{k=1}^r\alpha_{1,k}N_r^TN_k&\cdots&\sum_{k=1}^r\alpha_{n-r,k}N_r^TN_k \end{bmatrix}$$
(the remaining $n-r$ rows follow the same pattern).

We see that the last $n-r$ columns are linear combinations of the first $r$, so the rank of $N^TN$ is at most $r$.

But the first $r$ columns are linearly independent: if
$$0=\sum_{k=1}^r\beta_k\begin{bmatrix}N_1^TN_k\\ \vdots \\N_r^TN_k\end{bmatrix}$$
for some coefficients $\beta_1,\ldots,\beta_r$, then
$$0=\sum_{k=1}^r\beta_kN_j^TN_k,\qquad j=1,\ldots,r.$$
But then
$$\left(\sum_{k=1}^r\beta_kN_k\right)^T\sum_{k=1}^r\beta_kN_k =\sum_{j=1}^r\beta_j\left(\sum_{k=1}^r\beta_kN_j^TN_k\right)=0.$$
Since $v^Tv=\|v\|^2=0$ forces $v=0$, we get $\sum_{k=1}^r\beta_kN_k=0$, and the linear independence of $N_1,\ldots,N_r$ implies that $\beta_1=\beta_2=\cdots=\beta_r=0$.

Thus the rank of $N^TN$ is at least $r$. But we knew it was also at most $r$, so the rank of $N^TN$ is $r$. As the rank is preserved by similarity, the rank of $M^TM$ equals that of $N^TN$ and is $r$.
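The conclusion $\operatorname{rank}(M^TM)=\operatorname{rank}(M)$ is easy to spot-check numerically. A quick sketch, building a random $m\times n$ matrix of known rank $r$ as a product of full-rank factors (dimensions and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, r = 7, 5, 3
# An m×n matrix of rank r: product of m×r and r×n full-rank factors
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

assert np.linalg.matrix_rank(M) == r
assert np.linalg.matrix_rank(M.T @ M) == r
```

This does not replace the proof, of course, but it catches any slip in the argument on concrete data.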