
Prove or disprove:

Given a square matrix $A$, the columns of $A$ are linearly independent if and only if the rows of $A$ are linearly independent.

  • I didn't answer, simply because you haven't mentioned what you're allowed to use. For instance, one (lazy) way of seeing this is that since a matrix and its transpose are similar matrices, the singularity of one implies the singularity of the other. But that might be too high-powered for the matter at hand... (2012-07-20)
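(As a quick sanity check of that shortcut, here is a small NumPy sketch of my own, not from the thread; it only verifies the weaker fact that $\det(A)=\det(A^T)$, which is all the singularity argument needs. The random matrix is a hypothetical example.)

```python
import numpy as np

# Hypothetical example: a random 4x4 integer matrix.
rng = np.random.default_rng(0)
A = rng.integers(-3, 4, size=(4, 4)).astype(float)

# det(A) = det(A^T), so A is singular exactly when A^T is,
# i.e. the columns are dependent exactly when the rows are.
print(np.isclose(np.linalg.det(A), np.linalg.det(A.T)))  # True
```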

1 Answer


Here's an argument more-or-less from first principles.

If the rows of $A$ are linearly independent, then row-reducing $A$ yields the identity matrix; since row operations preserve the solution set of $Av=0$, the only solution of $Av=0$ is $v=0$.

If the columns of $A$ are linearly dependent, say $a_1c_1+a_2c_2+\cdots+a_nc_n=0$ where the $c_i$ are the columns and the $a_i$ are not all zero, then $Av=0$ where $v=(a_1,a_2,\dots,a_n)\ne0$.

Combining these: if the columns are dependent, then $Av=0$ has a nonzero solution, so by the first paragraph the rows cannot be independent. That is, if the columns are dependent, then so are the rows.

Now apply the same argument to the transpose of $A$ (whose rows are the columns of $A$) to conclude that if the rows of $A$ are dependent, then so are the columns.
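Not part of the proof, but here is a quick NumPy illustration of the argument (the $3\times3$ matrix below is my own example, not from the answer): a dependence among the columns yields a nonzero null vector, and the rank computation confirms the rows are dependent as well.

```python
import numpy as np

# Example matrix (hypothetical): the third column is the sum
# of the first two, so the columns are linearly dependent.
A = np.array([[1., 2., 3.],
              [4., 5., 9.],
              [7., 8., 15.]])

# The dependence c1 + c2 - c3 = 0 gives a nonzero v with Av = 0.
v = np.array([1., 1., -1.])
print(A @ v)  # [0. 0. 0.]

# Row reduction cannot reach the identity: the rank is 2, not 3.
# Row rank equals column rank, so the rows are dependent too,
# and the same story holds for the transpose.
print(np.linalg.matrix_rank(A))    # 2
print(np.linalg.matrix_rank(A.T))  # 2
```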

  • I think the proof could be shorter: if the rows of $A$ are linearly independent, then row-reducing $A$ yields the identity matrix, so the only solution of $Av=0$ is $v=0$; that is, rows lin. ind. $\implies$ columns lin. ind. Now suppose the columns are independent: transpose the matrix and apply the same argument to get columns lin. ind. $\implies$ rows lin. ind. Together, rows lin. ind. $\iff$ columns lin. ind., which is equivalent to rows lin. dep. $\iff$ columns lin. dep. (2016-04-06)