4

Prove or disprove:

Given a square matrix $A$, the columns of $A$ are linearly independent if and only if the rows of $A$ are linearly independent.

  • 0
    i.e. "$\mathbf A$ is singular iff $\mathbf A^\top$ is singular"...2012-07-20
  • 0
    Column rank = row rank, or $\operatorname{rank}(A) = \operatorname{rank}(A^T)$. Read [this](http://en.wikipedia.org/wiki/Rank_(linear_algebra)#Column_rank_.3D_row_rank_or_rk.28A.29_.3D_rk.28AT.29)2012-07-20
  • 0
    @J.M. Linearly independent rows of $A$ are linearly independent columns of $A^T$, and linearly independent columns of $A^T$ make $A^T$ invertible, which in turn makes $A$ invertible, which finally gives linearly independent columns of $A$ (the chain is displayed below these comments). The reverse is also true. Hence, we have proved the problem statement. However, I'm having difficulty accepting/"seeing" the proof intuitively, even though I can logically make the connections. Help!2012-07-20
  • 0
    I didn't answer, simply because you haven't mentioned what you're allowed to use. For instance, one (lazy) way of seeing this is that since a matrix and its transpose are similar matrices, then the singularity of one implies the singularity of the other. But that might be too high-powered for the matter at hand...2012-07-20
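
To make the chain in the comment above easier to scan, here it is as a single displayed line of equivalences (a restatement only; each step is standard for square matrices):

$$\text{rows of }A\text{ indep.}\iff\text{columns of }A^T\text{ indep.}\iff A^T\text{ invertible}\iff A\text{ invertible}\iff\text{columns of }A\text{ indep.}$$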

1 Answer

8

Here's an argument more-or-less from first principles.

If the rows of $A$ are linearly independent, then the result of doing row-reduction to $A$ is the identity matrix, so the only solution of $Av=0$ is $v=0$.
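
For a concrete instance (a small illustrative example, not part of the original answer): the rows of $\begin{pmatrix}1&2\\3&4\end{pmatrix}$ are linearly independent, and row reduction does terminate at the identity:

$$\begin{pmatrix}1&2\\3&4\end{pmatrix}\xrightarrow{R_2\to R_2-3R_1}\begin{pmatrix}1&2\\0&-2\end{pmatrix}\xrightarrow{R_2\to-\frac12R_2}\begin{pmatrix}1&2\\0&1\end{pmatrix}\xrightarrow{R_1\to R_1-2R_2}\begin{pmatrix}1&0\\0&1\end{pmatrix}$$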

If the columns of $A$ are linearly dependent, say, $$a_1c_1+a_2c_2+\cdots+a_nc_n=0$$ where the $c_i$ are the columns and the $a_i$ are not all zero, then $Av=0$ where $$v=(a_1,a_2,\dots,a_n)\ne0$$
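
Concretely (again a small illustrative example): in $A=\begin{pmatrix}1&2\\2&4\end{pmatrix}$ the second column is twice the first, so $2c_1-c_2=0$, and indeed

$$\begin{pmatrix}1&2\\2&4\end{pmatrix}\begin{pmatrix}2\\-1\end{pmatrix}=\begin{pmatrix}0\\0\end{pmatrix},\qquad v=(2,-1)\ne0.$$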

So, if the columns are dependent, then so are the rows.

Now apply the same argument to the transpose of $A$ to conclude that if the rows of $A$ are dependent then so are the columns.
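
As a numerical sanity check of the equivalence (an illustration using NumPy's `matrix_rank`; not a proof, and no substitute for the argument above):

```python
import numpy as np

# Sanity check: an n x n matrix has linearly independent columns
# (full column rank) exactly when it has linearly independent rows
# (full row rank).
rng = np.random.default_rng(0)
for _ in range(1000):
    n = int(rng.integers(1, 6))
    A = rng.integers(-3, 4, size=(n, n)).astype(float)
    cols_indep = np.linalg.matrix_rank(A) == n    # columns independent?
    rows_indep = np.linalg.matrix_rank(A.T) == n  # rows independent?
    assert cols_indep == rows_indep
```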

  • 1
    I understand the proof, but still can't quite believe that columns of a square matrix being LI $\implies$ its rows are LI too. Perhaps I'll get it in the shower some day.2012-07-20
  • 0
    If you believe rows of $A$ lin. indep. implies columns of $A$ lin. indep., then you believe rows of $A$-transpose lin. indep. implies columns of $A$-transpose lin. indep. But the rows of $A$-transpose are the columns of $A$, and the columns of $A$-transpose are the rows of $A$.2012-07-21
  • 0
    I can't see how this answers the question.2015-05-10
  • 0
    @Don, I'm sorry to hear that. But if you want someone to help you see how it answers the question, you are going to have to be a lot more forthcoming about just what it is that you don't see.2015-05-10
  • 0
    I didn't know about this definition: "If the rows of A are linearly independent, then the result of doing row-reduction to A is the identity matrix, so the only solution of Av=0 is v=0." Where did you read it?2015-05-10
  • 0
    @Don, it's not a definition, it's a statement of fact. I don't remember where I first read it, but I prove it in lecture every time I teach Linear Algebra. Row operations do not affect linear dependence relations: rows are linearly dependent after row operations if and only if they were linearly dependent before the row operations (a small example of this is displayed at the end of this comment thread). The only $n\times n$ matrix in reduced row-echelon form with linearly independent rows is the identity matrix, so if the rows of $A$ are linearly independent then row reduction can only lead to the identity matrix. Continued, next comment.2015-05-10
  • 0
    @Don (continued from last comment), if $I$ is the identity matrix, then the only solution of $Iv=0$ is $v=0$, and row operations have no effect on the solutions to systems, so, if the rows of $A$ are linearly independent, then the only solution of $Av=0$ is $v=0$.2015-05-10
  • 0
    @Don, are we OK now?2015-05-12
  • 0
    Yes, didn't realize you meant the zero vector in this line: $$a_1c_1+a_2c_2+\cdots+a_nc_n=0$$2015-05-12
  • 0
    I think the proof could be shorter: If the rows of $A$ are linearly independent, then the result of doing row-reduction to $A$ is the identity matrix, so the only solution of $Av=0$ is $v=0$; hence rows linearly independent $\implies$ columns linearly independent. Now suppose the columns are independent: transpose the matrix and apply the same argument. This gives rows linearly independent $\iff$ columns linearly independent, which is equivalent to rows linearly dependent $\iff$ columns linearly dependent.2016-04-06
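
To illustrate the fact used in the comments above, that row operations do not affect linear dependence relations (a small made-up example): the rows $r_1=(1,2)$ and $r_2=(2,4)$ satisfy $r_2-2r_1=0$. After the row operation $R_2\to R_2+5R_1$ the rows become $r_1'=(1,2)$ and $r_2'=(7,14)$, and the dependence survives with new coefficients: $r_2'-7r_1'=0$. Since row operations are invertible, a nontrivial relation among the old rows always yields one among the new rows, and vice versa.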