1

I wish to prove the following, but I'm not sure if the steps of my proof are correct. Here it goes:

Suppose $A$ is an $m \times m$ matrix with $m$ pivot columns and that $v_1, \ldots, v_p$ is a linearly independent set of vectors in $\mathbb{R}^m$.

Is $Av_1, \ldots, Av_p$ a linearly independent set of vectors?

Here's an attempt at the proof:

As all the $v_i$'s are linearly independent, $v_i = v_j \implies i = j$ for all $1 \leq i, j \leq p$.

As $A$ has full rank, $A$ has a unique solution for every right hand side $b$ of the linear equation $Av = b$. That is, each $Av_i$, where $1 \leq i \leq p$, is a unique column vector, say $\bar{v}_i$, which is not a linear combination of any other $Av_j$.

So $Av_1, Av_2, \ldots, Av_p$ are linearly independent. Any flaws or mistakes in the logic?
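As a quick numerical sanity check (not a proof), I also tried a small made-up example; the particular matrix and vectors below are just for illustration:

```python
import numpy as np

# A hypothetical 3x3 matrix with 3 pivot columns (i.e., invertible);
# the specific numbers are only illustrative.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])

# Columns of V are two linearly independent vectors v1, v2 in R^3.
V = np.column_stack([[1.0, 0.0, 1.0],
                     [0.0, 1.0, 1.0]])

AV = A @ V  # columns are A v1 and A v2

# A set of columns is linearly independent iff the rank equals the number of columns.
print(np.linalg.matrix_rank(V))   # 2
print(np.linalg.matrix_rank(AV))  # 2, so A v1 and A v2 are still independent
```

Both ranks come out equal to the number of vectors, which is consistent with the claim, though of course it doesn't prove it.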

Ben

  • 1
    Where did "which is not a linear combination of any other $Ax_j$" come from? Note that especially in this area things can be set up from lots of different starting points, and then in one approach $A$ may be proved from $B$ whereas in another approach $B$ is proved from $A$, so it might help if you make more explicit what you're taking as known. (2011-04-05)
  • 0
    You switched from $v$ to $x$ halfway. (2011-04-05)
  • 0
    Hi, so I have not shown that $Av_1, Av_2, \ldots, Av_p$ are linearly independent. How can I continue from where I left off? (2011-04-05)
  • 0
    @David: You should use `,\ldots,` rather than `,\cdots`. (2011-04-05)
  • 0
    @Arturo $\ldots$ are better as they are on the same "level" as the underscore symbol, thanks. (2011-04-05)
  • 0
    @David: other than the level? $\cdots$ (`\cdots`) is used between infix symbols, such as $+\cdots+$, or implicit multiplication $a_1\cdots a_n$; $\ldots$ (`\ldots`) is used to indicate omissions in lists, such as $v_1,\ldots,v_n$. But the *big* problem comes when you omit commas. For example, $Av_1,Av_2\cdots Av_p$ can be reasonably interpreted as being **two** objects, one being $Av_1$ and the other being the product of $Av_2$ through $Av_p$. In short: the former, with appropriate commas, is typographically correct; the latter is not (because it can lead to confusion). (2011-04-05)

2 Answers

1

OK, here's a new attempt, this time via contradiction.

Suppose that $A$ being an $m \times m$ matrix having exactly $m$ pivot columns and $v_1, v_2, \ldots, v_p$ being linearly independent vectors implies that $Av_1, Av_2, \ldots, Av_p$ are linearly dependent.

So if $Av_1, \ldots, Av_p$ are linearly dependent, there exist constants $c_1, \ldots, c_p$, not all zero, such that the linear combination $\sum_{i=1}^{p} c_i A v_i = 0$. Now $A$ has exactly $m$ pivot columns, so there is a unique solution for every right-hand side $b$, and so the matrix $A$ is invertible. Since it is invertible, I may multiply both sides of the equation by $A^{-1}$, so that I now have

$\sum_{i=1}^{p} c_i A^{-1} A v_i = \sum_{i=1}^{p} c_i v_i = 0$. But since not all of the $c_i$ are zero, this means that $v_1, v_2, \ldots, v_p$ are linearly dependent, which contradicts the assumption that they are linearly independent.

So indeed $Av_1, Av_2, \ldots, Av_p$ are all linearly independent. $\hspace{2in}$ $\blacksquare$

Any objections to the logic of this proof?

Thanks, Ben

  • 1
    My only gripe is with the wording of the first sentence. You are not supposing that being an $m \times m$ rank-$m$ matrix _implies_ that it maps independent vectors to dependent vectors. You are assuming that it happens even once. This is what your argument actually proves, but it could be worded more clearly. Something like: "Suppose that $A$ is an $m \times m$, rank-$m$ matrix, that $v_1, v_2, \ldots, v_p$ are linearly independent, and that $Av_1, \ldots, Av_p$ are dependent." (2011-04-05)
  • 0
    Another way to do it without contradiction is to suppose that you have two linear expressions $\sum a_iAv_i = \sum b_iAv_i$ and take their difference to get $\sum (a_i - b_i)Av_i = 0$. Then if you apply $A^{-1}$ you'll see that you can make a statement about all the different $a_i - b_i$'s. (2011-04-05)
  • 0
    @knucklebumpler Ok, thanks for your answer; you're saying that my wording of the first sentence is not very precise, and I get what you mean. So the proof is correct then; that's all I needed to know. Thanks for your help. (2011-04-05)
0

Yes. You fail to demonstrate that $Ax_j$ is independent of $Ax_i$ for $i \ne j$. Let's look at it:

As $A$ has full rank, then $A$ has a unique solution for every right hand side $b$ of the linear equation $Ax = b$.

This is true but does not imply,

That is, each $Ax_i$, where $1 \leq i \leq p$ is a unique column vector, say $\bar{x}_p$ which is not a linear combination of any other $Ax_j$.

the first part of which is just the definition of a function, and the last part of which is not implied by anything you've thus far said. What being able to uniquely solve $Ax = b$ gets you is not that $Ax$ is unique for each $x$ (which is what you're asserting) but that $x$ is unique for each $b$. As an example, consider the matrix $A$ with $1$ in the top left corner and $0$ elsewhere. What is the vector $v$ for which $Av \ne Av$? You'll see that all matrices have the property that you're claiming.
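To make that example concrete, here is a small numerical illustration (a $2 \times 2$ version of the matrix, just for the sake of having numbers): $Av$ is certainly "unique for each $v$", yet the images of two independent vectors come out dependent, because this $A$ does not have full rank.

```python
import numpy as np

# The matrix from the example above: 1 in the top left corner, 0 elsewhere (2x2 here).
A = np.array([[1.0, 0.0],
              [0.0, 0.0]])

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])   # e1 and e2 are linearly independent

print(A @ e1)  # [1. 0.]
print(A @ e2)  # [0. 0.]  -- A e2 = 0, so {A e1, A e2} is linearly dependent
```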

Your claim is true of course. If you know about the matrix inverse, then that makes it super easy. Just apply $A^{-1}$ to $\sum c_iAx_i = 0$ and see what happens.

Otherwise, you can just solve $Ax = \sum c_iAx_i$ and $Ax = 0$ and compare the results.
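To sketch how that comparison might go (just one way to fill in the details, in the same notation): suppose $\sum_i c_iAx_i = 0$. By linearity,
$$A\Big(\sum_i c_i x_i\Big) = \sum_i c_i A x_i = 0 = A \cdot 0,$$
so $\sum_i c_i x_i$ and $0$ both solve $Ax = 0$. Since $Ax = b$ has a unique solution for every right hand side $b$, the two must coincide, giving $\sum_i c_i x_i = 0$, and the independence of the $x_i$ then forces every $c_i$ to be $0$.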

  • 1
    Again, it depends on what is assumed as known and what the definitions are. One possible definition of "$A$ has full rank" is that the image of the entire $m$-dimensional space under $A$ is $m$-dimensional, and in that case the result is almost immediate from $A$ having full rank, since the image is spanned by the $Ax_i$. (2011-04-05)
  • 0
    @joriki: Good point. I've at least updated my answer with an analysis of the attempted proof. (2011-04-05)
  • 0
    $A$ having full rank means that $A$, which is an $m \times m$ matrix, has exactly $m$ pivot columns. (2011-04-05)