
I wish to prove the following, but I'm not sure if the steps of my proof are correct. Here it goes:

Suppose $A$ is an $m \times m$ matrix with $m$ pivot columns and that $v_1, \ldots, v_p$ is a linearly independent set of vectors in $\mathbb{R}^m$.

Is $Av_1, \ldots, Av_p$ a linearly independent set of vectors?

Here's an attempt at the proof:

As the $v_i$'s are linearly independent, they are in particular distinct: $v_i = v_j \implies i = j$ for all $1 \leq i, j \leq p$.

As $A$ has full rank, the linear equation $Av = b$ has a unique solution for every right-hand side $b$. That is, each $Av_i$, where $1 \leq i \leq p$, is a unique column vector, say $\bar{v}_i$, which is not a linear combination of any other $Av_j$.

So $Av_1, Av_2, \ldots, Av_p$ are linearly independent. Any flaws or mistakes in the logic?

Ben

  • @David: other than the level? $\cdots$ (`\cdots`) is used between infix symbols, such as $+\cdots+$, or in implicit multiplication, $a_1\cdots a_n$; $\ldots$ (`\ldots`) is used to indicate omissions in lists, such as $v_1,\ldots,v_n$. But the *big* problem comes when you omit commas. For example, $Av_1,Av_2\cdots Av_p$ can reasonably be interpreted as **two** objects, one being $Av_1$ and the other being the product of $Av_2$ through $Av_p$. In short: the former, with appropriate commas, is typographically correct; the latter is not (because it can lead to confusion). (2011-04-05)

2 Answers


OK, here's a new attempt, via contradiction.

Suppose, for contradiction, that $A$ is an $m \times m$ matrix with exactly $m$ pivot columns, that $v_1, v_2, \ldots, v_p$ are linearly independent, and yet that $Av_1, Av_2, \ldots, Av_p$ are linearly dependent.

Since $Av_1, \ldots, Av_p$ are linearly dependent, there exist constants $c_1, \ldots, c_p$, not all zero, such that the linear combination $\sum_{i=1}^{p} c_i A v_i = 0$. Now $A$ has exactly $m$ pivot columns, so $Av = b$ has a unique solution for every right-hand side $b$, and hence the matrix $A$ is invertible. Since it is invertible, I may multiply both sides of the equation by $A^{-1}$, so that I now have

$\sum_{i=1}^{p} c_i v_i = 0$, with the $c_i$ not all zero. But this means that $v_1, v_2, \ldots, v_p$ are linearly dependent, which contradicts the assumption that they are linearly independent.
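Spelled out, the step of multiplying by $A^{-1}$ is the following chain of equalities, using only the linearity of $A^{-1}$:

$$0 = A^{-1}\,0 = A^{-1}\Bigl(\sum_{i=1}^{p} c_i A v_i\Bigr) = \sum_{i=1}^{p} c_i A^{-1} A v_i = \sum_{i=1}^{p} c_i v_i.$$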

So indeed $Av_1, Av_2, \ldots, Av_p$ are linearly independent. $\blacksquare$

Any objections to the logic of this proof?

Thanks, Ben

  • @knucklebumpler OK, thanks for your answer; you're saying that my wording of the first question is not very precise, and I get what you mean. So the proof is correct then; that's all I needed to know. Thanks for your help. (2011-04-05)

Yes. You fail to demonstrate that $Ax_j$ is independent of the other $Ax_i$, $i \ne j$ (writing $x_i$ for your $v_i$). Let's look at it:

> As $A$ has full rank, the linear equation $Ax = b$ has a unique solution for every right-hand side $b$.

This is true but does not imply,

> That is, each $Ax_i$, where $1 \leq i \leq p$, is a unique column vector, say $\bar{x}_i$, which is not a linear combination of any other $Ax_j$.

the first part of which is just the definition of a function, and the last part of which is not implied by anything you've said thus far. What being able to uniquely solve $Ax = b$ gets you is not that $Ax$ is unique for each $x$ (which is what you're asserting) but that $x$ is unique for each $b$. As an example, consider the matrix $A$ with $1$ in the top left corner and $0$ elsewhere. What is the vector $v$ for which $Av \ne Av$? You'll see that all matrices have the property that you're claiming.
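To see this concretely, take $m = 2$ (the dimension is my choice here, not fixed by the example above):

$$A = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \qquad A e_1 = e_1, \qquad A e_2 = 0.$$

Each input has exactly one image, so "each $Av_i$ is a unique column vector" holds even for this singular $A$; yet $e_1, e_2$ are linearly independent while $Ae_1, Ae_2$ are not. Uniqueness of outputs alone therefore cannot prove the claim.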

Your claim is true, of course. If you know about the matrix inverse, then that makes it super easy. Just apply $A^{-1}$ to $\sum c_iAx_i = 0$ and see what happens.

Otherwise, you can just solve $Ax = \sum c_iAx_i$ and $Ax = 0$ and compare the results.
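Spelling out that comparison (my elaboration of the hint, not part of the original answer): by linearity, $x = \sum_i c_i x_i$ solves $Ax = \sum_i c_i A x_i$. If $\sum_i c_i A x_i = 0$, then

$$A\Bigl(\sum_{i=1}^{p} c_i x_i\Bigr) = \sum_{i=1}^{p} c_i A x_i = 0 = A\,0,$$

and since $Ax = 0$ has a unique solution, $\sum_{i=1}^{p} c_i x_i = 0$; the independence of the $x_i$ then forces every $c_i = 0$.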

  • $A$ having full rank means that $A$, which is an $m \times m$ matrix, has exactly $m$ pivot columns. (2011-04-05)