
Given an $m \times n$ real matrix $A$, where $m < n$, the solution set of $Ax=b$ for $b\neq 0$ can be described as

$$\mathcal{A}(v):=\{v+w | Aw=0\}=v+W$$ where $v$ satisfies $Av=b$.

Does the maximum size of a linearly independent set in $\mathcal{A}(v)$ always equal $\dim W$?

In other words:

Does the maximum number of linearly independent solutions of the underdetermined system $Ax=b$ always equal the nullity of $A$?

I can only see this by plotting the solutions.
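
(For reference, here is a small NumPy/SciPy sketch, added purely as an illustration and assuming `numpy.linalg.lstsq` and `scipy.linalg.null_space` are available, that generates points of $\mathcal{A}(v)$ for a small underdetermined system so they can be plotted.)

```python
import numpy as np
from scipy.linalg import null_space

# An underdetermined 2x3 system Ax = b (m < n), chosen only as an example.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])
b = np.array([1.0, 2.0])

# One particular solution v (lstsq returns the minimum-norm one here).
v, *_ = np.linalg.lstsq(A, b, rcond=None)

# An orthonormal basis of W = ker A; its number of columns is the nullity of A.
W = null_space(A)
print("nullity of A:", W.shape[1])

# Every element of A(v) = v + W has the form v + W @ c for some coefficient vector c.
c = np.array([3.0])        # arbitrary coefficients, one per basis vector of W
x = v + W @ c
print("A @ x =", A @ x)    # equals b up to rounding error
```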

My idea seems naive:

Suppose that $\dim W=r$, and we have $r+1$ linearly independent vectors in $v+W$, say

$$v+x_1,v+x_2,\cdots,v+x_{r+1}$$

Can we prove that $x_1,x_2,\cdots,x_{r+1}$ are also linearly independent? (in order to get a contradiction)

  • 0
    It is easy to see that the maximum number of linearly independent vectors in $\mathcal{A}(v)$ is at least $\dim(W)$: if $w_1,\ldots,w_n$ are linearly independent, and $v$ is not in their span, then $w_1,\ldots,w_n,v$ are linearly independent, and hence $w_1+v,w_2+v,\ldots,w_n+v, v$ are linearly independent, and hence $w_1+v,\ldots,w_n+v$ are linearly independent. I'm not sure about the converse yet.2012-07-04
  • 0
    yes, I have that result too. Thank you for confirming :)2012-07-04
  • 0
    Judging from the plot, for example when $\operatorname{null}(A)=2$, the solution set is a "plane" (not necessarily a subspace, since it does not pass through the origin; it is shifted by the vector $v$). This "plane" should be 2-dimensional, shouldn't it?2012-07-04
  • 0
    But that's **affine** dimension, which is not necessarily the same as the number of linearly independent vectors with "ends" in the plane. See the example by copper.hat below.2012-07-04
  • 0
    (Duh to me: $v$, $v+w_1,\ldots,v+w_n$ are all elements of $\mathcal{A}(v)$, and as noted above they are linearly independent. But $0$, $w_1,\ldots,w_n$ are not linearly independent elements of $W$).2012-07-04

3 Answers

2

This is an answer to your question "Does the maximum size of a linearly independent set in $\mathcal{A}(v)$ always equal $\dim W$?".

Basically no. Take $A=\begin{bmatrix}1 & 0 & 0\\ 0 & 0 & 1\end{bmatrix}$ and $b=e_1$. Then $\ker A = \operatorname{span} \{ e_2 \}$, and since $A e_1 = b$, we may take $v = e_1$. However, $v+\ker A$ contains $e_1+e_2$ and $e_1-e_2$, which are linearly independent, whereas $\ker A$ has dimension $1$.

(Note however, that the affine dimension of $\mathcal{A}(v)$ always equals that of $\ker A$, as it is just a translate.)
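
For completeness, the example can also be checked numerically; a minimal NumPy sketch (an added illustration, not part of the original argument):

```python
import numpy as np

# The example above: A is 2x3, b = e1, v = e1, ker A = span{e2}.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])
b = np.array([1.0, 0.0])
e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])

# Two elements of v + ker A:
u1, u2 = e1 + e2, e1 - e2
print(np.allclose(A @ u1, b), np.allclose(A @ u2, b))   # True True: both solve Ax = b

# They are linearly independent, although the nullity of A is only 1.
print("rank of [u1 u2]:", np.linalg.matrix_rank(np.column_stack([u1, u2])))  # 2
print("nullity of A:", A.shape[1] - np.linalg.matrix_rank(A))                # 1
```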

  • 0
    OK, thanks for the example. I'm waiting for an example with an underdetermined system too, before I choose the answer :)2012-07-04
  • 0
    Take the system $x=0$ as a system in two unknowns, $x$ and $y$...2012-07-04
  • 0
    Oops, I completely missed the $m<n$ condition. I added a small fix.2012-07-04
2

There is confusion in your question about the use of "linearly independent". With the usual meaning of that term applied to elements of a vector space, namely "no nontrivial linear combination gives $0$", the answer to your question is negative: if $b\neq 0$ then $v\notin W$ and $v+W$ is an affine subspace of dimension $\dim W$, which contains linearly independent sets of $1+\dim W$ elements, for instance $\{v,v+b_1,v+b_2,\ldots,v+b_d\}$ where $\{b_1,\ldots,b_d\}$ is a basis of $W$. As a concrete example take $A=(1~~1)$ and $b=1$; then your equation is $x+y=1$, the nullity of $A$ is $1$, but there are two solutions $(x,y)=(1,0)$ and $(x,y)=(0,1)$ that are linearly independent as vectors.

However, you probably do not want to say these are two linearly independent solutions, since a linear combination of them will in general not be a solution. So you might want to define a set of solutions to be linearly independent if, after subtraction of a particular solution $v$ from all of them, they become a linearly independent set of vectors. But then the answer to your question is trivially positive: after subtraction of $v$ from all the elements of $v+W$ one gets $W$, which is of course a vector subspace of dimension $\dim W$.
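
To see both readings concretely, here is a minimal NumPy sketch of the $x+y=1$ example (an added illustration, not part of the original answer):

```python
import numpy as np

A = np.array([[1.0, 1.0]])   # the system x + y = 1
b = np.array([1.0])

s1 = np.array([1.0, 0.0])    # two solutions that are
s2 = np.array([0.0, 1.0])    # linearly independent as vectors
print(np.linalg.matrix_rank(np.column_stack([s1, s2])))   # 2

print(A @ (s1 + s2))         # [2.], so the combination s1 + s2 is NOT a solution

# After subtracting the particular solution s1, the differences lie in ker A,
# which is only 1-dimensional (the nullity of A).
print(np.linalg.matrix_rank(np.column_stack([s1 - s1, s2 - s1])))  # 1
```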

  • 0
    I meant what you've mentioned in the first paragraph. I don't know how I could have missed "$\{v,v+b_1,\cdots,v+b_r\}$ is always linearly independent" .. I had tried to claim/prove it before I posted this question, with trouble. But it seems no trouble now: Let $\sum_{i=0}^{r} \alpha_i (v+ b_i) =0$, here $b_0=0$. Then $$(\alpha_0+\cdots+\alpha_r)v+\alpha_1 b_1 + \cdots +\alpha_r b_r=0$$ If $\alpha_0+\cdots+\alpha_r=0$, then $\alpha_i=0$ since $\{b_1,\cdots,b_r\}$ is a basis. If $\alpha_0+\cdots+\alpha_r\neq 0$, then $v \in W$.2012-07-04
0

Okay... I'm not sure what you are trying to say by $\dim W$. Do you mean $\ker A$? Also, when you say the kernel is $W=\{w \mid Aw=0\}$, I'm pretty sure you mean $\ker A$.

review this: http://en.wikipedia.org/wiki/Kernel_of_a_matrix

Now, to answer your linear independence question: because you are assuming that $v+x_1,\ldots,v+x_{r+1}$ are linearly independent, what does that mean?

It means:

$\displaystyle \sum_{i=1}^{r+1} a_i(v +x_i) = 0$ if and only if every $a_i=0$. But what is this?

$\displaystyle 0 = \sum_{i=1}^{r+1} a_i(v +x_i) = \sum_{i=1}^{r+1} a_iv + \sum_{i=1}^{r+1} a_ix_i.$ We required the $a_i$ to all be identically zero, and so it follows that:

$\displaystyle \sum_{i=1}^{r+1} a_ix_i = 0$

And hence $x_1,\ldots,x_{r+1}$ are linearly independent... but again, I'm not really sure what this part has to do with your general question. Could you elaborate, please?

I suppose there would be some form of contradiction if $x_i \in \ker(A)$, as you are assuming $\dim(\ker(A))=r$. But I'm not sure what you are looking for... sorry.

  • 0
    Your argument that $x_1,\ldots,x_r$ must be linearly independent does not follow; you did not take an arbitrary linear combination of the $x_i$ that was equal to $0$, you took a linear combination of the $v+x_i$. Say $\sum b_i x_i = 0$. How do you use the fact that $v+x_1,\ldots,v+x_r$ are linearly independent to conclude that $b_1=\cdots=b_r=0$?2012-07-04
  • 0
    Yes, $\dim W = \dim(\ker A) = \operatorname{null}(A)$. I can't follow your argument on "And hence $x_1,\cdots,x_{r+1}$ are linearly independent". I meant: we are given vectors $v+x_1,\cdots,v+x_{r+1}$ such that $$\sum_{i=1}^{r+1} a_i(v+x_i)=0$$ implies $a_i=0$ for all $i$, and we want to prove that $$\sum_{i=1}^{r+1} \alpha_i x_i =0$$ also implies $\alpha_i=0$ for all $i$. That would give us $r+1$ linearly independent vectors in $W$, but $\dim W =r$, a contradiction, hence $v+x_1,\cdots,v+x_{r+1}$ can't be linearly independent.2012-07-04
  • 0
    That is to say: "we can't have more than $r$ linearly independent vectors in $\mathcal{A}(v)$".2012-07-04