
I am studying linear independence and linear dependence of vectors in my linear algebra course, and I am confused about the following theorem (the book is *Linear Algebra and Its Applications* by David C. Lay); I need some explanation of it.

In this theorem I am confused about the statement: "If $S$ is linearly dependent and $v_1 \neq 0$, then some $v_j$ (with $j>1$) is a linear combination of the preceding vectors."



Recall that a vector $w$ is a linear combination of $w_1,\ldots,w_k$ if there exist scalars $\lambda_1,\ldots,\lambda_k$ such that $w = \lambda_1w_1 + \cdots + \lambda_kw_k.$ A set $\{w_1,\ldots,w_k\}$ is linearly dependent if there exists a non-trivial solution to $\lambda_1w_1+\cdots+\lambda_kw_k = 0$, i.e. not all $\lambda_i = 0.$
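These definitions can be sanity-checked mechanically. Here is a minimal sketch in Python (the helper name and the example vectors are my own, not from the answer), which verifies that given scalars really do form a non-trivial solution of $\lambda_1w_1+\cdots+\lambda_kw_k = 0$:

```python
from fractions import Fraction as F

def is_dependence_relation(lambdas, vectors):
    """True iff the scalars give a NON-TRIVIAL solution of
    lambda_1*w_1 + ... + lambda_k*w_k = 0 (vectors are tuples)."""
    nontrivial = any(l != 0 for l in lambdas)          # not all scalars zero
    dim = len(vectors[0])
    # sum the relation coordinate by coordinate
    sums = [sum(l * v[i] for l, v in zip(lambdas, vectors)) for i in range(dim)]
    return nontrivial and all(s == 0 for s in sums)

# hypothetical example: w2 = 2*w1, so 2*w1 - 1*w2 = 0 is a non-trivial relation
w1, w2 = (1, 2), (2, 4)
print(is_dependence_relation([F(2), F(-1)], [w1, w2]))  # True
print(is_dependence_relation([F(0), F(0)], [w1, w2]))   # False (trivial)
```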

Assume that $v_1 \neq 0$ and that $\{v_1,\ldots,v_n\}$ is linearly dependent.

Since $v_1 \neq 0$ it follows that $\{v_1\}$ is linearly independent.

Next, consider $\{v_1,v_2\}$. Either $\{v_1,v_2\}$ is linearly independent (LI) or linearly dependent (LD). If $\{v_1,v_2\}$ is LD then you have your $v_j$, namely $j=2$, because $\{v_1\}$ was LI while $\{v_1,v_2\}$ is LD meaning that $v_2$ is a linear combination of $v_1$.

If $\{v_1,v_2\}$ is LI then consider $\{v_1,v_2,v_3\}$. Either $\{v_1,v_2,v_3\}$ is LI or LD. If $\{v_1,v_2,v_3\}$ is LD then you have your $v_j$, namely $j = 3$, because $\{v_1,v_2\}$ was LI while $\{v_1,v_2,v_3\}$ is LD meaning that $v_3$ is a linear combination of $v_1$ and $v_2$.

If $\{v_1,v_2,v_3\}$ is LI then consider $\{v_1,v_2,v_3,v_4\}$, etc.

Continue this process. You will eventually find a $v_j$ for which $\{v_1,\ldots,v_{j-1}\}$ is LI and $\{v_1,\ldots,v_j\}$ is LD meaning that $v_j$ is a linear combination of $v_1,\ldots,v_{j-1}.$ If you don't find such a $v_j$ then you have a contradiction: you would have that $\{v_1,\ldots,v_n\}$ was LI, but you assumed it was LD!
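The scan described above can be carried out mechanically. Below is a sketch in Python using exact rational arithmetic (the function names and the rank-by-row-reduction helper are my own): it checks the growing prefixes $\{v_1\}, \{v_1,v_2\}, \ldots$ and reports the first $j$ at which dependence appears.

```python
from fractions import Fraction

def rank(rows):
    """Rank of a list of vectors, by Gaussian elimination over the rationals."""
    m = [[Fraction(x) for x in r] for r in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]          # move pivot row into place
        for i in range(len(m)):
            if i != r and m[i][c] != 0:      # clear the pivot column
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def first_dependent_index(vectors):
    """Return the first j (1-based) with {v1,...,vj} dependent, i.e.
    {v1,...,v_{j-1}} independent but adding vj creates dependence."""
    for j in range(1, len(vectors) + 1):
        if rank(vectors[:j]) < j:
            return j
    return None  # the whole set is linearly independent

S = [(1, 2), (3, 4), (4, 6), (1, 3)]
print(first_dependent_index(S))  # 3, since (4,6) = (1,2) + (3,4)
```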


What exactly is your problem? "General confusion" is not well-defined.

It means that one of the vectors is a linear combination of all of those before it; in other words, you can usually pick the last vector if the set is finite. If the set is infinite, the theorem still holds (there cannot be more than $n$ linearly independent vectors in an $n$-dimensional vector space).

An example: let our set be $S=\{(1,2),(3,4),(4,6),(1,3)\}$. Both $(4,6)$ and $(1,3)$ are linear combinations of their predecessors, which are, respectively, $\{(1,2),(3,4)\}$ and $\{(1,2),(3,4),(4,6)\}$. In fact, $(1,3)$ is already a linear combination of the first two vectors; the third is just "overkill".
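The coefficients in this example can be computed directly with Cramer's rule, since $(1,2)$ and $(3,4)$ are independent. A quick check in Python (the helper name is mine):

```python
from fractions import Fraction

def solve_2d(u, v, w):
    """Solve a*u + b*v = w for 2-D vectors by Cramer's rule
    (assumes u and v are independent, i.e. the determinant is nonzero)."""
    det = u[0] * v[1] - u[1] * v[0]
    a = Fraction(w[0] * v[1] - w[1] * v[0], det)
    b = Fraction(u[0] * w[1] - u[1] * w[0], det)
    return a, b

a, b = solve_2d((1, 2), (3, 4), (4, 6))
print(a, b)  # 1 1    ->  (4,6) = 1*(1,2) + 1*(3,4)
a, b = solve_2d((1, 2), (3, 4), (1, 3))
print(a, b)  # 5/2 -1/2  ->  (1,3) = (5/2)*(1,2) - (1/2)*(3,4)
```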


If $S=\left\{v_1,v_2,\ldots,v_\rho\right\}$ is linearly dependent then \begin{equation}\lambda_1v_1+\lambda_2v_2+\cdots+\lambda_\rho v_\rho=0\end{equation} for some $\lambda_i$ not all zero. Let $j$ be the least integer s.t. $\lambda_j=\lambda_{j+1}=\cdots=\lambda_\rho=0$ (take $j=\rho+1$ if $\lambda_\rho\neq0$); minimality gives $\lambda_{j-1}\neq0$. If we suppose that $v_1\neq0$ then $j>2$: were $j=2$, the relation would reduce to $\lambda_1v_1=0$ with $\lambda_1\neq0$, forcing $v_1=0$. Therefore \begin{equation}\lambda_1v_1+\cdots+\lambda_{j-1}v_{j-1}=0\end{equation} with $\lambda_{j-1}\neq0$. We conclude that \begin{equation}v_{j-1}=-\dfrac{\lambda_1}{\lambda_{j-1}}v_1-\cdots-\dfrac{\lambda_{j-2}}{\lambda_{j-1}}v_{j-2}.\end{equation}
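The same index bookkeeping can be sketched in Python (the function name and example are mine): take the last nonzero coefficient of a dependence relation and solve for the corresponding vector.

```python
from fractions import Fraction

def express_last(lambdas):
    """Given the scalars of a non-trivial relation sum(lambdas[i]*v_i) = 0,
    find the largest k with lambdas[k] != 0 and return (k, coeffs) with
    v_k = sum(coeffs[i] * v_i for i in range(k))  (indices 0-based)."""
    k = max(i for i, l in enumerate(lambdas) if l != 0)
    coeffs = [-Fraction(l) / Fraction(lambdas[k]) for l in lambdas[:k]]
    return k, coeffs

# example relation: 1*(1,2) + 1*(3,4) - 1*(4,6) + 0*(1,3) = 0,
# so the last nonzero coefficient sits at k = 2 and (4,6) = 1*(1,2) + 1*(3,4)
k, coeffs = express_last([1, 1, -1, 0])
print(k, coeffs == [1, 1])  # 2 True
```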


It seems to be that "in fact..." sentence that is getting to you. Here are a few steps for getting through it.

  1. If one vector (call it $v$) is a linear combination of some others with nonzero coefficients (let's say $v_a,\,v_b,\,v_c$; first drop any vector whose coefficient is zero) then each of $v,\,v_a,\,v_b,\,\text{and}\,v_c$ is a linear combination of the others. (To see this, note that $v$ being a linear combination means that $v=av_a+bv_b+cv_c$, which means $v-av_a-bv_b-cv_c=0$; dividing by $a\neq0$ gives $\frac{1}{a}v-v_a-\frac{b}{a}v_b-\frac{c}{a}v_c=0$, which means $\frac{1}{a}v-\frac{b}{a}v_b-\frac{c}{a}v_c=v_a$.)

  2. Now take whichever of the vectors $v,\,v_a,\,v_b,\,\text{and}\,v_c$ comes last in the sequence $v_1,\,v_2,\ldots,v_n$. That vector is a linear combination of the others, and that vector comes after the others.

  3. So now you have a vector in $v_1,\,v_2,\ldots,v_n$ which is a linear combination of some of its predecessors in the sequence. You are nearly there.

  4. To get from this to "there is a vector which is a linear combination of all its predecessors", take the linear combination you've discovered in step 3, then add in all the other vectors you haven't mentioned yet, giving a coefficient of $0$ to each of them.
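Steps 1 and 4 together amount to re-solving the relation for a chosen position and keeping coefficient $0$ for every unused vector. A sketch in Python (the function name and the example relation are mine, with coefficients stored positionally along the sequence):

```python
from fractions import Fraction

def solve_relation_for(coeffs, target):
    """coeffs[i] is the coefficient of v_i in a relation sum(coeffs[i]*v_i) = 0,
    with 0 at every unused position.  Solve it for v_target (step 1: divide by
    its nonzero coefficient; step 4: unused vectors keep coefficient 0),
    returning c with v_target = sum(c[i] * v_i for i != target)."""
    pivot = Fraction(coeffs[target])
    assert pivot != 0, "can only solve for a vector with a nonzero coefficient"
    return [Fraction(0) if i == target else -Fraction(x) / pivot
            for i, x in enumerate(coeffs)]

# example relation: v4 - 2*v1 - 3*v2 = 0 in a sequence v1,...,v4 (v3 unused)
print(solve_relation_for([-2, -3, 0, 1], 3))  # v4 = 2*v1 + 3*v2 + 0*v3
print(solve_relation_for([-2, -3, 0, 1], 0))  # v1 = -(3/2)*v2 + 0*v3 + (1/2)*v4
```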