
So my suggestion was:

Claim: Let $S = \{u_1, u_2, \ldots, u_r\}$ be a set of vectors in $\mathbb{R}^n$. If $r > n$, then the vectors $u_1, u_2, \ldots, u_r$ must be linearly dependent.

If one writes out the linear system corresponding to $c_1u_1 + c_2u_2 + \cdots + c_ru_r = 0$, one gets a homogeneous system of $n$ equations in the $r$ unknowns $c_1, \ldots, c_r$. Since $r > n$, the system has more unknowns than equations, so it has infinitely many solutions, and in particular a nontrivial one. Thus in $\mathbb{R}^n$, a linearly independent set cannot contain more than $n$ vectors.
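For concreteness, here is a small worked instance (the specific vectors are my illustrative choice, not from the original): take $n = 2$, $r = 3$, with $u_1 = (1,0)$, $u_2 = (0,1)$, $u_3 = (1,1)$. Then $c_1u_1 + c_2u_2 + c_3u_3 = 0$ becomes the homogeneous system

$$\begin{cases} c_1 + c_3 = 0 \\ c_2 + c_3 = 0 \end{cases}$$

Two equations in three unknowns leave $c_3$ free; choosing $c_3 = 1$ forces $c_1 = c_2 = -1$, and indeed $-u_1 - u_2 + u_3 = 0$, a nontrivial dependence relation.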

  • Look at http://en.wikipedia.org/wiki/Linear_independence#Definition. In this case, one of the vectors, say $v_n$, is the zero vector. Set $a_1 = a_2 = \cdots = a_{n-1} = 0$ and let $a_n$ be any nonzero scalar; then you have satisfied the condition for linear dependence.

1 Answer


First of all, the definition of linear dependence: a set of $n$ vectors $u_1, \ldots, u_n$ in a vector space $U$ is said to be linearly dependent if there exist constants $a_1, \ldots, a_n$, not all zero, such that $a_1u_1 + \cdots + a_nu_n = 0$.

So suppose we have a set of vectors $\{u_1, u_2, \ldots, u_n\}$ and one of them is the zero vector. When we multiply the zero vector by any constant, the result is still the zero vector. What can you deduce from this?

Assume without loss of generality that $u_1$ is the zero vector in the list $\{u_1, \ldots, u_n\}$. What constants $a_1, \ldots, a_n$, with at least one of them nonzero, can you attach to $u_1, \ldots, u_n$ so that the linear combination $a_1u_1 + \cdots + a_nu_n$ is zero?
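To spell out the hint (this is one possible choice, echoing the comment above, not the only one): with $u_1 = 0$, take

$$a_1 = 1, \qquad a_2 = a_3 = \cdots = a_n = 0.$$

Then $a_1u_1 + a_2u_2 + \cdots + a_nu_n = 1 \cdot 0 + 0 + \cdots + 0 = 0$ with $a_1 \neq 0$, so the set is linearly dependent by the definition above.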