
I'm confused about whether a set $A$ of vectors in $\mathbb{R}^n$ can have fewer than $n$ vectors and still be linearly independent. It would seem to me that to be linearly independent in $\mathbb{R}^n$ a set must have exactly $n$ vectors, because otherwise, taking each vector as a column and reducing the resulting homogeneous system to row echelon form, you would end up with at least one free variable and thus infinitely many non-trivial solutions, meaning that the set isn't linearly independent (assuming that $|A| < n$). Put another way, the matrix representation of $A$ must be row equivalent to $I_n$ (the unit matrix) to be linearly independent.

Am I missing something here?

  • @DidierPiau The definition in Daniel's answer. (2012-03-20)

2 Answers


The definition of linear independence is usually given as follows: we say that $\{v_1,\ldots,v_k\}$ are linearly independent if $c_1v_1+\cdots+c_kv_k=0$ implies that $c_i=0$ for all $i$.

Now when you are talking about size: in $\mathbb{R}^n$ you can have up to $n$ linearly independent vectors. In fact, it is easy to see from the definition above that if you have a linearly independent set of vectors, then any subset of it is also linearly independent.
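For instance (a small concrete example, with the vectors chosen just for illustration): the set $\{(1,0,0),\,(0,1,0)\}$ has only two vectors, yet it is linearly independent in $\mathbb{R}^3$, because

$$c_1(1,0,0)+c_2(0,1,0)=(c_1,c_2,0)=(0,0,0)\quad\Longrightarrow\quad c_1=c_2=0.$$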

So when you say that $A$ cannot be linearly independent if its size is less than $n$, you might be confusing independence with $\textbf{span}$. By the span of $\{v_1,\ldots,v_k\}$ we mean the set of all possible linear combinations of these vectors, that is, all vectors $c_1v_1+\cdots+c_kv_k$ where each $c_i$ can be any scalar in your field of scalars.
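To make the distinction concrete (reusing the example vectors from above): the two independent vectors $(1,0,0)$ and $(0,1,0)$ do not span all of $\mathbb{R}^3$; their span is only the plane

$$\operatorname{span}\{(1,0,0),(0,1,0)\}=\{(c_1,c_2,0) : c_1,c_2\in\mathbb{R}\}.$$

Having $n$ vectors is what you need to span $\mathbb{R}^n$, not to be linearly independent.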

Think of independence as not creating redundancies. For instance, if you are in $\mathbb{R}^2$ and you have the vectors $(1,0), (0,1), (1,1)$, then the third vector is just the sum of the first two, so it does not add anything new to the span.
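Spelled out with the definition above, that redundancy is exactly a non-trivial linear combination equal to zero:

$$1\cdot(1,0)+1\cdot(0,1)+(-1)\cdot(1,1)=(0,0),$$

so the set $\{(1,0),(0,1),(1,1)\}$ is linearly dependent.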

  • So would it be accurate to say that for a set to be linearly independent, its reduced row echelon matrix representation must be a subset of the matrix representation of the standard basis of $\mathbb{R}^n$? I hope that makes sense. (2012-03-20)

When you study the linear independence of $k$ vectors in $\mathbb{R}^n$, the system you are talking about has the $k$ vectors as columns, and the last column is the zero vector (the right-hand side of the equations).

They are linearly independent iff you get an echelon form with $k$ rows different from the zero row, because in that case the system has only the trivial solution. (Sure, you also get $n-k$ zero rows, but that is not important.)
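For example (a small illustrative case with $n=3$, $k=2$, and the vectors picked arbitrarily): taking $(1,0,3)$ and $(2,1,0)$ as columns and row reducing gives

$$\begin{pmatrix} 1 & 2 \\ 0 & 1 \\ 3 & 0 \end{pmatrix} \longrightarrow \begin{pmatrix} 1 & 2 \\ 0 & 1 \\ 0 & -6 \end{pmatrix} \longrightarrow \begin{pmatrix} 1 & 2 \\ 0 & 1 \\ 0 & 0 \end{pmatrix},$$

which has $k=2$ nonzero rows, so the two vectors are linearly independent even though $2 < n = 3$.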