Basis for a vector space

Suppose one is given a basis of n vectors for a vector space. Then one is given another set of n linearly independent vectors, each of which is a linear combination of the vectors in the original basis. Does that immediately imply that this new set is a basis for the vector space? Thanks.
2 Answers
Yes. In a vector space with a finite basis of $n$ elements, every linearly independent set of $n$ vectors is also a basis. See for example the Steinitz exchange lemma: http://en.wikipedia.org/wiki/Steinitz_exchange_lemma
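For a concrete illustration (a small example added for emphasis, not part of the lemma's statement): in $\mathbb{R}^2$ with standard basis $\{e_1, e_2\}$, the vectors $v_1 = e_1 + e_2$ and $v_2 = e_1 - e_2$ are linearly independent, and because there are two of them they automatically form a basis: indeed $e_1 = \frac{1}{2}(v_1 + v_2)$ and $e_2 = \frac{1}{2}(v_1 - v_2)$.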
The following exercises will greatly improve your understanding of the concepts related to your question above:
$\textbf{Question 1: }$ This question does not assume that dimension is well defined yet. Suppose you have a basis $B = \{w_1, \ldots, w_n\}$ of a vector space $V$. Now suppose you have $n$ linearly independent vectors $\{v_1, v_2, \ldots, v_n\}$ in $V$. Write down the matrix, in the basis $B$, of the linear transformation $T$ that takes $w_1$ to $v_1$, $w_2$ to $v_2$, $\ldots$, and $w_n$ to $v_n$. Your matrix will look like this:
$\left[\begin{array}{ccc} | & |& | \\ T(w_1) & \ldots & T(w_n) \\ | & | & | \end{array} \right].$
Recall that the $i$-th column of this matrix is the coordinate vector of $v_i = T(w_i)$ in the basis $B$. From just these facts, why does it follow that the vectors $v_1, \ldots, v_n$ are also a basis for $V$?
The formula (# of pivot variables) + (# of free variables) = (# of columns) may be helpful.
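To make the hint concrete (a worked instance added for illustration, reusing the example above): with $V = \mathbb{R}^2$, $B = \{e_1, e_2\}$, $v_1 = e_1 + e_2$, and $v_2 = e_1 - e_2$, the matrix is

$$\left[\begin{array}{cc} 1 & 1 \\ 1 & -1 \end{array}\right],$$

which row-reduces to two pivot columns and no free variables, so its columns are linearly independent and span $\mathbb{R}^2$.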
$\textbf{Question 2: }$ Prove that if $V$ is finite dimensional and $U$ is a subspace of $V$ with $\dim U = \dim V$, then in fact $U = V$.
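A possible line of attack (a hint added here, not part of the original exercise): a basis of $U$ is a linearly independent set of $\dim V$ vectors in $V$, so by Question 1 it is already a basis of $V$; hence every vector of $V$ lies in the span of that basis, which is $U$.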
-
@hil linear independence just means injectivity. But what are the dimensions of the matrix? We get surjectivity once we have injectivity, because your matrix is square. Now try Question 2. – 2011-09-10
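In symbols, the step from injectivity to surjectivity in the comment above is rank–nullity (spelled out here for completeness): for $T : V \to V$ with $\dim V = n$,

$$\dim \ker T + \dim \operatorname{im} T = n,$$

so $\ker T = \{0\}$ forces $\dim \operatorname{im} T = n$, i.e. $\operatorname{im} T = V$.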