
Suppose I wish to show that for a finite dimensional vector space $V$ with basis $B=\{b_1,\ldots,b_n\}$ and a given subspace $X$ of $V$, there exists a subset of $B$ that generates a subspace $Y$ such that $V = X \oplus Y$. Would the following argument make sense?

Suppose $X$ has basis $B_X$, whose elements are expressed in terms of $B$. List them, together with the elements of $B$, as rows of a matrix and perform Gaussian elimination, preserving the $B_X$ rows. Then the remaining (non-zero) rows of the matrix, $M$, will form a basis for $V$. Let $M \setminus B_X$ be the basis of $Y$. Then $V = X + Y$.

Added: The matrix $M$ should have only $n$ non-zero rows. What is meant by "perform[ing] Gaussian elimination" is this: list all the basis vectors in $B$ together with those of $X$. This list contains linearly dependent vectors, so we use Gaussian elimination to eliminate them, but we choose to eliminate vectors from the original basis $B$, leaving those of $X$ in the matrix $M$. Thus $M$ contains the basis vectors of $X$ but has only $n$ linearly independent rows.

$X\cap Y=\{0\}$ because the bases for $X$ and $Y$ are together linearly independent.

Therefore, the statement is true.

Comment: Firstly, I am not entirely sure that my argument is necessarily valid. Secondly, it seems to lack rigor.

Thanks in advance for any help!

Comment (2011-09-12): @Jyrki: Yes, you are right, that is what I was thinking of too. :)

1 Answer


Here's a useful lemma:

Lemma. Let $\mathbf{V}$ be a vector space, and let $\mathbf{x}_1,\ldots,\mathbf{x}_m$ be an ordered list of vectors. Then the list is linearly dependent if and only if there is a $k$, $1\leq k\leq m$, such that $\mathbf{x}_k$ is a linear combination of $\mathbf{x}_1,\ldots,\mathbf{x}_{k-1}$.

Proof. If there is a $k$ such that $\mathbf{x}_k$ is a linear combination of $\mathbf{x}_1,\ldots,\mathbf{x}_{k-1}$, then the list is linearly dependent.

Conversely, suppose the list is linearly dependent. If $\mathbf{x}_1=\mathbf{0}$, then $\mathbf{x}_1$ is a linear combination of the previous vectors (since the empty sum equals $\mathbf{0}$). If $\mathbf{x}_1\neq\mathbf{0}$, then $\{\mathbf{x}_1\}$ is linearly independent. Now, since the full list is linearly dependent, there is a smallest $k$, $2\leq k\leq m$, such that $\{\mathbf{x}_1,\ldots,\mathbf{x}_{k-1}\}$ is linearly independent, but $\{\mathbf{x}_1,\ldots,\mathbf{x}_k\}$ is linearly dependent. That means that $\mathbf{x}_k$ is a linear combination of $\mathbf{x}_1,\ldots,\mathbf{x}_{k-1}$, as desired. $\Box$

Now, let $\beta=\{\mathbf{x}_1,\ldots,\mathbf{x}_m\}$ be a basis for $X$. Define $\gamma_0=\emptyset$.

If $b_1\in\mathrm{span}(\beta)$, then let $\gamma_1=\gamma_0$ (that is, discard $b_1$). If $b_1\notin\mathrm{span}(\beta)$, then let $\gamma_1=\{b_1\}$.

If $b_2\in\mathrm{span}(\beta\cup\gamma_1)$, then let $\gamma_2=\gamma_1$ (discard $b_2$). If $b_2\notin\mathrm{span}(\beta\cup\gamma_1)$, then let $\gamma_2=\gamma_1\cup\{b_2\}$.

Continue this way: assuming you have dealt with $b_k$, if $b_{k+1}\in\mathrm{span}(\beta\cup\gamma_k)$, then let $\gamma_{k+1}=\gamma_k$. If $b_{k+1}\notin\mathrm{span}(\beta\cup\gamma_k)$, then let $\gamma_{k+1}=\gamma_k\cup\{b_{k+1}\}$.

(What this amounts to is taking the list $\mathbf{x}_1,\ldots,\mathbf{x}_m,b_1,\ldots,b_n$, and then throwing away any vector which is a linear combination of the previous vectors.)
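The greedy procedure above can be sketched in code. Here is a minimal illustration over $\mathbb{Q}$ (the function names are my own): membership in the current span is tested by exact Gaussian elimination using `fractions.Fraction`, and each $b_i$ is kept only if it is not already in the span of $\beta$ together with the vectors kept so far.

```python
from fractions import Fraction

def in_span(vectors, v):
    """Check whether v is a linear combination of `vectors`,
    using exact Gaussian elimination over the rationals."""
    if not vectors:
        return all(c == 0 for c in v)
    rows = [list(map(Fraction, u)) for u in vectors]
    target = list(map(Fraction, v))
    n = len(target)
    pivots = {}  # pivot column -> row index
    r = 0
    # Bring the spanning vectors to reduced row-echelon form.
    for col in range(n):
        piv = next((i for i in range(r, len(rows)) if rows[i][col] != 0), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        rows[r] = [x / rows[r][col] for x in rows[r]]
        for i in range(len(rows)):
            if i != r and rows[i][col] != 0:
                rows[i] = [a - rows[i][col] * b for a, b in zip(rows[i], rows[r])]
        pivots[col] = r
        r += 1
    # Reduce v against the echelon rows; v lies in the span
    # exactly when it reduces to the zero vector.
    for col, ri in pivots.items():
        if target[col] != 0:
            c = target[col]
            target = [a - c * b for a, b in zip(target, rows[ri])]
    return all(c == 0 for c in target)

def extend_to_basis(beta, B):
    """Greedy step from the answer: keep each b in B that is not
    already in span(beta + kept vectors)."""
    gamma = []
    for b in B:
        if not in_span(beta + gamma, b):
            gamma.append(b)
    return gamma
```

For example, with $X = \mathrm{span}\{(1,1,0)\}$ inside $\mathbb{R}^3$ and $B$ the standard basis, the procedure discards $b_2 = (0,1,0)$ (it is $(1,1,0)-(1,0,0)$) and keeps $b_1$ and $b_3$, so $\gamma = \{(1,0,0),(0,0,1)\}$ spans a complement of $X$.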

Now, at the end, $\beta\cup\gamma_n$ will be a basis for all of $V$, because it has the same span as $\beta\cup\{b_1,\ldots,b_n\}$. Moreover, because $\beta\cup\gamma_n$ is linearly independent, $\mathrm{span}(\beta)\cap\mathrm{span}(\gamma_n) = \{\mathbf{0}\}$. So $\mathrm{span}(\gamma_n)$ gives what you want.

Your idea is okay, but the way you describe it leaves something to be desired: when you do row reduction, you have no way to guarantee that the rows after the first $m$ still correspond to vectors $b_i$; the nonzero rows may be other linear combinations, which makes it hard to argue that you can find a subset of the $b_i$ that works. It can be made to work, but I would do it by placing the $\mathbf{x}_i$ and the $b_i$ as columns of a matrix and then row-reducing, making sure that the first $m$ columns contain pivots. Then look for the $b_i$ that correspond to pivot columns in the final, row-reduced form of the matrix.
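This column-based variant can also be sketched directly (again a rough illustration with made-up names, using exact arithmetic over $\mathbb{Q}$): since elimination processes columns left to right, and the $\mathbf{x}_i$ columns come first and are linearly independent, every $\mathbf{x}_i$ column automatically gets a pivot, and the $b_i$ whose columns also get pivots span a complement.

```python
from fractions import Fraction

def pivot_columns(matrix):
    """Return the pivot-column indices of a matrix,
    via exact Gaussian elimination over the rationals."""
    rows = [list(map(Fraction, row)) for row in matrix]
    nrows, ncols = len(rows), len(rows[0])
    pivots, r = [], 0
    for col in range(ncols):
        piv = next((i for i in range(r, nrows) if rows[i][col] != 0), None)
        if piv is None:
            continue  # column is dependent on the columns to its left
        rows[r], rows[piv] = rows[piv], rows[r]
        rows[r] = [x / rows[r][col] for x in rows[r]]
        for i in range(nrows):
            if i != r and rows[i][col] != 0:
                rows[i] = [a - rows[i][col] * b for a, b in zip(rows[i], rows[r])]
        pivots.append(col)
        r += 1
        if r == nrows:
            break
    return pivots

def complement_from_columns(X_basis, B):
    """Put x_1..x_m, then b_1..b_n, as columns; the x_i columns all
    receive pivots (they come first and are independent), and the b_i
    whose columns also receive pivots span a complement of X."""
    m = len(X_basis)
    cols = X_basis + B
    matrix = [[cols[j][i] for j in range(len(cols))]
              for i in range(len(cols[0]))]
    return [B[j - m] for j in pivot_columns(matrix) if j >= m]
```

With $X = \mathrm{span}\{(1,1,0)\}$ and $B$ the standard basis of $\mathbb{R}^3$, the pivot columns are those of $\mathbf{x}_1$, $b_1$, and $b_3$, so the selected complement basis is $\{(1,0,0),(0,0,1)\}$, matching the greedy construction.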