
sorry it's me again!

Let $V$ be a $\mathbb{K}$-vector space with a finite basis $B$, and let $M \subset V$ be a finite linearly independent subset.

Show that there exists a subset $A \subset B$ such that $(B \setminus A) \cup M$ is a basis of $V$.


Can I cheat and take the subset $A = \emptyset \subset B$? This would give me $B \cup M$, which should be a basis of $V$. Or did I get everything wrong? It's strange, because apparently my colleagues used the linear span to show this.

Thanks a lot again!

  • 3
    The definition of a basis requires the elements to be linearly independent. If you used $B \cup M$ and $M$ is nonempty, then that can't possibly be a basis (every element of $M$ is in the span of $B$). (2011-12-11)
  • 0
    $B$ is finite, right? (2011-12-11)
  • 0
    $B$ is finite, @EwanDelanoy (2011-12-11)

1 Answer


Brandon makes a good point about your "cheat" in the comments. We'll have to work harder than that.

Write $B = \{b_1, \ldots, b_n\}$ and $M = \{m_1, \ldots, m_k\}$. For each $0 \leq r \leq k$, consider the following statement.

After possibly relabeling the $b_i$, the set \[ \{m_1, \ldots, m_r, b_{r + 1}, \ldots, b_n\} \] is a basis for $V$.

Prove this by induction on $r$. When $r = 0$, this simply says that $B$ is a basis. By way of induction, assume that we've proven the statement for some $r < k$. Since $\{m_1, \ldots, m_r, b_{r + 1}, \ldots, b_n\}$ is then a basis, we can write \[ m_{r + 1} = a_1m_1 + \cdots + a_rm_r + a_{r + 1}b_{r + 1} + \cdots + a_nb_n. \] Note that in this expression some $b_i$, which after relabeling we may assume is $b_{r + 1}$, has a non-zero coefficient; otherwise $m_{r + 1}$ would lie in the span of $m_1, \ldots, m_r$, giving a relation of linear dependence among elements of $M$. Now you can write $b_{r + 1}$ as an element of the span of $\{m_1, \ldots, m_{r + 1}, b_{r + 2}, \ldots, b_n\}$, and then show that this new set is a basis.
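To make one exchange step concrete, here is a small worked example (my own illustration, not part of the original exercise) over $\mathbb{K} = \mathbb{R}$ with $V = \mathbb{R}^3$:

```latex
% Take the standard basis $B = \{e_1, e_2, e_3\}$ of $\mathbb{R}^3$
% and the linearly independent set $M = \{m_1\}$, $m_1 = e_1 + e_2$.
%
% Step $r = 0$: the current basis is $\{e_1, e_2, e_3\}$. Expand $m_1$:
\[
  m_1 = 1 \cdot e_1 + 1 \cdot e_2 + 0 \cdot e_3 .
\]
% The coefficient of $e_1$ is nonzero, so solve for $e_1$:
\[
  e_1 = m_1 - e_2 ,
\]
% which shows $e_1 \in \operatorname{span}\{m_1, e_2, e_3\}$. Hence
% $\{m_1, e_2, e_3\}$ still spans $V$, and three spanning vectors of a
% 3-dimensional space are automatically linearly independent, so it is
% a basis. Here $A = \{e_1\}$ and
% $(B \setminus A) \cup M = \{e_2, e_3, m_1\}$.
```

Note that we could not have exchanged $m_1$ for $e_3$, since the coefficient of $e_3$ in the expansion is zero; that is exactly why the proof picks a $b_i$ with a non-zero coefficient.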

  • 0
hi dylan, thanks for the answer! Do you mean that $m_1$ can be written as a linear combination of elements of $B_1$? Or did I get something wrong? Another question: can I just show that the statement is true when $A = M$, or do I have to show it for every $M$? (2011-12-11)
  • 0
@Clash I think what I have here is right. You want to replace $b_1$ by $m_1$ and keep on doing that. This sort of proof appears to [have a name](http://en.wikipedia.org/wiki/Steinitz_exchange_lemma). I don't see why you want to assume that $A = M$, could you explain? I can write more details later, if you like. (2011-12-11)
  • 0
ok, so I think I understand better what you mean now, and I don't think there's a mistake in what you wrote. Let me see if I can show that $B_1$ is a basis for $V$. (2011-12-11)
  • 0
Dylan, about $A = M$: then I get $(B \setminus A) \cup M$, but since $A = M$, I just get $B$ again. (2011-12-11)
  • 0
@Clash I don't see any reason why you can assume that. I'll try to be more explicit about the structure of the proof soon. (2011-12-11)
  • 0
but I guess I do see that, because a "part" of the original $b_1$ is in $m_1$, you can use the other vectors in $B_1$ to actually get $b_1$ back, and then I could retrieve the original basis $B$. Did I get this right? But how could I generalize it to the other cases? I can't write $m_2 = a_1b_1 + \cdots + a_nb_n$ again, can I? Then $m_1$ would be equal to $m_2$. (2011-12-11)
  • 0
Dylan, I'm still very interested in your expanded answer if you're still up for it! Many thanks in advance. (2011-12-11)
  • 0
@Clash Hope this helps. I always thought this was a somewhat awkward proof/setup, but I've yet to find anything better. (2011-12-12)
  • 0
Dylan, this was very useful, thanks a lot! They handed out the solution to this exercise today, and a student actually found a flaw in the given solution, so I guess it isn't easy to prove at all. (2011-12-12)