
Given $\{u, v, w\}$ is a basis for $\mathbb{R}^3$, how can I show that $\{u + v + w, v + w, w\}$ is also a basis?

I solved a similar problem in $\mathbb{R}^2$ (or at least think I did :p).

Given $\{u, v\}$ is a basis for $\mathbb{R}^2$, show that $\{u + v, au\}$ is also a basis.

I used the definition of a span to say

$\implies cu + dv =\langle x, y\rangle $ (for $c, d$ in $\mathbb{R}$)

Let $c = d + a^2$ (for $a$ in $\mathbb{R}$)

$\implies (d + a^2)u + dv = \langle x, y\rangle $

$\implies d(u + v) + a(au) = \langle x, y\rangle $

$\implies \mathrm{span}(u + v, au) = \mathbb{R}^2$

From there I also showed that this set is linearly independent (by setting a linear combination of the vectors equal to zero and showing the coefficients must be zero) and concluded that, since it has both properties in $\mathbb{R}^2$, it must be a basis.

So I've been racking my brains trying to find ways to manipulate coefficients to achieve the new basis for $\mathbb{R}^3$, but I can't come up with anything.

I'd like to know whether the method I employed for the $\mathbb{R}^2$ question is acceptable. Is it the only way to do it? Is there another method I should be using?

P.S. I apologise for my imperfect formatting. Still learning. Somehow I kept collapsing all the spaces between symbols.

  • 0
    Let me be the first to strongly suggest that you disassociate any emotions from anything that happens here on MSE. The criteria listed when you hover over the voting arrows are "research effort, clarity, and usefulness", none of which is strongly associated with highly up- or down-voted questions. (2012-03-08)

6 Answers

3

The following "answer" applies to the heading of your question.

A hint: show that they are independent, i.e. let $c_1,c_2,c_3$ be scalars such that $c_1(u+v+w)+c_2(v+w)+c_3w=0$. This implies that $c_1u+(c_1+c_2)v+(c_1+c_2+c_3)w=0$. Now, by independence of $u,v,w$, what can you conclude?

Also use the fact that a set of $n$ vectors in an $n$-dimensional vector space is a basis if and only if the set is independent.

  • 2
    More shortcuts! Sweet! :D (2012-03-08)
2

Take the coordinate vectors in the old basis, which are $(1,1,1)$, $(0,1,1)$, $(0,0,1)$. They are linearly independent, since if you put them as the rows of a matrix the rank is three (they are in echelon form). Then: vectors are linearly independent iff their coordinates in a basis are linearly independent. So you have 3 linearly independent vectors in a space of dimension 3, and they are therefore a basis of the space.
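As a quick numerical sanity check (not a substitute for the proof), the coordinate matrix from this answer can be fed to NumPy and its rank computed:

```python
import numpy as np

# Coordinate vectors of u+v+w, v+w, w in the old basis {u, v, w},
# placed as rows of a matrix (already in echelon form).
M = np.array([
    [1, 1, 1],  # u + v + w
    [0, 1, 1],  # v + w
    [0, 0, 1],  # w
])

# Rank 3 means the three rows are linearly independent, so the
# corresponding vectors form a basis of a 3-dimensional space.
rank = np.linalg.matrix_rank(M)
print(rank)  # 3
```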

  • 0
    Yes: the old basis was $u, v, w$. The three new vectors $u+v+w$, $v+w$, and $w$ have as coordinates in the old basis the vectors $(1,1,1)$, $(0,1,1)$, $(0,0,1)$. They are clearly linearly independent (you can prove this if needed). There is a theorem that says that a set of vectors is linearly independent if and only if their coordinate vectors in a given basis are linearly independent. (2012-03-09)
1

Let $A=\{u,v\}$ and $B=\{u+v, au\}$ for some $a\ne 0$. Then, one can write the elements of $A$ as a linear combination of the elements from $B$ and vice versa: $u = a^{-1} \cdot au$ and $v = 1\cdot(u+v) - a^{-1}\cdot au$. The converse is obvious. Now, this means that both $A$ and $B$ generate the same vector space and have the same cardinality, so if one is a basis, the other one also is. If $a=0$, $B$ cannot be a basis.

You can easily apply the same argument in the $\mathbb{R}^3$ case.
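For instance (spelling out the same argument for the new set $\{u+v+w,\ v+w,\ w\}$), the old basis vectors are recovered by

$$u = (u+v+w) - (v+w), \qquad v = (v+w) - w, \qquad w = w,$$

so each of $u, v, w$ lies in the span of the new set, and conversely each new vector is obviously a linear combination of $u, v, w$.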

  • 0
    "number of elements" (in this case) (2012-03-08)
1

In your example showing that if $\{ u,v \}$ is a basis for $\mathbb{R}^2$, then so is $\{ u+v , au \}$ (where $a \neq 0$; something you seem to have forgotten), there is something that seems off, although I can't quite place it. The basic strategy is as follows (and I'll try to phrase it in as general a manner as possible):

  • Spanning: Given any $w \in \mathbb{R}^2$, we know there are scalars $b,c$ such that $w = bu + cv$. We now want to find scalars $r,s$ such that $bu+cv = w = r(u+v) + s(au) = (r+sa)u + rv$. Since we have started with a basis, it must be that $b = r+sa$ and $c = r$. Solve these equations. (This amounts to solving a system of linear equations in the unknowns $r,s$.)
  • Linear independence: Suppose that $r,s$ are scalars such that $0 = r(u+v) + s(au) = (r+as)u + rv$. Use linear independence of the original basis to conclude that $r = s = 0$.
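The spanning step above amounts to solving a small linear system. As a hedged numerical illustration (the concrete values $a=2$, $b=5$, $c=3$ are made up for this sketch, not taken from the answer), the system $b = r + sa$, $c = r$ can be solved with NumPy:

```python
import numpy as np

# Hypothetical concrete values: a is the nonzero scalar in au,
# (b, c) are the coordinates of the target vector in the basis {u, v}.
a, b, c = 2.0, 5.0, 3.0

# The system  b = r + s*a,  c = r  in matrix form A @ [r, s] = [b, c].
A = np.array([[1.0, a],
              [1.0, 0.0]])
r, s = np.linalg.solve(A, np.array([b, c]))

# Closed form: r = c,  s = (b - c) / a  (this is why a != 0 is needed).
print(r, s)  # 3.0 1.0
```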
  • 0
    Looking at it again, I think that if I need the combination $-4u - 3v$, then $d$ must equal $-3$, in which case I need $a^2$ to equal $-1$, which it can't. So my rearrangement won't hold for the whole space. (2012-03-08)
1

1) Show that $u+v+w, v+w, w$ span $\mathbb{R}^3$ by showing that $u, v, w \in \operatorname{span}\{u+v+w,\ v+w,\ w\}$.

2) For linear independence, assume that $a\cdot(u+v+w) + b\cdot(v+w) + c\cdot w = 0$. Can you reduce that to an equation in $u,v,w$?

  • 0
    From your 1) it looks like I can start thinking about span as "Can I make all 3 old basis vectors by combining these new vectors?" instead of "Can I make any vector out of these new vectors?" And now that I write that out, I see they are obviously the same thing. Thanks for bringing that into focus for me! (2012-03-08)
1

The calculation you used for $\mathbb{R}^2$ is a bit suspect.

In any case, this is a good time to use the idea of "rank". $\operatorname{span}\{u+v, au\}$ is contained in $\operatorname{span}\{u,v\}$ (because $u+v$ and $au$ are). Therefore, they are actually equal iff they have the same rank.

Alternatively, if $u$ and $v$ are in $\operatorname{span}\{u+v, au\}$, then all of $\operatorname{span}\{u,v\}$ is in $\operatorname{span}\{u+v, au\}$ (and thus they are equal, because we already know the other inclusion).

Either way, it's easy to set these questions up as questions about matrices, and then you can apply all of your matrix expertise to answer the question!
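Here is one way that matrix setup might look, with a hypothetical concrete choice $u=(1,0)$, $v=(0,1)$, $a=2$ (any basis of $\mathbb{R}^2$ and nonzero $a$ would do):

```python
import numpy as np

# Hypothetical concrete vectors for illustration only.
u = np.array([1.0, 0.0])
v = np.array([0.0, 1.0])
a = 2.0

old = np.vstack([u, v])            # rows generate span{u, v}
new = np.vstack([u + v, a * u])    # rows generate span{u+v, au}

# Equal spans: each pair of rows has full rank 2, and stacking all
# four rows together does not increase the rank.
print(np.linalg.matrix_rank(old),
      np.linalg.matrix_rank(new),
      np.linalg.matrix_rank(np.vstack([old, new])))  # 2 2 2
```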

  • 0
    The rank of a matrix is the same as the dimension of its row space, which, in turn, is the span of its row vectors. (The same is true for columns; one of the neat theorems of linear algebra is that you get the same number both ways.) (2012-03-08)