
Without using dimension, I have to show the following:

Any $\vec{x} \in \mathbb{R}^3$ lies in $\operatorname{span}\{v_1, v_2, n\}$, where $v_1, v_2 \in \mathbb{R}^3$ and $n = v_1 \times v_2$.

We can let $x = \langle x_1, x_2, x_3 \rangle$, but that isn't very helpful.

I know the set is linearly independent; could that be used? Should I write their sum in terms of components?

5 Answers


And yet another approach ...

Again, we're going to show that if the three vectors $\mathbf{u}$, $\mathbf{v}$, $\mathbf{w}$ are not coplanar, then they span $\mathbb{R}^3$.

If $\mathbf{u} \cdot (\mathbf{v} \times \mathbf{w}) \ne 0$, we can define three magic new vectors $\tilde{\mathbf{u}}$, $\tilde{\mathbf{v}}$, $\tilde{\mathbf{w}}$ by the equations $$ \tilde{\mathbf{u}} = \frac{\mathbf{v} \times \mathbf{w}}{\mathbf{u} \cdot (\mathbf{v} \times \mathbf{w})} \quad ; \quad \tilde{\mathbf{v}} = \frac{\mathbf{w} \times \mathbf{u}}{\mathbf{u} \cdot (\mathbf{v} \times \mathbf{w})} \quad ; \quad \tilde{\mathbf{w}} = \frac{\mathbf{u} \times \mathbf{v}}{\mathbf{u} \cdot (\mathbf{v} \times \mathbf{w})} $$ These three vectors are called the dual vectors of $\mathbf{u}$, $\mathbf{v}$, $\mathbf{w}$, or the reciprocal vectors.

If $\mathbf{x}$ is any given vector, it is straightforward to verify that $$ \mathbf{x} = (\mathbf{x} \cdot \tilde{\mathbf{u}})\mathbf{u} + (\mathbf{x} \cdot \tilde{\mathbf{v}})\mathbf{v} + (\mathbf{x} \cdot \tilde{\mathbf{w}})\mathbf{w} $$ This shows that $\mathbf{x}$ lies in the span of $\{\mathbf{u}, \mathbf{v}, \mathbf{w} \}$, and even gives you explicit formulae for the coefficients. This is cute, but it doesn't give you much insight into the geometry of the situation.

The last equation giving $\mathbf{x}$ is just a disguised statement of Cramer's rule, so this approach is strongly related to the previous one.
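The reciprocal-vector decomposition is easy to check numerically. Here is a minimal Python sketch; the vectors `u`, `v`, `w`, `x` below are arbitrary example values (not anything from the question):

```python
# Check the reciprocal-vector decomposition numerically.
# u, v, w, x are arbitrary example vectors (assumed values).

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def scale(s, a):
    return tuple(s * ai for ai in a)

u, v, w = (1.0, 2.0, 0.0), (0.0, 1.0, 1.0), (1.0, 0.0, 3.0)
t = dot(u, cross(v, w))  # scalar triple product; must be non-zero

# The dual (reciprocal) vectors from the formulas above.
u_t = scale(1.0 / t, cross(v, w))
v_t = scale(1.0 / t, cross(w, u))
w_t = scale(1.0 / t, cross(u, v))

x = (4.0, -1.0, 2.5)  # any vector to decompose
a, b, c = dot(x, u_t), dot(x, v_t), dot(x, w_t)  # the three coefficients
recon = tuple(a * ui + b * vi + c * wi for ui, vi, wi in zip(u, v, w))

err = max(abs(ri - xi) for ri, xi in zip(recon, x))
print(err)  # ~0: the decomposition reproduces x
```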


I don't know why you wouldn't want to use any dimension arguments. Let me outline the most common approach.

General fact: Let $V$ be a finite-dimensional vector space, say $\dim(V)=n$. If $\left\{v_1,v_2, \dots , v_n\right\}$ is a linearly independent subset, then this set is generating (i.e., it spans $V$).

This statement is covered in any standard textbook (and course) on linear algebra. The proof of this fact is easy once you know what the dimension of a vector space is. But to properly define the dimension of a vector space, you first need to know that any two bases of the same vector space have the same number of vectors. The proof of the latter fact essentially boils down to the Steinitz exchange lemma.

Having said that, let's return to the problem at hand. You have three vectors in $\mathbb{R}^3$ which are linearly independent. Since $\dim(\mathbb{R}^3)=3$, the above discussion yields that three linearly independent vectors automatically form a basis.

Without some explicit information on $v_1,v_2$, it's going to be difficult to show directly that the set is generating; I'm not sure you can even give a "direct" argument. Showing directly that a set is generating is often more difficult than showing linear independence, and bringing the dimension into the discussion is a useful way to avoid having to do that explicitly (which might not even be possible).

  • 0
    I'm not allowed to use dimension. (2017-01-22)
  • 0
    Not my fault. The profs said no, so it's not up to me anymore. (2017-01-22)

An approach: if $\{\mathbf{u},\mathbf{v},\mathbf{u\times v}\}$ is linearly independent then, for all $\mathbf{x}\in\mathbb{R}^3$, the system $$\mathbf{x}=\lambda_1\mathbf{u}+\lambda_2\mathbf{v}+\lambda_3(\mathbf{u\times v})$$ has a solution (in fact, a unique solution) because $$\text{rank }[\mathbf{u},\mathbf{v},\mathbf{u\times v}]=\text{rank }[\mathbf{u},\mathbf{v},\mathbf{u\times v},\mathbf{x}]=3,$$ that is, $\mathbf{x}\in \text{Span }\{\mathbf{u},\mathbf{v},\mathbf{u\times v}\}.$

  • 0
    Small remark: You can stop when you have $\text{rank }[\mathbf{u},\mathbf{v},\mathbf{u\times v}]=3$. This exactly says that the matrix $[\mathbf{u},\mathbf{v},\mathbf{u\times v}]$ is invertible. Hence $[\mathbf{u},\mathbf{v},\mathbf{u\times v}]\mathbf{\lambda}=\mathbf{x}$ yields $\mathbf{\lambda}=[\mathbf{u},\mathbf{v},\mathbf{u\times v}]^{-1}\mathbf{x}$, which is what is needed. Afterwards you can conclude that $\text{rank }[\mathbf{u},\mathbf{v},\mathbf{u\times v},\mathbf{x}]=3$, since $\mathbf{x}$ depends on the other columns. +1 for not mentioning dimensions. (2017-01-22)
  • 0
    Right, but the problem is that we are "corseted" by Amad27's teacher :). Unless we have an exhaustive list of results we can use or not, any approach could be remarked $\to +\infty$. (2017-01-22)
  • 0
    .... by the way, +1 for your answer. (2017-01-22)
  • 0
    @Mathematician42, rank is also banned. We haven't done matrices yet. (2017-01-22)
  • 0
    @Amad27 This seems to be a clear example of systematic torture. (2017-01-22)
  • 0
    @FernandoRevilla, I'll agree -- but I suppose the faculty is just trying to move everything slowly? Thanks for teaching these techniques though. (2017-01-22)
  • 0
    @Amad27 Don't worry, not your fault. (2017-01-22)
  • 0
    Out of curiosity (I really like this solution): how does showing that both ranks equal $3$ mean a solution exists? How did you calculate the rank? (2017-01-22)
  • 0
    The rank of a matrix is the number of linearly independent columns (or rows; both yield the same number). In this case the matrix has the independent vectors $\mathbf{u},\mathbf{v},\mathbf{n}$ as its columns, so the rank is $3$. If $A\in \mathbb{R}^{n\times n}$ has rank $n$, then $A$ is invertible. Thus given a system of equations $AX=b$, one can solve for $X$ as $X=A^{-1}b$. (2017-01-22)
  • 0
    @Mathematician42, I looked ahead in my textbook, and there the rank is defined as the number of leading $1$'s in the row-reduced matrix. I am not aware of the definition you are talking about. (2017-01-22)
  • 0
    @Mathematician42, also, how would you write the coefficient matrix for this anyway? (2017-01-22)
  • 0
    The number of leading ones in the row-reduced form is the same as what I said; this follows from the fact that Gaussian reduction preserves the row space. You can get the matrix of the system by simply writing down the equations for each component. (2017-01-22)
  • 0
    @Mathematician42, another definition was: it is the maximal number of linearly independent vectors. So in our case, all of them are linearly independent, so the rank is $3$ by this definition? Also, what is the difference between the rank of $[A]$ and the rank of $[A\mid b]$? (2017-01-22)
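For the rank questions in the comments: here is a small Python sketch that builds the matrix $[\mathbf{u},\mathbf{v},\mathbf{u\times v}]$ column by column and computes its rank by Gaussian elimination. The vectors `u`, `v`, `x` are arbitrary example values, and the `rank` helper is just an illustration of the definition, not library code:

```python
# Compute the rank of [u, v, u x v] and of the augmented matrix by
# Gaussian elimination. u, v, x are arbitrary example vectors.

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def rank(rows, eps=1e-12):
    """Number of pivots left after Gaussian elimination."""
    m = [row[:] for row in rows]
    r = 0
    for c in range(len(m[0])):
        pivot = next((i for i in range(r, len(m)) if abs(m[i][c]) > eps), None)
        if pivot is None:
            continue  # no pivot in this column
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(len(m)):
            if i != r and abs(m[i][c]) > eps:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

u, v = [1.0, 2.0, 0.0], [0.0, 1.0, 1.0]
n = cross(u, v)
x = [4.0, -1.0, 2.5]  # any target vector

A = [[u[i], v[i], n[i]] for i in range(3)]         # columns u, v, n
Ax = [[u[i], v[i], n[i], x[i]] for i in range(3)]  # augmented with x

print(rank(A), rank(Ax))  # both 3, so the system is solvable
```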

I'll use the notation from the previous answer, since I don't like subscripts. We are given two vectors $\mathbf{u}$ and $\mathbf{v}$, and we want to show that any given vector $\mathbf{x}$ is in the span of $\{\mathbf{u}, \mathbf{v}, \mathbf{n}\}$, where $\mathbf{n} = \mathbf{u} \times \mathbf{v}$.

First let's focus on the point $\mathbf{w} = p \mathbf{u} + q \mathbf{v}$ that is the foot of the perpendicular from $\mathbf{x}$ to the plane spanned by $\mathbf{u}$ and $\mathbf{v}$. The scalars $p$ and $q$ are unknown, as yet -- we have to find them. To get perpendicularity, we need $p$ and $q$ to satisfy the equations $$(\mathbf{x} - \mathbf{w}) \cdot \mathbf{u} = (\mathbf{x} - p \mathbf{u} - q \mathbf{v}) \cdot \mathbf{u} = 0 $$ $$ (\mathbf{x} - \mathbf{w}) \cdot \mathbf{v} = (\mathbf{x} - p \mathbf{u} - q \mathbf{v}) \cdot \mathbf{v} = 0 $$ In other words, $p$ and $q$ are the solutions of the linear system $$ p (\mathbf{u} \cdot \mathbf{u}) + q (\mathbf{u} \cdot \mathbf{v}) = \mathbf{x} \cdot \mathbf{u} $$ $$ p (\mathbf{u} \cdot \mathbf{v}) + q (\mathbf{v} \cdot \mathbf{v}) = \mathbf{x} \cdot \mathbf{v} $$ A solution exists because the determinant of this system is $(\mathbf{u} \cdot \mathbf{u})(\mathbf{v} \cdot \mathbf{v}) - (\mathbf{u} \cdot \mathbf{v})^2$, which (by the Cauchy-Schwarz inequality) is positive unless $\mathbf{u}$ and $\mathbf{v}$ are linearly dependent. Or, you can just see that the system has a solution by solving it explicitly via Cramer's rule.

Since $\mathbf{x} - \mathbf{w}$ is orthogonal to both $\mathbf{u}$ and $\mathbf{v}$, it must be parallel to $\mathbf{n} = \mathbf{u} \times \mathbf{v}$. So, there is a scalar $k$ such that $\mathbf{x} - \mathbf{w} = k \mathbf{n}$. But then $$ \mathbf{x} = \mathbf{w} + k \mathbf{n} = p \mathbf{u} + q \mathbf{v} + k \mathbf{n} $$ which means that $\mathbf{x}$ lies in the span of $\{\mathbf{u}, \mathbf{v}, \mathbf{n}\}$.
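The construction above can be carried out numerically. A minimal Python sketch, with arbitrary example vectors (the $2\times 2$ system is solved by Cramer's rule):

```python
# Foot-of-perpendicular construction: solve for p, q, then find k.
# u, v, x are arbitrary example vectors (assumed values).

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

u, v = [1.0, 2.0, 0.0], [0.0, 1.0, 1.0]
n = cross(u, v)
x = [4.0, -1.0, 2.5]

# Normal equations for the foot of the perpendicular w = p*u + q*v:
#   p (u.u) + q (u.v) = x.u
#   p (u.v) + q (v.v) = x.v
det = dot(u, u) * dot(v, v) - dot(u, v) ** 2  # > 0 unless u, v dependent
p = (dot(x, u) * dot(v, v) - dot(x, v) * dot(u, v)) / det
q = (dot(u, u) * dot(x, v) - dot(u, v) * dot(x, u)) / det

w = [p * ui + q * vi for ui, vi in zip(u, v)]
diff = [xi - wi for xi, wi in zip(x, w)]  # orthogonal to u and v, so || n
k = dot(diff, n) / dot(n, n)

recon = [p * ui + q * vi + k * ni for ui, vi, ni in zip(u, v, n)]
err = max(abs(ri - xi) for ri, xi in zip(recon, x))
print(err)  # ~0: x = p*u + q*v + k*n
```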

  • 0
    Your approach is elegant. (+1) (2017-01-22)
  • 0
    Thanks. It's just 3D geometry, which is what you have to use if more advanced methods are prohibited. The question is intended to build geometric intuition, I suppose. (2017-01-22)
  • 0
    I'm going to nitpick. "Since $\mathbf{x}-\mathbf{w}$ is orthogonal to both $\mathbf{u}$ and $\mathbf{v}$, it must be parallel to $\mathbf{n}$" Why? Here you seem to use that the orthogonal complement of $\text{Span}\left\{\mathbf{u}, \mathbf{v}\right\}$ is one-dimensional. But how do you know that? (2017-01-22)
  • 0
    In fact you are first considering the orthogonal projection onto the $2$-dimensional space spanned by $\mathbf{u}$ and $\mathbf{v}$. You find this by saying a certain determinant is non-zero, which is the same as saying that the matrix of the system of equations has full rank. So that's the same argument as the other answer (which we are apparently not allowed to use). After projecting onto this $2$-dimensional subspace, you state there is only one dimension left. I don't think your answer is bad; I'm just saying that these restrictions are idiotic. (2017-01-22)
  • 0
    I agree that my argument is basically the same as yours. It's just that it doesn't mention matrices or rank. (2017-01-22)
  • 0
    Regarding your other comment ... we know that $\mathbf{n}$ is orthogonal to both $\mathbf{u}$ and $\mathbf{v}$, by the definition of the cross product. If $\mathbf{x} - \mathbf{w}$ has this same property, it must be parallel to $\mathbf{n}$. We're working in 3D space, so geometric arguments like this seem valid (I hope). (2017-01-22)
  • 0
    Well, it works since the cross product gives you the unique one-dimensional orthogonal complement of a two-dimensional subspace. But the fact that this complement is one-dimensional needs a proof. Such a proof is not difficult, but now we cannot use dimensions or ranks. (2017-01-22)
  • 0
    Well, anyway, +1 to both, since I do believe that the teacher was indeed after a more geometric argument. (He basically forbids any other arguments :) ) (2017-01-22)

Another approach ...

We can actually show something more general: if the three vectors $\mathbf{u}$, $\mathbf{v}$, $\mathbf{w}$ are not coplanar, then they span $\mathbb{R}^3$. Here's how:

Given a vector $\mathbf{r}$, we need to show that it can be written as a linear combination of $\mathbf{u}$, $\mathbf{v}$, $\mathbf{w}$. In other words, we have to find scalars $x$, $y$, $z$ such that $$ x\mathbf{u} + y\mathbf{v} + z\mathbf{w} = \mathbf{r} $$ This is a linear system of equations, which can be written $$ \left[\begin{matrix} \uparrow & \uparrow & \uparrow \\ \mathbf{u} & \mathbf{v} & \mathbf{w} \\ \downarrow & \downarrow & \downarrow \\ \end{matrix} \right] \left[ \begin{matrix} x \\ y \\ z \\ \end{matrix}\right] = \mathbf{r} $$ where $\mathbf{u}$, $\mathbf{v}$, $\mathbf{w}$ are the columns of the matrix. The system has a solution provided the determinant of its matrix is non-zero. But this determinant is just $\mathbf{u} \cdot (\mathbf{v} \times \mathbf{w})$. This triple product measures the (signed) volume of the parallelepiped having $\mathbf{u}$, $\mathbf{v}$, $\mathbf{w}$ as edges, so it will be non-zero precisely when $\mathbf{u}$, $\mathbf{v}$, $\mathbf{w}$ are not coplanar.

If you prefer not to invoke matrix theory, you can just apply Cramer's rule to the three component equations of the linear system. Again, you will see that the system has a unique solution provided $\mathbf{u} \cdot (\mathbf{v} \times \mathbf{w}) \ne 0$.
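The Cramer's-rule solution can be written entirely with triple products. Here is a Python sketch, using arbitrary example vectors:

```python
# Solve x*u + y*v + z*w = r by Cramer's rule, using triple products.
# u, v, w, r are arbitrary example vectors (assumed values).

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def triple(a, b, c):
    """Scalar triple product a . (b x c) = det[a b c]."""
    return dot(a, cross(b, c))

u, v, w = [1.0, 2.0, 0.0], [0.0, 1.0, 1.0], [1.0, 0.0, 3.0]
r = [4.0, -1.0, 2.5]

t = triple(u, v, w)  # non-zero iff u, v, w are not coplanar

# Cramer's rule: each numerator replaces one column of [u v w] by r.
x = triple(r, v, w) / t
y = triple(u, r, w) / t
z = triple(u, v, r) / t

recon = [x * ui + y * vi + z * wi for ui, vi, wi in zip(u, v, w)]
err = max(abs(rc - rr) for rc, rr in zip(recon, r))
print(err)  # ~0: x*u + y*v + z*w reproduces r
```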