1

I'm writing a linear algebra exam soon and I'd like to know a fast way to solve a task like that:

$$u=\begin{pmatrix} 0\\ 0\\ 1 \end{pmatrix}, v= \begin{pmatrix} 1\\ 0\\ 1 \end{pmatrix}, w = \begin{pmatrix} 0\\ 1\\ 1 \end{pmatrix}$$

Show that $(u,v,w)$ is a basis of the vector space $\mathbb{R}^{3}.$

I would start by checking if these vectors are linearly independent. I do this by checking determinant $\neq 0$:

$$\det\begin{pmatrix} 0 & 1 & 0\\ 0 & 0 & 1\\ 1 & 1 & 1 \end{pmatrix}$$

Using the rule of Sarrus, the determinant is $1 \neq 0$, and thus these vectors are linearly independent.
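As a quick numerical sanity check of this step (my own sketch, not part of the original argument), the rule of Sarrus can be spelled out in a few lines of Python; the matrix below has $u$, $v$, $w$ as its columns:

```python
def det3(m):
    """Determinant of a 3x3 matrix (given as a list of rows) via the rule of Sarrus."""
    return (m[0][0] * m[1][1] * m[2][2]
            + m[0][1] * m[1][2] * m[2][0]
            + m[0][2] * m[1][0] * m[2][1]
            - m[0][2] * m[1][1] * m[2][0]
            - m[0][0] * m[1][2] * m[2][1]
            - m[0][1] * m[1][0] * m[2][2])

# Columns are u = (0,0,1), v = (1,0,1), w = (0,1,1).
M = [[0, 1, 0],
     [0, 0, 1],
     [1, 1, 1]]

print(det3(M))  # -> 1, nonzero, so u, v, w are linearly independent
```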

Now let $x,y,z \in \mathbb{R}$

$$\begin{pmatrix} x\\ y\\ z \end{pmatrix}=\lambda_{1}\begin{pmatrix} 0\\ 0\\ 1 \end{pmatrix}+ \lambda_{2}\begin{pmatrix} 1\\ 0\\ 1 \end{pmatrix}+ \lambda_{3}\begin{pmatrix} 0\\ 1\\ 1 \end{pmatrix}$$

$$x = \lambda_{2}$$

$$y = \lambda_{3}$$

$$z = \lambda_{1}+\lambda_{2}+\lambda_{3}$$

Reading off the solution $\lambda_{2}=x$, $\lambda_{3}=y$, $\lambda_{1}=z-x-y$, we see that it exists and is unique, and thus $(u,v,w)$ is a basis of the vector space $\mathbb{R}^{3}$.
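The closed-form solution to this system can be checked mechanically. The helpers below are my own sketch (not from the post): one computes the coefficients $\lambda_{1}=z-x-y$, $\lambda_{2}=x$, $\lambda_{3}=y$, the other recombines them, so every $(x,y,z)$ round-trips:

```python
def coords_in_basis(x, y, z):
    """Return (l1, l2, l3) with (x, y, z) = l1*u + l2*v + l3*w,
    using the closed-form solution l2 = x, l3 = y, l1 = z - x - y."""
    return (z - x - y, x, y)

def combine(l1, l2, l3):
    """Compute l1*u + l2*v + l3*w for u=(0,0,1), v=(1,0,1), w=(0,1,1)."""
    return (l2, l3, l1 + l2 + l3)

l1, l2, l3 = coords_in_basis(4, -2, 7)
print(combine(l1, l2, l3))  # -> (4, -2, 7): every vector is reachable, uniquely
```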


Did I solve the task correctly and can you tell me better ways if there are some?

  • 1
You can use Gaussian elimination: you get three rows which are not zero, conclude that the vectors are linearly independent, and since you have three vectors in a space of dimension $3$, they form a basis of the space. (2017-02-26)
  • 1
The second part is enough since it gives a unique solution. (No solution for some vector and it wouldn't be a basis because the vectors wouldn't span the space; more than one solution and it wouldn't be a basis because the vectors would be linearly dependent.) Also, I'd say that the first part is enough on its own, since it's probably known to you that any set in $\mathbb R^3$ with more than three vectors is linearly dependent. (2017-02-26)
  • 0
@Itay4 Thank you, I didn't know that was possible! Would it also work with other matrices, such as $3 \times 2$? (2017-02-26)
  • 0
@GitGud Would the second part really be enough? Because we cannot know whether $\lambda_{1},\lambda_{2},\lambda_{3}$ equal zero or not. So I did the determinant computation first to know that the vectors are not linearly dependent. (2017-02-26)
  • 1
For the second part, if the solution is unique, then the coefficient matrix has full rank, so its columns (your three vectors) are linearly independent. Another way of looking at it is that for there to be any solution to this system of equations, $(x,y,z)^T$ must lie in the column space of the coefficient matrix. You've shown that this column space consists of *all* of $\mathbb R^3$, which implies linear independence of the columns. (2017-02-26)
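The Gaussian-elimination route suggested in the comments can be sketched as follows (my own helper, using exact rational arithmetic via `fractions` to avoid floating-point issues): three nonzero rows after elimination means rank $3$, hence linear independence.

```python
from fractions import Fraction

def rank(rows):
    """Row-reduce a small matrix with partial pivoting and count the
    nonzero rows that remain, i.e. its rank."""
    m = [[Fraction(x) for x in r] for r in rows]
    r = 0
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue  # no pivot in this column
        m[r], m[piv] = m[piv], m[r]
        for i in range(r + 1, len(m)):
            f = m[i][c] / m[r][c]
            m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# Rows are u, v, w; rank 3 for a 3x3 matrix means linear independence,
# and three linearly independent vectors in R^3 form a basis.
print(rank([[0, 0, 1], [1, 0, 1], [0, 1, 1]]))  # -> 3
```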

4 Answers

3

Once you have proved that the $3$ vectors are linearly independent, you automatically have that they are a basis for $\mathbb{R}^3$: they generate a subspace of dimension $3$ inside a space of dimension $3$, so they must generate the entire space. As for proving linear independence, the determinant approach proposed in the question is general and works well.

In this particular case, a simpler approach is to see that $v-u=(1,0,0)^T$, $w-u=(0,1,0)^T$, $u=(0,0,1)^T$ form what is called the canonical basis of $\mathbb{R}^3$, so $u,v,w$ must also form a basis of $\mathbb{R}^3$.
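This observation takes one line to verify numerically (my own sketch): the differences really are the standard basis vectors $e_1, e_2, e_3$, which all lie in the span of $u, v, w$.

```python
u, v, w = (0, 0, 1), (1, 0, 1), (0, 1, 1)

def sub(a, b):
    """Componentwise difference of two vectors given as tuples."""
    return tuple(x - y for x, y in zip(a, b))

print(sub(v, u))  # -> (1, 0, 0), i.e. e1
print(sub(w, u))  # -> (0, 1, 0), i.e. e2
print(u)          # -> (0, 0, 1), i.e. e3
```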

3

You were done the second you proved they were linearly independent: if you have $n$ linearly independent vectors in a space of dimension $n$, then those vectors form a basis.

2

Your answer is correct.

Any $3$ linearly independent vectors in a $3$-dimensional vector space are a basis for that vector space. You can check this, as you did correctly, by calculating that determinant. Notice that when you have a more complex $3$-dimensional vector space whose vectors are, for example, functions, you can perform the same trick using the coordinates of those functions relative to a certain basis you do know. If the determinant of those coordinates is nonzero, you have found a basis as well!
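A hypothetical illustration of this trick (my own example, not from the answer): in the space of polynomials of degree $\le 2$ with the known basis $(1, x, x^2)$, the polynomials $p_1 = 1+x$, $p_2 = x+x^2$, $p_3 = 1+x^2$ have coordinate vectors $(1,1,0)$, $(0,1,1)$, $(1,0,1)$, and a nonzero determinant of the coordinate matrix shows they form a basis:

```python
def det3(m):
    """3x3 determinant by cofactor expansion along the first row."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

# Columns are the coordinate vectors of p1 = 1+x, p2 = x+x^2, p3 = 1+x^2
# relative to the basis (1, x, x^2).
C = [[1, 0, 1],
     [1, 1, 0],
     [0, 1, 1]]

print(det3(C))  # -> 2, nonzero, so (p1, p2, p3) is a basis of this space
```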

  • 0
Thanks! Some people said I can stop after this step (after knowing that the determinant is $\neq 0$). Can you confirm that? (2017-02-26)
  • 1
That's true. If the determinant of a matrix is nonzero, then its columns are linearly independent. As the columns represent your vectors, the vectors are linearly independent. (2017-02-26)
2

Computing a determinant will always work to determine linear independence, but in some cases you can determine it by inspection. By definition, a set of vectors is linearly independent if the only linear combination of them that sums to zero has all zero coefficients. Observe that there's no way to generate a nonzero first component using only $u$ and $w$, so the coefficient of $v$ in any linear combination that produces $0$ must be $0$. A similar observation on the second component shows that the coefficient of $w$ must also be zero, and that in turn means that the coefficient of $u$ must be zero as well, to eliminate the third component of the sum. As others have pointed out, at this point you're done: by the definition of dimension, this set of three linearly independent vectors must span the entire space.
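The inspection argument can be written out as a tiny back-substitution (my own sketch): reading the components in the order first, second, third forces each coefficient of $a\,u + b\,v + c\,w = 0$ to be zero in turn.

```python
def forced_coefficients():
    """Deduce the only possible (a, b, c) with a*u + b*v + c*w = 0,
    following the component-by-component inspection argument."""
    # First component of a*u + b*v + c*w is b (only v contributes), so b = 0.
    b = 0
    # Second component is c (only w contributes), so c = 0.
    c = 0
    # Third component is a + b + c = 0; with b = c = 0 this forces a = 0.
    a = 0 - b - c
    return a, b, c

print(forced_coefficients())  # -> (0, 0, 0): only the trivial combination
```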