
These vectors form a basis of $\mathbb R^3$: $\begin{bmatrix}1\\0\\-1\\\end{bmatrix},\begin{bmatrix}2\\-1\\0\\\end{bmatrix} ,\begin{bmatrix}1\\2\\1\\\end{bmatrix}$

Can someone show how to use the Gram-Schmidt process to generate an orthonormal basis of $\mathbb R^3$?

  • I can't show what I have as it is a huge mess of eraser marks. Trust me, I've tried applying what my teacher "taught" me. I came here to hopefully LEARN the real process of Gram-Schmidt. (2011-12-05)

2 Answers


Let $u_1=\begin{bmatrix}1\\0\\-1\\\end{bmatrix} ,u_2=\begin{bmatrix}2\\-1\\0\\\end{bmatrix} ,u_3=\begin{bmatrix}1\\2\\1\\\end{bmatrix}$. To find the required orthonormal basis $\{w_1,w_2,w_3\}$, first we have $w_1=\frac{u_1}{\|u_1\|}=\begin{bmatrix}\frac{1}{\sqrt{2}}\\0\\-\frac{1}{\sqrt{2}}\\\end{bmatrix}.$
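If you want to check this step numerically, here is a tiny sketch (I'm assuming Python with NumPy; this is not part of the original answer):

```python
import numpy as np

u1 = np.array([1.0, 0.0, -1.0])
w1 = u1 / np.linalg.norm(u1)  # normalize u1
print(w1)                     # [ 0.70710678  0.         -0.70710678], i.e. (1/sqrt(2), 0, -1/sqrt(2))
```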

Second, find $u_2-(w_1\cdot u_2)w_1$; since $w_1\cdot u_2=\sqrt{2}$, we get $u_2-(w_1\cdot u_2)w_1=\begin{bmatrix}2\\-1\\0\\\end{bmatrix}-\sqrt{2}\begin{bmatrix}\frac{1}{\sqrt{2}}\\0\\-\frac{1}{\sqrt{2}}\\\end{bmatrix}=\begin{bmatrix}1\\-1\\1\\\end{bmatrix}.$ By taking the dot product, you can see that $w_1$ is orthogonal to the above vector: $w_1\cdot[u_2-(w_1\cdot u_2)w_1]=w_1\cdot u_2-(w_1\cdot u_2)w_1\cdot w_1=0$ since $w_1$ is a unit vector. So we can take $w_2=\frac{u_2-(w_1\cdot u_2)w_1}{\|u_2-(w_1\cdot u_2)w_1\|}=\begin{bmatrix}\frac{1}{\sqrt3}\\-\frac{1}{\sqrt3}\\\frac{1}{\sqrt3}\\\end{bmatrix}.$
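The same step can be checked numerically (again a sketch assuming NumPy):

```python
import numpy as np

u2 = np.array([2.0, -1.0, 0.0])
w1 = np.array([1.0, 0.0, -1.0]) / np.sqrt(2)

v2 = u2 - np.dot(w1, u2) * w1  # remove the component of u2 along w1; gives (1, -1, 1)
w2 = v2 / np.linalg.norm(v2)   # normalize to get w2
print(np.dot(w1, w2))          # ~0, confirming w1 and w2 are orthogonal
```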

Finally, find $u_3-(w_1\cdot u_3)w_1-(w_2\cdot u_3)w_2$; since $w_1\cdot u_3=0$ and $w_2\cdot u_3=0$, we get $u_3-(w_1\cdot u_3)w_1-(w_2\cdot u_3)w_2=\begin{bmatrix}1\\2\\1\\\end{bmatrix}-0\cdot\begin{bmatrix}\frac{1}{\sqrt{2}}\\0\\-\frac{1}{\sqrt{2}}\\\end{bmatrix}-0\cdot\begin{bmatrix}\frac{1}{\sqrt3}\\-\frac{1}{\sqrt3}\\\frac{1}{\sqrt3}\\\end{bmatrix}=\begin{bmatrix}1\\2\\1\\\end{bmatrix}.$ By taking the dot products, you can again check that $w_1$ and $w_2$ are both orthogonal to the above vector. So we can take $w_3=\frac{u_3-(w_1\cdot u_3)w_1-(w_2\cdot u_3)w_2}{\|u_3-(w_1\cdot u_3)w_1-(w_2\cdot u_3)w_2\|}=\begin{bmatrix}\frac{1}{\sqrt6}\\\frac{2}{\sqrt6}\\\frac{1}{\sqrt6}\\\end{bmatrix}.$
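Putting the three steps together, here is a minimal Gram-Schmidt sketch that reproduces $w_1, w_2, w_3$ and verifies orthonormality (NumPy assumed; an illustration, not part of the original answer):

```python
import numpy as np

def gram_schmidt(vectors):
    """Turn a list of linearly independent vectors into an orthonormal basis."""
    basis = []
    for v in vectors:
        for w in basis:
            v = v - np.dot(w, v) * w         # subtract the projection onto each earlier w
        basis.append(v / np.linalg.norm(v))  # normalize what is left
    return np.array(basis)

u1 = np.array([1.0, 0.0, -1.0])
u2 = np.array([2.0, -1.0, 0.0])
u3 = np.array([1.0, 2.0, 1.0])

W = gram_schmidt([u1, u2, u3])
print(W)          # rows are w1, w2, w3
print(W @ W.T)    # close to the 3x3 identity, so the rows are orthonormal
```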

  • Awesome Paul... FINALLY I can understand. (2011-12-05)

Let's look at this in two dimensions first. After this you should know how to do it in three!

Suppose that you are working in the plane and have two linearly independent vectors $v$ and $w$. You want to make $v$ and $w$ orthogonal to each other with respect to the standard Euclidean inner product. How can you do it?

Notice that we can subtract a suitable multiple of $w$ from $v$, say $cw$ where $c$ is some constant, so that $v - cw$ is orthogonal to $w$. After some algebraic manipulation we find that $c$ must be equal to

$\frac{\langle v,w \rangle }{\langle w,w \rangle}$

simply by solving the equation $\langle v - cw, w \rangle = 0$ for $c$. You now have an orthogonal basis for $\mathbb{R}^2$, namely the vectors

$w \quad \text{and} \quad v - \frac{\langle v,w \rangle }{\langle w,w \rangle} w.$

To find an orthonormal basis, you just need to divide each of these vectors by its length.
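Here is the same two-dimensional computation as a short sketch (NumPy assumed; the particular $v$ and $w$ below are made-up example values, not from the question):

```python
import numpy as np

v = np.array([3.0, 1.0])          # example vectors in the plane
w = np.array([1.0, 2.0])

c = np.dot(v, w) / np.dot(w, w)   # solves <v - c*w, w> = 0 for c
v_perp = v - c * w

print(np.dot(v_perp, w))          # ~0: v - c*w is orthogonal to w
# Dividing by the lengths gives an orthonormal basis of R^2:
basis = [w / np.linalg.norm(w), v_perp / np.linalg.norm(v_perp)]
print(basis)
```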

In $\mathbb{R}^3$ you just need to apply this process recursively, as shown in the Wikipedia link in the comments above. However, you first need to check that your vectors are linearly independent! You can check this by computing the determinant of the matrix whose columns are the vectors stated in your question.
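For the three vectors in the question, the determinant check might look like this (again assuming NumPy):

```python
import numpy as np

# Columns are the vectors from the question.
A = np.array([[ 1,  2, 1],
              [ 0, -1, 2],
              [-1,  0, 1]], dtype=float)

print(np.linalg.det(A))  # approximately -6, nonzero, so the columns are linearly independent
```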

  • @jmendegan Just a general comment: if you get stuck again on a question like this, i.e. applying some method to calculate something, I suggest trying the method out in small cases first, for example in two dimensions instead of three. (2011-12-05)