For a non-zero vector $\mathbf{v} \in \mathbb{E}^n$, I need to show that the collection $W$ of vectors orthogonal to $\mathbf{v}$ forms an $(n-1)$-dimensional subspace of $\mathbb{E}^n$. I've been working with a spanning set of the form $\alpha_1\mathbf{w}_1 + \dotsb + \alpha_{n-1}\mathbf{w}_{n-1}$, but I can't see how to prove that these vectors are linearly independent, or why the dimension has to be $n-1$. Thanks
Finding a basis for an $(n-1)$-dimensional subspace of $\mathbb{E}^n$
-
Do you have the [Gram-Schmidt process](http://en.wikipedia.org/wiki/Gram%E2%80%93Schmidt_process) available? – 2011-10-11
-
Here's one way of approaching this: suppose $\{w_1, \ldots, w_k \}$ is a basis for $W$. Show that $\{w_1, \ldots , w_k, v\}$ is a basis for $\mathbb{E}^n$. Since every basis of $\mathbb{E}^n$ has $n$ elements, we have $k+1=n$. – 2011-10-11
-
Unfortunately not, I've read about it but apparently this is solvable without using it... – 2011-10-11
-
At least, then, skimming through Wikipedia's account of Gram-Schmidt ought to inspire an attack on the missing piece of Chris's hint. – 2011-10-11
4 Answers
Since $v$ is non-zero, the set $\{v\}$ is linearly independent, so it can be completed to a basis $B=\{v,w_1,\dots,w_{n-1}\}$ of $E^n$, which necessarily has $n$ elements. Now consider the vectors $$w_i'=w_i-\frac{v\cdot w_i}{v\cdot v}v$$ for $i\in\{1,\dots,n-1\}$. An easy computation shows that the $w_i'$ are orthogonal to $v$. You should next check that they are linearly independent.
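Just to illustrate this numerically, here is a small numpy sketch (the vectors and the dimension $n=4$ are arbitrary illustrative choices, not from the answer): we take a nonzero $v$, complete it with random vectors to a basis, and check that the projected vectors $w_i'$ are orthogonal to $v$ and still independent.

```python
import numpy as np

# Illustrative check of the construction above (random vectors, n = 4).
rng = np.random.default_rng(0)
n = 4
v = rng.normal(size=n)
W = rng.normal(size=(n - 1, n))         # candidate w_1, ..., w_{n-1}

# {v, w_1, ..., w_{n-1}} should be a basis of E^n.
B = np.vstack([v, W])
assert np.linalg.matrix_rank(B) == n

# Subtract from each w_i its component along v:  w_i' = w_i - (v.w_i / v.v) v.
W_prime = W - np.outer(W @ v / (v @ v), v)

# Each w_i' is orthogonal to v ...
assert np.allclose(W_prime @ v, 0.0)
# ... and the w_i' remain linearly independent, so dim W >= n - 1.
assert np.linalg.matrix_rank(W_prime) == n - 1
```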
I think the simplest way is this. Let $${\bf v}=(a_1,a_2,\dots,a_n), \qquad {\bf w}=(w_1,w_2,\dots,w_n).$$ The orthogonality condition is a single homogeneous linear equation in $n$ unknowns, $$a_1w_1+a_2w_2+\cdots+a_nw_n=0$$ Do you know how to find the dimension of the solution space of a system of homogeneous equations?
EDIT: if not, then see my comment, currently the third one down from this answer.
(Aside to Agusti Roig: perhaps I should have done as you suggested, but what I've done instead should still help, as well as saving a lot of typing!)
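For what it's worth, the dimension count behind this answer can be checked numerically (the coefficients below are an arbitrary illustrative choice): a single homogeneous equation $a\cdot w = 0$ with $a \ne 0$ has a $1 \times n$ coefficient matrix of rank $1$, so by rank-nullity its solution space has dimension $n-1$.

```python
import numpy as np

# One homogeneous equation a.w = 0 with a != 0 (illustrative coefficients).
a = np.array([2.0, -1.0, 3.0, 0.5])
A = a.reshape(1, -1)                   # the 1 x n coefficient matrix
n = A.shape[1]

rank = np.linalg.matrix_rank(A)        # rank 1, since a != 0
nullity = n - rank                     # rank-nullity theorem
assert rank == 1
assert nullity == n - 1                # the solution space W has dim n - 1
```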
-
No, but this sounds very helpful – 2011-10-11
-
Hint: apply the rank-nullity theorem. :-) – 2011-10-11
-
Or, if you don't know the rank-nullity theorem, do this: at least one of the $a_i$ is not zero. Let's assume $a_n\ne0$. Then one solution has $w_1=1$, $w_n=-a_1/a_n$, and all the other $w_i=0$. A second solution has $w_2=1$, $w_n=-a_2/a_n$, all the other $w_i=0$. And so on, up to solution number $n-1$, which has $w_{n-1}=1$, $w_n=-a_{n-1}/a_n$, all the other $w_i=0$. So we get $n-1$ elements of $W$. It's easy to check that they are linearly independent, and it isn't too hard to show that every element of $W$ is a linear combination of these elements. – 2011-10-11
-
@Gerry. You're right: your idea is the simplest one. But maybe you could add this comment of yours in your answer: perhaps it would help james. – 2011-10-11
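The explicit solutions described in the comments above are easy to verify numerically; here is a sketch with hypothetical coefficients (assuming $a_n \ne 0$, as in the comment): solution $i$ has $w_i = 1$, $w_n = -a_i/a_n$, and all other entries $0$.

```python
import numpy as np

# Illustrative coefficients with a_n != 0 (not from the post).
a = np.array([1.0, 2.0, -1.0, 4.0])
n = len(a)

# Build the n-1 explicit solutions from the comment.
sols = np.zeros((n - 1, n))
for i in range(n - 1):
    sols[i, i] = 1.0                 # w_{i+1} = 1
    sols[i, n - 1] = -a[i] / a[n - 1]  # w_n = -a_{i+1} / a_n

# Each row satisfies a.w = 0, so each lies in W.
assert np.allclose(sols @ a, 0.0)
# The identity block in the first n-1 columns makes them independent.
assert np.linalg.matrix_rank(sols) == n - 1
```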
You can do it in several ways.
First, you can consider a map
$$ f: \mathbb{E}^n \longrightarrow \mathbb{R} \ , \qquad f(u) = v\cdot u \ . $$
Prove that $f$ is linear. By the rank-nullity theorem, you'll have
$$ \mathrm{dim}\, \mathbb{E}^n = \mathrm{dim}\,\mathrm{Im}\, f + \mathrm{dim}\,\mathrm{Ker}\, f \ . $$
And now you must prove two more things: (1) $\mathrm{Im}\, f = \mathbb{R}$ (hint: consider vectors $u$ of the form $\lambda v$). And (2) $\mathrm{Ker}\, f = W$. As a consequence, you'll have everything you need: $W$ is a subspace (it is a kernel), and its dimension is $n - 1$.
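As a numeric sketch of this argument (the vector $v$ below is an arbitrary illustrative choice): $f(u) = v\cdot u$ is represented by the $1\times n$ matrix whose single row is $v$, its rank is $1$ because $f(v) = v\cdot v \ne 0$, and the SVD even exhibits a basis of $\mathrm{Ker}\, f = W$.

```python
import numpy as np

# Matrix of f : E^n -> R, f(u) = v.u  (illustrative v, n = 4).
v = np.array([1.0, -2.0, 0.5, 3.0])
F = v.reshape(1, -1)
n = F.shape[1]

# Im f = R since f(v) = v.v != 0, so dim Im f = rank(F) = 1.
assert np.linalg.matrix_rank(F) == 1

# Rank-nullity: dim Ker f = n - 1.  The SVD gives a basis: the rows of Vt
# after the first span the orthogonal complement of v.
_, _, Vt = np.linalg.svd(F)
kernel = Vt[1:]                        # a basis of Ker f = W
assert kernel.shape == (n - 1, n)
assert np.allclose(kernel @ v, 0.0)   # every basis vector is orthogonal to v
```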
Another approach: if you know what an (internal) direct sum is, prove that
$$ \mathbb{E}^n = \mathrm{span} (v) \oplus W \ . $$
For this, you'll need to: (1) Prove that $W$ is a subspace (hint: just look at the definition of subspace). (2) Prove that $\mathrm{span} (v) \cap W = \left\{ 0\right\} $. And (3), the tough part of this, prove that you can write every vector $u \in \mathbb{E}^n$ as $u = \lambda v + w$, for some $\lambda \in \mathbb{R}$ and $w\in W$. This can be done easily if you know about orthogonal projections. In this case, $\lambda v$ is just the orthogonal projection of $u$ onto $\mathrm{span} (v)$ and $w = u -\lambda v$. (Hint: $\lambda = \dfrac{v\cdot u}{v\cdot v}$.)
As a consequence, $\mathrm{dim}\, \mathbb{E}^n = \mathrm{dim}\,\mathrm{span} (v) + \mathrm{dim}\, W$ and you're done.
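The decomposition in step (3) of the second approach can be sketched numerically as follows (the vectors here are random illustrative choices, with $\lambda = \frac{v\cdot u}{v\cdot v}$ exactly as in the hint):

```python
import numpy as np

# Illustrative vectors in E^5 (random, not from the answer).
rng = np.random.default_rng(1)
n = 5
v = rng.normal(size=n)
u = rng.normal(size=n)

lam = (v @ u) / (v @ v)   # coefficient of the orthogonal projection onto span(v)
w = u - lam * v           # the remainder, which should lie in W

assert np.isclose(w @ v, 0.0)        # w is orthogonal to v, i.e. w is in W
assert np.allclose(lam * v + w, u)   # u = lambda*v + w, so E^n = span(v) + W
```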
Suppose $A = \{x_1, x_2, \dots, x_{n-1}, v\}$ is an orthogonal basis for $\mathbb{R}^n$ (this assumption is fine because you can always take $n-1$ vectors that are orthogonal to each other and to $v$ to form a basis for $\mathbb{R}^n$). That means any vector in $\mathbb{R}^n$ can be expressed as a linear combination of the vectors in $A$. In particular, any vector orthogonal to $v$ can be expressed as $a_1x_1 + a_2x_2 + \cdots + a_{n-1}x_{n-1} + 0\cdot v$: taking the dot product with $v$ shows the coefficient of $v$ must be $0$, since the basis is orthogonal. This means that $\{x_1, x_2, \dots, x_{n-1}\}$ is a basis for the set of all vectors orthogonal to $v$ in $\mathbb{R}^n$. That's why the orthogonal space has dimension $n-1$. After that it's just a matter of checking that this set of vectors is closed under scalar multiplication and vector addition (i.e. that it really is a subspace), which isn't too bad.
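The assumption in this answer can also be realized concretely (a hypothetical numeric instance; the random vectors are my own choice): a QR factorization of a matrix whose first column is $v$ produces exactly such an orthogonal basis $\{x_1, \dots, x_{n-1}, v/\lVert v\rVert\}$.

```python
import numpy as np

# Illustrative v in R^4; extend it to an orthogonal basis via QR.
rng = np.random.default_rng(2)
n = 4
v = rng.normal(size=n)
M = np.column_stack([v, rng.normal(size=(n, n - 1))])

Q, _ = np.linalg.qr(M)    # orthonormal columns; Q[:, 0] is v / ||v|| (up to sign)
X = Q[:, 1:].T            # the n-1 vectors x_1, ..., x_{n-1}

assert np.allclose(X @ v, 0.0)               # each x_i is orthogonal to v
assert np.linalg.matrix_rank(X) == n - 1     # and they are linearly independent
```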