1

For a non-zero vector $\mathbf{v} \in \mathbb{E}^n$, I need to show that the collection $W$ of vectors orthogonal to $\mathbf{v}$ forms an $(n-1)$-dimensional subspace of $\mathbb{E}^n$. I've been working with a spanning set of the form $\alpha_1\mathbf{w}_1 + \dotsb + \alpha_{n-1}\mathbf{w}_{n-1}$, but I'm having trouble seeing how to prove that it is linearly independent, or why $W$ must have dimension $n-1$. Thanks.

  • 0
    At least, then, skimming through Wikipedia's account of Gram–Schmidt ought to inspire an attack on the missing piece of Chris's hint. – 2011-10-11

4 Answers

2

Since $v$ is non-zero, the set $\{v\}$ is linearly independent, so it can be completed to a basis $B=\{v,w_1,\dots,w_{n-1}\}$ of $E^n$, which necessarily has $n$ elements. Now consider the vectors $w_i'=w_i-\frac{v\cdot w_i}{v\cdot v}v$ for $i\in\{1,\dots,n-1\}$. An easy computation shows that the $w_i'$ are orthogonal to $v$. You should next check that they are linearly independent.
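The construction above can be checked numerically. A minimal sketch, assuming a concrete non-zero $v$ in $E^4$ and completing $\{v\}$ with three standard basis vectors (these particular choices are illustrative, not part of the answer):

```python
import numpy as np

n = 4
v = np.array([1.0, 2.0, 0.0, -1.0])          # assumed non-zero v
ws = [np.eye(n)[i] for i in range(1, n)]      # w_1, ..., w_{n-1}: e_2, e_3, e_4
# {v, e_2, e_3, e_4} is linearly independent, so it is a basis of E^4

# w_i' = w_i - (v.w_i / v.v) v : subtract the component along v
ws_prime = [w - (v @ w) / (v @ v) * v for w in ws]

# each w_i' is orthogonal to v
for wp in ws_prime:
    assert abs(v @ wp) < 1e-12

# the w_i' are linearly independent: stacked as rows they have rank n-1
rank = np.linalg.matrix_rank(np.array(ws_prime))
print(rank)  # 3, i.e. n - 1
```

Linear independence of the $w_i'$ follows the same way symbolically: a vanishing combination of the $w_i'$ gives a vanishing combination of the basis $B$.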

2

I think the simplest way is this. Let ${\bf v}=(a_1,a_2,\dots,a_n)$ and ${\bf w}=(w_1,w_2,\dots,w_n)$. The orthogonality condition is a single homogeneous linear equation in $n$ unknowns, $a_1w_1+a_2w_2+\cdots+a_nw_n=0.$ Do you know how to find the dimension of the solution space of a system of homogeneous equations?

EDIT: if not, then see my comment, currently the third one down from this answer.

(Aside to Agusti Roig: perhaps I should have done as you suggested, but what I've done instead should still help, as well as saving a lot of typing!)
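The single-equation view can be sketched numerically: the orthogonality condition is the $1 \times n$ system $A{\bf w} = 0$ with $A = [a_1 \ \cdots \ a_n]$, and the solution space has dimension $n - \operatorname{rank}(A)$. A minimal check with an assumed $v$ in $E^5$:

```python
import numpy as np

# assumed non-zero v as a 1 x n coefficient matrix
v = np.array([[3.0, -1.0, 0.0, 2.0, 5.0]])

n = v.shape[1]
# rank is 1 because v is non-zero, so dim W = n - 1
dim_W = n - np.linalg.matrix_rank(v)
print(dim_W)  # 4, i.e. n - 1
```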

  • 0
    @Gerry. You're right: your idea is the simplest one. But maybe you could add this comment of yours in your answer: perhaps it would help james. – 2011-10-11
1

You can do it in several ways.

First, you can consider a map

$ f: \mathbb{E}^n \longrightarrow \mathbb{R} \ , \qquad f(u) = v\cdot u \ . $

Prove that $f$ is linear. By the rank-nullity theorem, you'll have

$ \mathrm{dim}\, \mathbb{E}^n = \mathrm{dim}\,\mathrm{Im}\, f + \mathrm{dim}\,\mathrm{Ker}\, f \ . $

And now you must prove two more things: (1) $\mathrm{Im}\, f = \mathbb{R}$ (hint: consider vectors $u$ of the form $\lambda v$). And (2) $\mathrm{Ker}\, f = W$. As a consequence, you'll have everything: that $W$ is a subspace, and its dimension.
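A numerical sketch of this rank–nullity argument, assuming a concrete $v$ in $E^4$: the map $f(u) = v \cdot u$ is represented by the $1 \times n$ matrix whose row is $v$, so $\dim \mathrm{Im}\, f$ is its rank and $\dim \mathrm{Ker}\, f$ is the nullity.

```python
import numpy as np

v = np.array([2.0, 0.0, -1.0, 3.0])   # assumed non-zero v
A = v.reshape(1, -1)                  # matrix of the linear map f(u) = v.u

n = v.size
rank = np.linalg.matrix_rank(A)       # dim Im f
nullity = n - rank                    # dim Ker f = dim W, by rank-nullity

assert rank == 1                      # Im f = R, since f(v) = v.v != 0
assert nullity == n - 1               # so W has dimension n - 1
print(nullity)
```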

Another approach: if you know what an (internal) direct sum is, prove that

$ \mathbb{E}^n = \mathrm{span} (v) \oplus W \ . $

For this, you'll need to: (1) Prove that $W$ is a subspace (hint: just look at the definition of subspace). (2) Prove that $\mathrm{span} (v) \cap W = \left\{ 0\right\} $. And (3), the tough part of this, prove that you can write every vector $u \in \mathbb{E}^n$ as $u = \lambda v + w$, for some $\lambda \in \mathbb{R}$ and $w\in W$. This can be done easily if you know about orthogonal projections. In this case, $\lambda v$ is just the orthogonal projection of $u$ onto $\mathrm{span} (v)$ and $w = u -\lambda v$. (Hint: $\lambda = \dfrac{v\cdot u}{v\cdot v}$.)

As a consequence, $\mathrm{dim}\, \mathbb{E}^n = \mathrm{dim}\,\mathrm{span} (v) + \mathrm{dim}\, W$ and you're done.
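The decomposition in step (3) can be sketched directly, assuming arbitrary concrete choices of $u$ and $v$ in $E^4$: with $\lambda = \dfrac{v\cdot u}{v\cdot v}$, the remainder $w = u - \lambda v$ lands in $W$.

```python
import numpy as np

v = np.array([1.0, 1.0, 0.0, 2.0])    # assumed non-zero v
u = np.array([3.0, -1.0, 4.0, 0.5])   # arbitrary vector to decompose

lam = (v @ u) / (v @ v)   # coefficient of the orthogonal projection onto span(v)
w = u - lam * v           # remainder, which should lie in W = {x : v.x = 0}

assert abs(v @ w) < 1e-12             # w is orthogonal to v
assert np.allclose(u, lam * v + w)    # u = lambda v + w, as claimed
```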

0

Suppose $A = \{x_1, x_2,\dots,x_{n-1}, v\}$ is an orthogonal basis for $\mathbb{R}^n$ (this assumption is fine because you can take $n-1$ vectors that are orthogonal to each other and to $v$, and together with $v$ they form a basis for $\mathbb{R}^n$). That means any vector in $\mathbb{R}^n$ can be expressed as a linear combination of the vectors in $A$. Now if $u$ is orthogonal to $v$, write $u = a_1x_1 + a_2x_2 + \cdots + a_{n-1}x_{n-1} + cv$; taking the dot product of both sides with $v$ and using the orthogonality of the basis gives $c\,(v\cdot v) = 0$, so $c = 0$. This means $\{x_1, x_2,\dots,x_{n-1}\}$ is a basis for the set of vectors orthogonal to $v$ in $\mathbb{R}^n$. That's why the orthogonal space has dimension $n-1$. After that it's just a matter of checking that this set of vectors is closed under scalar multiplication and vector addition, which isn't too bad.
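This argument can be sketched numerically: QR factorization of a full-rank matrix whose first column is $v$ gives an orthonormal basis containing $v/\lVert v\rVert$, and any $u$ orthogonal to $v$ then has coefficient $0$ on that basis vector. The particular $v$ and $u$ below are assumed for illustration.

```python
import numpy as np

v = np.array([1.0, 2.0, 2.0, 0.0])            # assumed non-zero v
M = np.column_stack([v, np.eye(4)[:, 1:]])    # columns v, e_2, e_3, e_4 (full rank)
Q, _ = np.linalg.qr(M)                        # Q: orthonormal basis, Q[:,0] parallel to v

u = np.array([2.0, -1.0, 0.0, 5.0])
u = u - (v @ u) / (v @ v) * v                 # force u to be orthogonal to v

coeffs = Q.T @ u                              # coordinates of u in the orthonormal basis
assert abs(coeffs[0]) < 1e-12                 # coefficient on the v-direction is 0
assert np.allclose(Q @ coeffs, u)             # u is a combination of the other n-1 vectors
```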

  • 1
    Use dollar signs to encapsulate MathJax syntax. – 2017-07-27