4

For some reason my book distinguishes the two names.

If a set is an orthogonal set, doesn't that make it immediately a basis for some subspace $W$ since all the vectors in the orthogonal set are linearly independent anyways? So why do we have two different words for the same thing?

  • 0
    In the text from which I'm teaching this quarter (I think the author is Lay), an orthogonal set is allowed to contain the zero vector, which obviously precludes the set from being a basis for anything. An orthogonal set consisting of non-zero vectors is an orthogonal basis for its span, I agree.2012-02-22
  • 0
    Are orthogonal sets always defined as subsets of some fixed vector space $V$? Because then the word "basis" would imply (at least to me) that they are a basis for $V$, not just for some subspace.2012-02-22
  • 0
    @Dylan, yeah that's the book I am using!2012-02-22
  • 0
    Can you please quote your book's definition of "orthogonal set"?2012-02-22
  • 0
A set of vectors $\{u_1, \dots, u_p\}$ in $\mathbb{R}^n$ is said to be an orthogonal set if each pair of distinct vectors from the set is orthogonal, that is, if $u_i \cdot u_j = 0$ whenever $i \neq j$.2012-02-23
  • 0
    @mim: By convention, if we say "basis" without specifying basis for *what*, we mean the basis of the whole space of discourse.2012-02-23

4 Answers

6

When you say orthogonal basis, you mean that the set is a basis for the whole given space. Every orthogonal set of nonzero vectors is a basis for some subspace of the space, but not necessarily for the whole space. (A numerical sketch follows the comments below.)

  • 0
    How could that be? Could you give me an example? I can't visualize how that is possible2012-02-22
  • 1
    Take your favorite orthogonal basis for your favorite vector space. Now, throw away one of the vectors in the basis. What's left is still a set of orthogonal vectors, but it's no longer a basis for your vector space. But maybe your definition of orthogonal set has more to it than just "set of orthogonal vectors"?2012-02-22
  • 0
Okay, I took $\mathbb{R}^2$ with $\hat{i}$ and $\hat{j}$; if I throw one away, then it wouldn't be a basis, and one vector can't be an orthogonal set, now can it? So I don't understand your argument.2012-02-22
  • 1
    Actually, one vector is an orthogonal set.2012-02-22
  • 1
    $\{(1,0,0) , (0,1,0)\}$ is an orthogonal set of vectors in $\mathbb{R}^3$, but it is not an orthogonal basis of $\mathbb{R}^3$.2012-02-22
  • 0
    Actually, the empty set is also an orthogonal set.2012-02-22
  • 0
@Jeff, can I generalize: for an orthogonal set $S = \{u_1, \dots, u_p\}$ in $\mathbb{R}^n$, it is a basis iff $p = n$, but it is not a basis if $1 < p < n$?2012-02-23
  • 0
Except for the zero vector: a set containing the zero vector isn't a basis for any subspace, so each $u_i$ should be nonzero.2012-02-23
  • 0
But am I right, though? My book justifies this by stating that it is a basis because the vectors in the set are linearly independent.2012-02-23
  • 1
    If you have an orthogonal set in ${\bf R}^n$, then it's not a basis for ${\bf R}^n$ if it has fewer than $n$ members, and it's not a basis for ${\bf R}^n$ if it contains the zero vector, and it is a basis for ${\bf R}^n$ if it has $n$ members and doesn't contain the zero vector.2012-02-23
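To make the answer and the comment thread concrete, here is a minimal numerical sketch (my own, using Python/NumPy, which the thread itself doesn't use); the vectors are the example $\{(1,0,0),(0,1,0)\}$ from the comments. It checks that the set is pairwise orthogonal and linearly independent, hence a basis for the plane it spans, but not a basis for $\mathbb{R}^3$.

```python
import numpy as np
from itertools import combinations

# The orthogonal set from the comments: two vectors in R^3.
S = [np.array([1.0, 0.0, 0.0]),
     np.array([0.0, 1.0, 0.0])]

# Pairwise orthogonality: every distinct pair has dot product 0.
print(all(np.dot(u, v) == 0 for u, v in combinations(S, 2)))   # True

# Linear independence: the rank equals the number of vectors,
# so S is a basis for the subspace it spans (a plane).
print(np.linalg.matrix_rank(np.column_stack(S)))               # 2

# But that rank is less than 3, so S does not span R^3
# and is not an orthogonal basis of R^3.
print(np.linalg.matrix_rank(np.column_stack(S)) == 3)          # False
```

Appending $(0,0,1)$ would raise the rank to $3$ and turn the set into an orthogonal basis of $\mathbb{R}^3$, matching the characterization in the last comment.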
2

The reason for the different terms is the same as the reason for the different terms "linearly independent set" and "basis".

Every linearly independent set is a basis for the subspace it spans. But when working in a larger space "basis" means "maximal linearly independent set" (not just spanning a subspace but spanning the whole thing).

An orthogonal set (without the zero vector) is automatically linearly independent. So we have "orthogonal sets" and then maximal ones are "orthogonal bases".

Note: In the end we're essentially just tacking on the adjective "orthogonal". We don't keep the words "linearly independent" in "orthogonal linearly independent set" because they're redundant.
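A minimal sketch of this distinction, assuming Python/NumPy and an illustrative pair of orthogonal (nonzero, non-unit) vectors in $\mathbb{R}^3$ chosen by me rather than taken from the thread: a nonzero orthogonal set is automatically linearly independent, and it is an orthogonal basis exactly when it also spans the whole space.

```python
import numpy as np

def is_orthogonal_set(vectors):
    """Every distinct pair has dot product zero (zero vectors allowed, as in Lay)."""
    return all(np.dot(vectors[i], vectors[j]) == 0
               for i in range(len(vectors)) for j in range(i + 1, len(vectors)))

def is_basis_of(vectors, dim):
    """Linearly independent and spanning: rank == len(vectors) == dim."""
    rank = np.linalg.matrix_rank(np.column_stack(vectors))
    return rank == len(vectors) == dim

# A nonzero orthogonal set in R^3 (illustrative choice, not from the thread).
S = [np.array([1.0, 1.0, 0.0]), np.array([1.0, -1.0, 0.0])]
print(is_orthogonal_set(S))        # True  -> orthogonal set
print(is_basis_of(S, 3))           # False -> not an orthogonal basis of R^3

# Enlarging it to a maximal orthogonal set gives an orthogonal basis.
S_max = S + [np.array([0.0, 0.0, 1.0])]
print(is_orthogonal_set(S_max))    # True
print(is_basis_of(S_max, 3))       # True  -> orthogonal basis of R^3
```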

1
  1. These two concepts are totally different.
  2. For "orthogonal set"$M$, we only have Bessel's inequality.
  3. But, if $M$ is orthogonal basis, then we get the Parseval's theorem. The key point is the completeness of this set M in your space. For example, in finite dimensional space $\mathcal{R}^3$, $\{i,j\}$ is an orthnormal set, but not an orthonormal basis. A common orthonormal basis is $\{i,j,k\}$.
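A small numerical illustration of points 2 and 3, again assuming Python/NumPy (the test vector $x$ is an arbitrary choice of mine): against the orthonormal set $\{i, j\}$ the sum of squared coefficients only satisfies Bessel's inequality, $\sum_k |\langle x, e_k\rangle|^2 \le \|x\|^2$, while against the orthonormal basis $\{i, j, k\}$ it attains equality, which is Parseval's theorem.

```python
import numpy as np

x = np.array([3.0, 4.0, 12.0])           # arbitrary test vector, ||x||^2 = 169

i, j, k = np.eye(3)                       # standard orthonormal vectors in R^3

def coeff_energy(x, family):
    """Sum of squared coefficients of x against an orthonormal family."""
    return sum(np.dot(x, e) ** 2 for e in family)

print(np.dot(x, x))                       # 169.0  = ||x||^2
print(coeff_energy(x, [i, j]))            # 25.0  <= 169   (Bessel's inequality)
print(coeff_energy(x, [i, j, k]))         # 169.0 == 169   (Parseval's theorem)
```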