
I am studying the book Linear Algebra by Hoffman and Kunze. The authors make the following comment on page 281.

"Although it is of limited practical use for computations, it is interesting to note that the Gram-Schmidt process may also be used to test for linear dependence."

I have two questions about that comment:

1) Why should I study the Gram-Schmidt orthogonalization process?

2) Is there an example where the Gram-Schmidt orthogonalization process makes it easier to prove that a set of vectors is linearly dependent than some other method would? I have never proved that a set of vectors was linearly independent by using the Gram-Schmidt orthogonalization process.

Maybe I did not understand what they are saying.

  • I suppose the point being made by the authors is that if you are in an $n$-dimensional inner product space, and you apply Gram-Schmidt to a set of $n$ vectors which is not linearly independent to start with, you will inevitably obtain the zero vector at some stage of the Gram-Schmidt process, instead of returning an orthonormal basis (see the sketch below). (2012-02-12)
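For concreteness, here is a minimal sketch of the test this comment describes, written in Python with NumPy; the function name, the tolerance, and the example vectors are my own choices, not anything from the book:

```python
import numpy as np

def gram_schmidt_dependence_test(vectors, tol=1e-12):
    """Run Gram-Schmidt; report dependence if some residual vanishes."""
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        # Subtract the projection of v onto each orthonormal vector so far.
        for b in basis:
            w = w - np.dot(w, b) * b
        if np.linalg.norm(w) < tol:
            # v was (numerically) a combination of the earlier vectors.
            return True
        basis.append(w / np.linalg.norm(w))
    return False

# v3 = v1 + v2, so this set is linearly dependent.
print(gram_schmidt_dependence_test([[1, 0, 0], [0, 1, 0], [1, 1, 0]]))  # True
```

In exact arithmetic the third residual is exactly zero; in floating point one has to settle for a tolerance, which is one reason the book calls the test of limited practical use.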

2 Answers


The Gram-Schmidt orthogonalization process is a great thing to learn about because the idea behind it shows up again and again, and there are many practical algorithms that use what is essentially a Gram-Schmidt procedure. For example, suppose you have a set $\{v_1, \dots, v_n\}$ of linearly independent vectors (or functions), and you want to approximate another vector (or function) $w$ as a linear combination of the vectors in your set. This can be done by linear regression -- i.e., project $w$ onto the space spanned by the set.

An alternative method, which is useful when you have a huge collection of functions in your set, is called "matching pursuit." Here you project $w$ onto the span of the vector $v_i$ that is most correlated with $w$, subtract off that component, then project what remains onto the next most correlated vector from your set, subtract, and so on. This process of projecting and subtracting, over and over, is just like Gram-Schmidt, and it is the basic principle behind many practical algorithms for time-frequency decompositions, time-scale (wavelet) decompositions, etc. In other words, there are many "Gram-Schmidt-like" procedures, so you would do well to learn the original!
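
A minimal sketch of that projecting-and-subtracting loop, in Python with NumPy; I am assuming the atoms in the dictionary are unit-norm, and the function name and toy data are mine:

```python
import numpy as np

def matching_pursuit(w, atoms, n_iter=3):
    """Greedily approximate w: project the residual onto the most
    correlated (unit-norm) atom, subtract, and repeat."""
    residual = np.asarray(w, dtype=float).copy()
    approx = np.zeros_like(residual)
    for _ in range(n_iter):
        # Pick the atom most correlated with what is left of w.
        i = int(np.argmax([abs(np.dot(residual, a)) for a in atoms]))
        coeff = np.dot(residual, atoms[i])
        approx += coeff * atoms[i]    # keep the projection
        residual -= coeff * atoms[i]  # subtract it off, as in Gram-Schmidt
    return approx, residual

# Toy dictionary of unit vectors in the plane.
atoms = [np.array([1.0, 0.0]), np.array([0.0, 1.0]),
         np.array([1.0, 1.0]) / np.sqrt(2)]
approx, residual = matching_pursuit([2.0, 0.5], atoms, n_iter=2)
print(approx, residual)  # [2. 0.5] [0. 0.]
```

Unlike Gram-Schmidt, the same atom may be selected more than once (the atoms need not be orthogonal), but each step is exactly a projection followed by a subtraction.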


The main thing I have seen Gram-Schmidt used for is simply to guarantee the existence of an orthonormal basis, which makes the basis very nice to work with.

Look through the chapter on inner product spaces: almost all of the proofs start off with "let $\alpha_1, \dots, \alpha_n$ be an orthonormal basis." Such a basis is really nice because, if $(\cdot \mid \cdot)$ is your inner product, then $(\alpha_i \mid \alpha_j) = \delta_{ij} = \begin{cases} 0 & \text{ if } i \neq j\\ 1 & \text{ if } i = j \end{cases}$
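
As a quick numerical check of that identity, here is a sketch in Python with NumPy; I use the QR factorization, which produces an orthonormal basis for the column space just as Gram-Schmidt would (the random matrix and seed are arbitrary):

```python
import numpy as np

# Orthonormalize the columns of a random 4x4 matrix via QR.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
Q, _ = np.linalg.qr(A)  # columns of Q form an orthonormal basis

# The Gram matrix (alpha_i | alpha_j) is the identity: delta_ij.
print(np.allclose(Q.T @ Q, np.eye(4)))  # True
```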

  • One _does not need_ Gram-Schmidt for the existence of an orthonormal basis. Just choose any nonzero vector, normalize it, and limit your further choices to the orthogonal complement of the span of the vectors chosen so far, continuing until you reach the dimension of the space. (2015-03-29)