I know that for the vector space $\mathbb R^n$ one can use the Gram-Schmidt process to construct an orthonormal basis. But what if the vector space is over some arbitrary field? I am thinking of the following:

  1. Pick an arbitrary nonzero vector in $V$ and label it $v_1$.
  2. Pick another arbitrary vector in $V$ and subtract from it its component along $v_1$. If this gives the zero vector, try again with another arbitrary vector; otherwise take the result as $v_2$.
  3. Repeat the above until we have found $n$ linearly independent vectors (given that $\dim V = n < \infty$; otherwise the process goes on forever).

(Basically Gram-Schmidt.)
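For concreteness, here is a minimal sketch of this process in Python over $\mathbb R^3$, where the standard dot product is available (the function names are my own illustration, not any library's API). The projection step in the inner loop is exactly the part that presupposes an inner product:

```python
# Hypothetical sketch: the proposed procedure over R^3, where the
# standard dot product exists. All names here are illustrative.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def greedy_gram_schmidt(candidates):
    """Build an orthogonal set from a stream of candidate vectors."""
    basis = []
    for v in candidates:
        w = list(v)
        for b in basis:
            # Step 2: subtract the component of w along b. This projection
            # coefficient is what requires an inner product on V.
            c = dot(w, b) / dot(b, b)
            w = [wi - c * bi for wi, bi in zip(w, b)]
        if any(abs(x) > 1e-12 for x in w):  # nonzero residual: keep it
            basis.append(w)
    return basis

# The third vector lies in the span of the first two, so it is rejected.
print(greedy_gram_schmidt([[1, 1, 0], [1, 0, 0], [2, 1, 0], [0, 0, 1]]))
```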

This doesn't seem like a particularly efficient algorithm, especially for large $n$. Are there any better suggestions? Also, I am not sure that my steps are even valid: is the scalar product (which extracts the component of an arbitrary vector along a $v_i$ already in the set) defined for vector spaces over arbitrary fields?

Thanks.

  • @Hans: The two are related in that "usually" the easiest way to tell whether the vectors are linearly independent is to make them orthonormal. (2011-09-10)

2 Answers


You don't need Gram-Schmidt. Start with a generating set, or keep adding vectors to a set, and use Gaussian elimination to remove linear dependences.
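As a hedged illustration of this answer, here is a sketch in Python over a finite field $\mathrm{GF}(p)$ with $p$ prime (my own toy code, not a library routine): each incoming vector is reduced against the pivots found so far and kept only if it contributes a new pivot. Exact field arithmetic replaces the inner product entirely.

```python
# Sketch: extract a basis from a generating set over GF(p), p prime,
# by incremental Gaussian elimination. Illustrative only.
def basis_from_generators(vectors, p):
    """Return the subset of `vectors` forming a basis of their span over GF(p)."""
    pivots = {}  # pivot column -> normalized reduced row
    basis = []
    for v in vectors:
        row = [x % p for x in v]
        for col, r in pivots.items():       # eliminate existing pivot columns
            c = row[col]
            if c:
                row = [(a - c * b) % p for a, b in zip(row, r)]
        lead = next((i for i, x in enumerate(row) if x), None)
        if lead is not None:                # a new pivot: v is independent
            inv = pow(row[lead], p - 2, p)  # inverse by Fermat's little theorem
            pivots[lead] = [(x * inv) % p for x in row]
            basis.append(v)
    return basis

# Over GF(2) the third vector is the sum of the first two, so it is dropped.
print(basis_from_generators([[1, 1, 0], [0, 1, 1], [1, 0, 1], [0, 0, 1]], 2))
```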


Choose $x_1\neq 0$; a single nonzero vector is linearly independent. If $x_1$ generates $V$, then $\{x_1\}$ is a basis and you're done.

If it is not a basis, there is an $x_2\in V\setminus\langle x_1\rangle$, and $x_1,x_2$ are l.i. (adding a vector from outside the span of a linearly independent set keeps the set linearly independent). If $x_1,x_2$ generate $V$, you have your basis, ...

...

If not, choose an $x_{n+1}\in V\setminus\langle x_1,\dots,x_n\rangle$; then $x_1,\dots,x_{n+1}$ are l.i., and if $x_1,\dots,x_{n+1}$ generate $V$, you have your basis.

If the dimension is finite, the algorithm must stop.

By $\langle S\rangle$ I mean the subspace generated by $S$.
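To make the procedure concrete, here is a brute-force sketch in Python for $V=\mathbb F_2^n$, where $\langle x_1,\dots,x_k\rangle$ is small enough to enumerate outright (purely illustrative; an efficient version would test span membership by elimination as in the other answer).

```python
# Brute-force sketch of the greedy basis extension over GF(2), V = F_2^n.
# The span test enumerates all 2^k combinations, so this shows the argument,
# not an efficient method.
from itertools import product

def span(vectors, n):
    """All GF(2)-linear combinations of `vectors` inside F_2^n."""
    combos = set()
    for coeffs in product([0, 1], repeat=len(vectors)):
        v = [0] * n
        for c, x in zip(coeffs, vectors):
            if c:
                v = [(a + b) % 2 for a, b in zip(v, x)]
        combos.add(tuple(v))
    return combos

def greedy_basis(n):
    basis = []
    all_vectors = list(product([0, 1], repeat=n))
    while True:
        spanned = span(basis, n)
        outside = [v for v in all_vectors if v not in spanned]
        if not outside:           # <x_1, ..., x_k> = V: done
            return basis
        basis.append(outside[0])  # any vector outside the span stays l.i.

print(greedy_basis(3))  # three linearly independent vectors spanning F_2^3
```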