
Suppose we are given the Euclidean space $\mathbb R^{n+m}$ with a decomposition $\mathbb R^{n+m} = V \oplus W$, which we do not expect to be orthogonal.

Let us describe the matrix $P$ that projects onto $V$ along $W$. Let $v_1, \dots, v_n$ be a basis for $V$ and let $w_1, \dots, w_m$ be a basis for $W$.

Let $A$ be the matrix that maps $e_1, \dots, e_{n}, e_{n+1}, \dots, e_{n+m}$ to $v_1, \dots, v_n, w_1, \dots, w_m$. Then $A^{-t} A^{-1}$ is the matrix of the scalar product with respect to which $v_1, \dots, v_n, w_1, \dots, w_m$ form an orthonormal basis.

We can now define the projectors

$P_j(u) = u - \dfrac{ w^t_j A^{-t} A^{-1} u }{w^t_j A^{-t} A^{-1} w_j } w_j$

for $1 \leq j \leq m$. Then the projection onto $V$ along $W$ is given by

$P = P_1 \circ \dots \circ P_m$

It is not clear whether this is a good scheme to implement numerically. The above method needs $m+2$ matrix-matrix multiplications, $m$ matrix-vector multiplications, $m$ scalar products, and $1$ matrix inversion. Still, $n$ is assumed to be small, and the matrix $A$ is assumed to be well-conditioned (i.e., not arbitrarily bad).
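As a sanity check, the scheme above can be sketched in a few lines of NumPy. This is a minimal sketch under assumed inputs: the dimensions and the random bases (`Vb`, `Wb`) are hypothetical, chosen only to make the example run.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 2                              # small example dimensions (assumed)
A = rng.standard_normal((n + m, n + m))  # columns: v_1..v_n, w_1..w_m (hypothetical bases)
Vb, Wb = A[:, :n], A[:, n:]              # bases of V and W

Ainv = np.linalg.inv(A)
G = Ainv.T @ Ainv                        # the scalar product A^{-t} A^{-1}

def project(u):
    """Apply P = P_1 ∘ ... ∘ P_m from the post, removing one w_j at a time."""
    for j in range(m):
        w = Wb[:, j]
        u = u - (w @ G @ u) / (w @ G @ w) * w
    return u

u = rng.standard_normal(n + m)
Pu = project(u)                          # Pu lies in V, u - Pu lies in W
```

One can check numerically that `project` is idempotent and that `Pu` is a linear combination of the columns of `Vb`, i.e. this really is the projection onto $V$ along $W$.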

I wonder whether we can do better. Is there a canonical way in numerical linear algebra to perform this task?

  • If you have a scalar product $\langle \cdot, \cdot \rangle$ for which $W$ is orthogonal to $V$ and an orthonormal basis $w_i$ for $W$, then the projection $v$ of $x$ onto $V$ is simply given by $$v = x - \sum_i \langle x, w_i \rangle \, w_i.$$ This involves only the evaluation of $m$ scalar products. Don't know whether one can do better (numerically). (2011-12-11)
  • I would actually use the Gram-Schmidt orthonormalization theorem... It is numerically very easy to implement. I assume that you can take the existence of an inner product for granted. (2011-12-11)
  • I think things got a bit mixed up here. It was my impression the original post asked how to efficiently calculate the projection. It also seemed to imply that the task of finding a scalar product which makes $V, W$ orthogonal was solved. If you then have an orthonormal basis for $W$ (which you get, of course, by Gram-Schmidt), you only need the matrix representing the scalar product and can rather efficiently calculate the projection using the formula I mentioned. In particular there is no need for matrix-matrix multiplications, _once_ the scalar product is known. (2011-12-11)
  • Actually, if I don't err, you do not even need to use Gram-Schmidt for $W$, as the new scalar product constructed in the original post was built precisely to serve that way. Of course, we can normalize the $w_i$ in advance with respect to that product, but that does not improve things substantially. (I think) (2011-12-11)

0 Answers