
Suppose we are given the Euclidean space $\mathbb R^{n+m}$ with a decomposition $\mathbb R^{n+m} = V \oplus W$, which we do not, however, expect to be orthogonal.

Let us describe the matrix $P$ that projects onto $V$ along $W$. Let $v_1, \dots, v_n$ be a basis for $V$ and let $w_1, \dots, w_m$ be a basis for $W$.

Let $A$ be the matrix that maps the standard basis vectors $e_1, \dots, e_{n+m}$ to $v_1, \dots, v_n, w_1, \dots, w_m$. Then $A^{-t} A^{-1}$ is the matrix of the scalar product with respect to which $v_1, \dots, v_n, w_1, \dots, w_m$ form an orthonormal basis.
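To illustrate the claim, here is a minimal NumPy sketch (not part of the original question; the random, hopefully well-conditioned matrix $A$ and the dimensions $n = 3$, $m = 2$ are assumptions for the example). Its columns play the role of $v_1, \dots, v_n, w_1, \dots, w_m$, and one checks that they are orthonormal with respect to the modified scalar product:

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 3, 2  # small example dimensions (assumption)

# A maps e_1, ..., e_{n+m} to the basis vectors,
# i.e. its columns are v_1, ..., v_n, w_1, ..., w_m.
A = rng.standard_normal((n + m, n + m))

Ainv = np.linalg.inv(A)
M = Ainv.T @ Ainv  # matrix of the modified scalar product <x, y>_M = x^T M y

# The columns of A are orthonormal with respect to <., .>_M:
# A^T (A^{-t} A^{-1}) A = (A^{-1} A)^T (A^{-1} A) = I.
G = A.T @ M @ A
assert np.allclose(G, np.eye(n + m))
```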

We can now define the projectors

$P_j(u) = u - \dfrac{ w^t_j A^{-t} A^{-1} u }{w^t_j A^{-t} A^{-1} w_j } w_j$

for $1 \leq j \leq m$. Then the projection onto $V$ along $W$ is given by

$P = P_1 \circ \dots \circ P_m$

It is not clear whether this is a good scheme to implement numerically. The above method needs $m+2$ matrix-matrix multiplications, $m$ matrix-vector multiplications, $m$ scalar products and $1$ matrix inversion. Still, $n$ is assumed to be small, and the matrix $A$ is assumed to be well-conditioned (i.e. not arbitrarily bad).
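The scheme above can be sketched directly; this is a hedged illustration, not a tuned implementation (the random matrix $A$ and the sizes $n = 3$, $m = 2$ are assumptions for the example). Since the $w_j$ are orthonormal with respect to the modified scalar product, the projectors $P_j$ may be applied in any order:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 2  # small illustrative dimensions (assumption)

# Columns of A: a basis v_1, ..., v_n for V followed by w_1, ..., w_m for W.
A = rng.standard_normal((n + m, n + m))
V = A[:, :n]
W = A[:, n:]

# M = A^{-t} A^{-1}: the scalar product in which the columns of A are orthonormal.
Ainv = np.linalg.inv(A)
M = Ainv.T @ Ainv

def project(u):
    """Apply P_1, ..., P_m in sequence: remove the w_j-components w.r.t. M."""
    for j in range(m):
        w = W[:, j]
        u = u - (w @ M @ u) / (w @ M @ w) * w
    return u

# Sanity checks: P fixes V and annihilates W.
for i in range(n):
    assert np.allclose(project(V[:, i]), V[:, i])
for j in range(m):
    assert np.allclose(project(W[:, j]), 0)
```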

I wonder whether we can do better. Is there a canonical way in numerical linear algebra to perform this task?
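For reference, the same projection can equivalently be computed coordinate-wise: expressing $u$ in the basis $v_1, \dots, v_n, w_1, \dots, w_m$ (one linear solve with $A$), zeroing the $W$-coordinates, and mapping back gives $P u = A \operatorname{diag}(I_n, 0) A^{-1} u$. This is only a sketch of that standard identity, not a claim about the best method; the random matrix $A$ and the sizes are again assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 2  # illustrative sizes (assumption)

# Columns of A: basis of V, then basis of W.
A = rng.standard_normal((n + m, n + m))

def project(u):
    """Projection onto V along W via one linear solve instead of an inversion."""
    c = np.linalg.solve(A, u)  # coordinates of u in the basis (v_i, w_j)
    c[n:] = 0.0                # drop the W-part
    return A @ c

# P fixes V and annihilates W.
assert np.allclose(project(A[:, 0]), A[:, 0])
assert np.allclose(project(A[:, n]), 0)
```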

  • Actually, if I don't err, one does not even need to use Gram-Schmidt for $W$, since the new scalar product constructed in the original post was constructed precisely to serve that purpose. Of course, we can normalize the $w_i$ in advance with respect to that product, but that does not improve things substantially. (I think) 2011-12-11

0 Answers