
Let $v$ be a vector in $R^n$, and for $i = 1, \ldots, k$ (with $k < n$) let $n_i$ be the normal vectors of hyperplanes through the origin in $R^n$. The problem is to compute the orthogonal projection of $v$ onto the intersection of all those hyperplanes.

I'm looking for an $n \times n$ matrix $M$ such that $vM$ is that projection (with $v$ treated as a row vector).

Here are three examples: $$n=3,\ k=2,\ n_1=(1,0,0),\ n_2=(0,1,0) \Rightarrow p(1,1,1) = (0,0,1)$$ $$n=3,\ k=2,\ n_1=(1,0,-1),\ n_2=(0,1,-1) \Rightarrow p(0,0,1)=\left({1 \over 3},{1 \over 3},{1 \over 3}\right)$$ $$n=3,\ k=1,\ n_1=(0,0,1) \Rightarrow p(1,1,1) = (1,1,0)$$
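For concreteness, in the first example one matrix with the desired property would be

$$ M=\begin{pmatrix}0&0&0\\0&0&0\\0&0&1\end{pmatrix},\qquad (1,1,1)\,M=(0,0,1). $$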

  • What do you mean by image of $V$? $V$ is a vector. What does $\times$ mean? (2012-12-12)
  • @copper.hat I removed the $\times$; I just meant ordinary matrix multiplication. Here is an example: let $n=3$, $V=(1,1,1)$, $k=2$, $N_1=(1,0,0)$ and $N_2=(0,1,0)$; in this case $image(V)=(0,0,1)$. (2012-12-12)
  • Your example seems to suggest that by "image of $v$ over $I$" you mean the orthogonal projection of $v$ onto $I$? By the way, mathematical variables are case-sensitive. It seems that you wanted $v$ and $V$ to refer to the same variable, but they don't. Also, vectors are usually denoted by lower-case variables. (2012-12-12)
  • @joriki I fixed the case problem. Besides, that was only an example; I want to solve the problem in the general case, meaning $n_i$ can be any vector in $R^n$. Here is another example: $n=3$, $v=(0,0,1)$, $k=2$, $n_1=(1,0,-1)$, $n_2=(0,1,-1)$ $\Rightarrow$ $image(v) = (0.57,0.57,0.57)$. (2012-12-12)
  • You didn't answer my question, but from this new example it seems that you *don't* mean the orthogonal projection of $v$ onto the intersection of the hyperplanes (which would be $(1/3,1/3,1/3)$ in that case). That brings us back to @copper.hat's question: What *do* you mean by the image of $v$? (Please give a definition, not another example.) (2012-12-12)
  • @joriki It seems I was looking for the orthogonal projection after all; I was just not familiar with the term. (2012-12-12)
  • @Gajoo: It would make things so much less cumbersome if you'd reply to comments comprehensively, rather than one aspect at a time, leaving other aspects unresolved. Do you disagree with my assertion that the orthogonal projection in your second example would have been $(1/3,1/3,1/3)$? If so, why? If not, how is that compatible with your statement that you're looking for the orthogonal projection after all? (2012-12-12)
  • @joriki I thought my mistake (in computing the projection vector) was obvious, and it makes your result the correct one. (2012-12-12)
  • If you are describing hyperplanes by their normal vectors and you then want to consider vectors that lie within those hyperplanes, you must actually want the vectors that are orthogonal to the original (normal) vectors. Perhaps your question should be worded as: the component of a vector $v$ that is orthogonal to a set of vectors $n_i$. Such a question could easily be answered here. (2012-12-12)

2 Answers

3

Orthonormalize the normal vectors (e.g. using Gram–Schmidt), then project them out. In vector notation, projecting out a normal vector $n_i$ from a vector $v$ yields

$$ v\to v-(v\cdot n_i)n_i\;; $$

in matrix notation, this is

$$ v\to\left(I-n_in_i^\top\right)v\;; $$

and the matrix operation for projecting out all $k$ orthonormal vectors in one go is

$$ v\to\left(I-\sum_in_in_i^\top\right)v\;. $$

Note that this only works for orthonormal normal vectors $n_i$; you can't apply it directly to your second example, in which the normal vectors aren't orthonormal.

P.S.: It's been pointed out that the question uses a row vector convention. To transform the answer to that convention, simply transpose everything and toggle the transposition markers, which yields

$$ v\to v\left(I-\sum_in_i^\top n_i\right)\;. $$
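For concreteness, here is a minimal NumPy sketch of this recipe in the question's row-vector convention; the function name `projection_matrix`, the modified Gram–Schmidt loop, and the tolerance for discarding linearly dependent normals are my own choices, not part of the answer.

```python
import numpy as np

def projection_matrix(normals, tol=1e-12):
    """Return M such that v @ M is the orthogonal projection of the row
    vector v onto the intersection of the hyperplanes whose normal
    vectors are the rows of `normals`."""
    normals = np.atleast_2d(np.asarray(normals, dtype=float))
    n = normals.shape[1]
    basis = []                        # orthonormalized normal vectors
    for a in normals:                 # modified Gram-Schmidt
        for q in basis:
            a = a - (a @ q) * q       # remove the component along q
        norm = np.linalg.norm(a)
        if norm > tol:                # drop (numerically) dependent normals
            basis.append(a / norm)
    M = np.eye(n)
    for q in basis:
        M -= np.outer(q, q)           # M = I - sum_i q_i^T q_i (row convention)
    return M

# Checks against the examples in the question:
M = projection_matrix([(1, 0, 0), (0, 1, 0)])
print(np.array([1, 1, 1]) @ M)        # -> [0. 0. 1.]

M = projection_matrix([(1, 0, -1), (0, 1, -1)])
print(np.array([0, 0, 1]) @ M)        # -> [0.3333... 0.3333... 0.3333...]
```

Since the resulting $M$ is symmetric, the same matrix works regardless of whether you multiply it on the left or on the right.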

  • Do you work with row vectors as convention over column vectors? Just curious from looking at your last edit. (2012-12-13)
  • @adam: I'm not sure what "row vectors as convention over column vectors" means. As far as I'm aware, vectors are usually taken to be column vectors, and row vectors are written as transposed column vectors; that's the convention I'm using here. The edit was just because I had it wrong in the first version; I never intended to use any convention other than that. (2012-12-13)
  • Of course it's just a convention that we use column vectors; I happen to think of vectors as rows myself. I often find myself editing after the fact just to conform to the convention I know others expect. (2012-12-13)
1

If you are looking for a software solution, you may use the singular value decomposition (SVD), which has been implemented in a vast number of software libraries. See the "external links" section of the Wikipedia page on SVD.

(As the row-vector convention is used in the question, all vectors in the sequel are row vectors; column vectors are represented as vector transposes.) Let $A=(n_1^T,\,\ldots,\,n_k^T)$, i.e. $A$ is the $n\times k$ matrix whose columns are your $k$ normal vectors. Now, if $w$ lies in the intersection of the hyperplanes normal to the $n_i$, we have $w\cdot n_i=0$ for every $i$; in other words,
$$ wA=0, $$
where "$0$" means the zero $k$-vector $(0,0,\ldots,0)$. Now perform an SVD on $A$. This means
$$ A = \underbrace{U}_{n\times n} \underbrace{\begin{pmatrix} \sigma_1&&&0&\ldots&0\\ &\ddots&&\vdots&&\vdots\\ &&\sigma_r&\vdots&&\vdots\\ 0&\ldots&0&0&\ldots&0\\ \vdots&&\vdots&\vdots&&\vdots\\ 0&\ldots&0&0&\ldots&0\\ \end{pmatrix}}_{n\times k} \underbrace{V^T}_{k\times k} $$
for some $n\times n$ real orthogonal matrix $U$, some $k\times k$ real orthogonal matrix $V$, and some $\sigma_1\ge\ldots\ge\sigma_r>0$, where $r=\mathrm{rank}(A)$ is the number of linearly independent vectors among $n_1, n_2, \ldots, n_k$. (Most computer implementations of SVD will decompose $A$ in this way, but you should certainly read the relevant documentation carefully.)

Now, let the $n$ columns of $U$ be $u_1^T, \ldots, u_n^T$. Since they are orthonormal, the equation $wA=0$ implies that $w\cdot u_i=0$ whenever $i\le r$. In other words, $w$ lies in the span of $u_{r+1}, u_{r+2}, \ldots, u_n$. So the whole problem reduces to finding the orthogonal projection of a vector $v$ onto the subspace spanned by $u_{r+1}, u_{r+2}, \ldots, u_n$, and the answer is
$$ v\leftarrow \sum_{i=r+1}^n (v\cdot u_i)u_i, $$
or equivalently,
$$ v\leftarrow v - \sum_{i=1}^r (v\cdot u_i)u_i. $$
Depending on the actual value of $r$, one of these two formulations will be more efficient than the other.
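As a rough illustration of this approach, here is a NumPy sketch that uses `np.linalg.svd` as the canned SVD routine; the function name `project_onto_intersection` and the rank tolerance are my own choices rather than anything prescribed in the answer.

```python
import numpy as np

def project_onto_intersection(v, normals, tol=1e-12):
    """Orthogonally project the row vector v onto the intersection of the
    hyperplanes whose normal vectors are the rows of `normals`, via SVD."""
    v = np.asarray(v, dtype=float)
    A = np.asarray(normals, dtype=float).T   # A = (n_1^T, ..., n_k^T), an n x k matrix
    U, sigma, Vt = np.linalg.svd(A)          # A = U * diag(sigma) * Vt
    r = int(np.sum(sigma > tol))             # numerical rank r of A
    # The first r columns u_1, ..., u_r of U span the space of the normals;
    # subtracting that part of v leaves the projection onto the intersection.
    return v - sum((v @ U[:, i]) * U[:, i] for i in range(r))

# Second example from the question:
print(project_onto_intersection([0, 0, 1], [(1, 0, -1), (0, 1, -1)]))
# -> [0.3333... 0.3333... 0.3333...]
```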

  • Why do an SVD if you can do Gram–Schmidt? I would have thought that the SVD is computationally more expensive. (2012-12-13)
  • @joriki You are absolutely right, but my point is about using a canned solution to tackle the problem. *If* the OP is really looking for a software solution and we tell him to use Gram–Schmidt, we must explain how to implement the *modified* Gram–Schmidt process correctly, otherwise numerical stability issues may pop up. While SVD is considerably slower than modified Gram–Schmidt (something like $O(n^3)$ vs $O(nk^2)$), there are many high-quality implementations around. But certainly, if the OP is comfortable implementing a numerical algorithm, modified Gram–Schmidt is the way to go. (2012-12-13)
  • I see, thanks for the explanation. (2012-12-13)