
I have several 3D vectors $X_{i}$ which lie approximately in a plane. Now I need to find a single vector which is normal (as much as possible) to all of them.

For two vectors, I can use the cross product to obtain such a vector.

For multiple vectors, I found this formula:

$\left( \sum_{i=1}^{n} X_{i}X_{i}^{\textbf{T}} \right)u=0$

Here $u$ is the desired vector (null vector of a covariance matrix).

Question: What is the reason behind this formula?

2 Answers


The formula you found will have no nonzero solution unless all the $X_i$ actually lie in the same plane! Instead, you should find the $u$ that minimizes $u^T\big(\sum\limits_{i=1}^n X_i X_i^T\big)u$ over unit vectors $u$. This is the least-squares solution to requiring $X_i^Tu = 0$ for all $X_i$ (I can elaborate on this if you want). The desired $u$ is the eigenvector corresponding to the smallest eigenvalue of $\sum\limits_{i=1}^n X_i X_i^T$.
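As a concrete illustration, here is a minimal numpy sketch of the eigenvector approach, using synthetic data: points scattered roughly in the $xy$-plane with small noise in $z$ (all of the data and names here are assumptions for the example, not part of the question):

```python
import numpy as np

rng = np.random.default_rng(0)
# 100 synthetic vectors lying approximately in the xy-plane:
# large spread in x and y, tiny noise in z (assumed test data).
X = rng.normal(size=(100, 3)) * np.array([1.0, 1.0, 0.01])

# M = sum_i X_i X_i^T, a symmetric 3x3 matrix.
M = X.T @ X

# eigh returns eigenvalues in ascending order, so the first
# eigenvector corresponds to the smallest eigenvalue.
w, V = np.linalg.eigh(M)
u = V[:, 0]  # best-fit normal (up to sign), here close to (0, 0, 1)
```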

Define the $3\times n$ matrix $X$ whose columns are the vectors $X_i$. Then $\sum\limits_{i=1}^n X_iX_i^T=XX^T$, so we're trying to minimize $u^TXX^Tu=\lVert X^Tu\rVert^2$ for a unit vector $u\in\mathbb R^3$. The solution is the eigenvector of $XX^T$ corresponding to the smallest eigenvalue; equivalently, the right singular vector of $X^T$ ($=$ the left singular vector of $X$) corresponding to the smallest singular value. Finding singular vectors of $X$ is more numerically stable than computing $XX^T$ and finding its eigenvectors, so using the SVD is recommended.
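The SVD variant above can be sketched the same way; the only assumption beyond the answer is the synthetic noisy-planar data used to exercise it:

```python
import numpy as np

rng = np.random.default_rng(0)
# Columns of X are the vectors X_i: 3 x n, approximately in the xy-plane.
X = (rng.normal(size=(3, 100)) * np.array([[1.0], [1.0], [0.01]]))

# np.linalg.svd returns singular values in descending order, so the
# last column of U is the left singular vector for the smallest
# singular value -- the desired normal, computed without forming XX^T.
U, s, Vt = np.linalg.svd(X)
u = U[:, -1]

# ||X^T u|| equals the smallest singular value: the least-squares residual.
residual = np.linalg.norm(X.T @ u)
```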

  • Thanks. Both methods work as you said, so I have chosen the one you provided as the more numerically stable. Just if you are curious: I have used the method to automatically straighten panoramic image mosaics like [this one](http://libca.cz/pub/libor/straightening.jpg). – 2013-01-30

You can think of the eigenvectors of the covariance matrix as spanning the space of the data: the eigenvectors with large eigenvalues span the plane the $X_i$ approximately lie in, so you choose the eigenvector with the smallest eigenvalue, which is orthogonal to that plane.

The method is called Principal Component Analysis (PCA), and Wikipedia has a bit more about the derivation.