
Here is an interesting problem in linear algebra. It looks very simple, but it is not so easy for me to solve.

Let $r_i$, $i=1,\dots,n$, be unit vectors in $\mathbb{R}^n$. Find a unit vector $x$ minimizing $\sum_i \| r_i\times x \|^2$.

Remark: if we let $\theta_i$ be the angle between $r_i$ and $x$, then $\sum_i \| r_i\times x \|^2 = \sum_i \sin^2 \theta_i$. But I would rather avoid trigonometric functions; I think they make the problem more complicated, especially in high dimensions. Is it possible to solve the problem using linear algebra or matrix analysis?

Thank you very much.

Shiyu

  • 1
What is the cross product in $\mathbb{R}^n$? Does $x=0$ solve your problem? – 2011-02-16
  • 0
The $\times$ means cross product. See http://en.wikipedia.org/wiki/Cross_product. – 2011-02-16
  • 0
I forgot to add a constraint that $x$ is a unit vector. – 2011-02-16
  • 1
@Thomas: @Theo: OP has now called for $x$ to be a unit vector, so $x=0$ won't work, probably because of your comments. I think the best way to view the request is $\sum \| r_i\times x \|^2 = \sum \sin^2 \theta_i = \sum (1-\cos^2 \theta_i) = \sum (1-(r_i\cdot x)^2)$. – 2011-02-16
  • 0
@Shiyu: but the cross product is difficult to define in dimensions other than 3. This is essentially because a $3\times 3$ antisymmetric matrix has 3 independent components, but a $4\times 4$ has 6, not 4. The general case is $n(n-1)/2$. That is why I went to the dot product, which is well defined. – 2011-02-16
  • 0
@Ross: thanks, the two problems are equivalent. – 2011-02-16
  • 0
Thanks @Ross: The edit happened just before I left my comment; that's why I removed it right afterwards. I think your interpretation sounds correct, but I think it shouldn't be phrased in terms of the cross product. @Shiyu: I don't see how they should be equivalent even if interpreted in terms of the generalization alluded to by Ross. – 2011-02-16
  • 0
@Calle @Thomas: do you mean the cross product has no geometric meaning when $n>3$? Ross gives an equivalent problem using the inner product. Maybe it is better to use the inner product to describe the problem. – 2011-02-16
  • 0
@Shiyu: lower on the Wikipedia page you cite there is a short section on the cross product as an exterior product, which links to the exterior product. That will give you some idea of what goes on in other dimensions. It does have a meaning, but it is an $(n-2)$-dimensional object. – 2011-02-16
  • 0
@Theo: Maybe it is better to show the original problem. In fact, I am trying to find the minimum eigenvalue of the matrix $A=\sum (Id-r_i r^T_i)$. As $\lambda_{\min} = \min_{\|x\|=1} x^T A x$, I am trying to minimize $x^T A x$ over unit vectors $x$ to get the minimum eigenvalue. In addition, I think it is straightforward to verify $\sum \| r_i\times x \|^2 = \sum (1-(r_i\cdot x)^2)$. – 2011-02-16
  • 0
Do you have any conditions on $r_i$, $i = 1, \dots, n$? Are they linearly independent? Orthogonal? – 2011-02-16
  • 0
@Calle: No. $r_i$ can be arbitrary unit vectors with $\|r_i\|=1$. It seems that the minimum eigenvalue of $A$ cannot be determined analytically if the $r_i$ are arbitrary, even though the matrix $A$ has a very special form, right? – 2011-02-16
  • 0
@Shiyu: Yeah. The eigenvalues of $I - r_ir_i^T$ are easy to determine, but the whole sum seems trickier. – 2011-02-16
  • 0
@Calle: Yes, you are right. $Id-r_i r_i^T$ is in fact an orthogonal projection matrix of rank $n-1$: its eigenvalues are 1 (with multiplicity $n-1$) and 0. But a sum of such orthogonal projection matrices is in general only positive semidefinite. – 2011-02-16
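The identities discussed in the comments are easy to check numerically. The sketch below (NumPy; the random unit vectors are purely illustrative) verifies, in $\mathbb{R}^3$ where the cross product is defined, that $\sum_i \|r_i \times x\|^2 = \sum_i \bigl(1-(r_i\cdot x)^2\bigr)$, and that the minimum over unit vectors $x$ equals the smallest eigenvalue of $A=\sum_i (I-r_i r_i^T)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3  # the cross product is only defined in R^3
r = rng.normal(size=(n, n))
r /= np.linalg.norm(r, axis=1, keepdims=True)  # rows are unit vectors r_1..r_n

x = rng.normal(size=n)
x /= np.linalg.norm(x)  # a random unit vector x

# Identity: sum_i ||r_i x x||^2 == sum_i (1 - (r_i . x)^2)
lhs = sum(np.linalg.norm(np.cross(ri, x)) ** 2 for ri in r)
rhs = sum(1.0 - np.dot(ri, x) ** 2 for ri in r)
assert np.isclose(lhs, rhs)

# The minimum over unit x is the smallest eigenvalue of A = sum_i (I - r_i r_i^T)
A = sum(np.eye(n) - np.outer(ri, ri) for ri in r)
lam_min = np.linalg.eigvalsh(A)[0]           # eigenvalues in ascending order
x_opt = np.linalg.eigh(A)[1][:, 0]           # eigenvector for lam_min
val = sum(np.linalg.norm(np.cross(ri, x_opt)) ** 2 for ri in r)
assert np.isclose(val, lam_min)
assert lam_min <= lhs + 1e-9                 # no worse than the random x
```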

1 Answer


You want to maximize $\sum_i (r_i \cdot x)^2$ over unit vectors $x$. To get rid of the constraint, write this as the Rayleigh quotient $$\frac{\sum_i (r_i \cdot x)^2}{|x|^2}=\frac{\sum_i (r_i \cdot x)^2}{\sum_j x_j^2}=\frac{\sum_{i} \bigl(\sum_j r_{ij} x_j\bigr)^2}{\sum_j x_j^2},$$ where $r_{ij}$ is the $j^{th}$ component of $r_i$. Now you can differentiate with respect to $x_j$ and set the result to zero without any trig functions getting in the way.
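Concretely, the ratio above is the Rayleigh quotient of $B=\sum_i r_i r_i^T$, so its maximum over nonzero $x$ is the largest eigenvalue of $B$, attained at the corresponding eigenvector. A minimal numerical sketch (NumPy; random unit vectors, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
r = rng.normal(size=(n, n))
r /= np.linalg.norm(r, axis=1, keepdims=True)  # rows are unit vectors r_i

B = r.T @ r                 # B = sum_i r_i r_i^T
w, V = np.linalg.eigh(B)    # eigenvalues in ascending order
x_star = V[:, -1]           # eigenvector of the largest eigenvalue

def rayleigh(x):
    # sum_i (r_i . x)^2 / |x|^2
    return (r @ x) @ (r @ x) / (x @ x)

# x_star attains the maximum, which equals the largest eigenvalue of B
assert np.isclose(rayleigh(x_star), w[-1])

# no other vector does better
for _ in range(100):
    x = rng.normal(size=n)
    assert rayleigh(x) <= w[-1] + 1e-9
```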

  • 0
Thanks. In fact, we can directly compute the derivative of ${\sum (r^T_i x )^2}/{x^T x}$ with respect to the vector $x$. Setting the derivative to zero gives $(x^T x)\sum (r^T_i x) r_i - \sum (r^T_i x)^2\, x = 0$. – 2011-02-16
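This stationarity condition reads $(x^T x)\,Bx = (x^T B x)\,x$ with $B=\sum_i r_i r_i^T$, i.e. every stationary point is an eigenvector of $B$. A quick check (NumPy; random unit vectors, for illustration only) that the eigenvectors of $B$ satisfy it:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
r = rng.normal(size=(n, n))
r /= np.linalg.norm(r, axis=1, keepdims=True)  # rows are unit vectors r_i

B = r.T @ r                # B = sum_i r_i r_i^T
w, V = np.linalg.eigh(B)

for k in range(n):
    x = V[:, k]            # each eigenvector is a stationary point
    # residual of (x^T x) sum_i (r_i^T x) r_i - sum_i (r_i^T x)^2 x = 0,
    # using sum_i (r_i^T x) r_i = B x and sum_i (r_i^T x)^2 = x^T B x
    residual = (x @ x) * (B @ x) - (x @ B @ x) * x
    assert np.allclose(residual, 0)
```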