
Let $A \in \mathbb{O}(n)$. Is there a closed-form solution to the problem $$\arg \min_X \ \{\lVert X-A \rVert_F: X \in \mathbb{SO}(n) \}?$$

Simply speaking, I am looking to approximate a general orthogonal matrix by a rotation matrix.

A solution that comes to my mind is the following: compute the determinant of $A$ (which will be $\pm 1$). If it is $1$, there is nothing to do. Otherwise, pick one of the columns of $A$ and multiply it by $-1$.

Looking forward to your opinion.

  • This is a strange 'approximation' to want to make. The improper rotations with determinant $-1$ are, in a certain sense, very far away from the proper rotations, in that they lie in a disconnected piece of the group. (You have to perform a reflection, a discrete operation, to get there.) What do you want to use this for? (2017-01-24)

2 Answers


If $A\in SO(n)$, the minimiser is obviously $X=A$.

Suppose $A\in O(n)$ and $\det A=-1$. Then $\|X-A\|_F^2=\|A^T X-I\|_F^2=2n-2\operatorname{tr}(A^T X)$. For any skew-symmetric matrix $K$, define $$ f(t)=2n-2\operatorname{tr}(A^T Xe^{tK}),\quad t\in\mathbb R. $$ Since $Xe^{tK}$ also belongs to $SO(n)$, if $X$ is a minimiser, we must have $f'(0)=-2\operatorname{tr}(A^TXK)=0$ for every skew-symmetric matrix $K$. Since $\operatorname{tr}(SK)=0$ for all skew-symmetric $K$ exactly when $S$ is symmetric, $A^TX$ is symmetric.

Now $A^TX$ is a symmetric orthogonal matrix with determinant $-1$. So, an odd number of its eigenvalues are equal to $-1$ and the rest are equal to $1$. Therefore, $$ \|A^TX-I\|_F=\sqrt{\sum_i\left(\lambda_i(A^TX)-1\right)^2} $$ is minimised when exactly one eigenvalue of $A^TX$ is equal to $-1$ and the others are equal to $1$. In other words, $X=AV$ for some orthogonal matrix $V$ whose spectrum is $\{1,\ldots,1,-1\}$.
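To make the conclusion concrete, here is a quick numerical check (not part of the original answer; the matrix size and the choice of which column to negate are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random orthogonal A with det(A) = -1 via a QR factorisation.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
if np.linalg.det(Q) > 0:
    Q[:, 0] *= -1  # flip one column to force determinant -1
A = Q

# Candidate minimiser from the answer: X = AV with spectrum(V) = {1, ..., 1, -1},
# i.e. negate a single column of A.
X = A.copy()
X[:, 2] *= -1

# The difference is -2 times a unit column, so the Frobenius distance is 2.
print(np.linalg.norm(X - A, "fro"))
```

The printed distance is $2$, matching the minimum value stated in the comments below.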

  • Thanks for the explanation. So my guess (of changing the sign of one of the columns) is one of the solutions, where $V=\text{diag}(\{1,\ldots,-1,\ldots,1\})$ with the $-1$ sitting in the right position. (2017-01-25)
  • @kayencee Yes. The position of the $-1$ on the diagonal actually doesn't matter: negate any column of $A$ and you get the same minimum value (which is $2$). (2017-01-25)
  • Neat! (2017-01-25)
  • @user1551: Would it be possible to extend your argument to the problem $$\arg \min_X \ \{ \lVert X-A \rVert_F : X \in \mathbb{SO}(n) \}$$ where $A$ is an arbitrary $n \times n$ matrix? Is it possible to use the polar decomposition $A=RQ$, where $R$ is symmetric positive semidefinite and $Q$ is orthogonal? (2017-02-27)
  • @kayencee Yes, and the result is that if $A=USV^T$ is a singular value decomposition, then a global minimiser of $\|X-A\|_F$ over $SO(n)$ is given by $X=U\operatorname{diag}(1,\ldots,1,\det(UV^T))V^T$. In particular, if $\det(A)\ge0$ and $A=PQ$ is a polar decomposition, then $X=Q$ is a global minimiser. (2017-02-27)
  • @user1551: Is there some reference for these results? Thanks. (2017-02-28)
  • @kayencee This is just basic calculus + linear algebra. Some books on linear algebra might have explicitly formulated the result as an exercise, but I'm not aware of any of them. You may try the usual suspects, like Horn and Johnson's *Matrix Analysis* or Golub and Van Loan's *Matrix Computations*. (2017-03-01)
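The SVD recipe from the last comments can be sketched in a few lines (a minimal sketch we added; the function name `nearest_rotation` and the example matrix are ours):

```python
import numpy as np

def nearest_rotation(A):
    """Project A onto SO(n) in the Frobenius norm, using the formula
    X = U diag(1, ..., 1, det(U V^T)) V^T from the comments above."""
    U, _, Vt = np.linalg.svd(A)
    D = np.eye(A.shape[0])
    D[-1, -1] = np.linalg.det(U @ Vt)  # correct the sign if det(U V^T) = -1
    return U @ D @ Vt

# Example: an orthogonal A with det(A) = -1 is at Frobenius distance 2 from SO(n).
A = np.diag([-1.0, 1.0, 1.0])
X = nearest_rotation(A)
print(np.linalg.norm(X - A, "fro"))  # distance is 2
```

For a general $A$ with $\det(A)\ge 0$ this reduces to the orthogonal polar factor, as noted in the comment.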

Not an answer, but too long to format in a comment. Note that in the $2 \times 2$ case, all approximations can be equally bad. For instance, take $$ A = \pmatrix{-1&0\\0&1}. $$ Every element of $SO(2)$ is a rotation $X_\theta$ by some angle $\theta$, and we find $$ \|A - X_\theta \|_F^2 = (\cos \theta + 1)^2 + (\cos \theta - 1)^2 + 2 \sin^2 \theta = 2 + 2 \cos^2\theta + 2\sin^2 \theta = 4. $$
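This computation can also be checked numerically (an illustrative sketch we added; the angle grid is arbitrary):

```python
import numpy as np

# For A = diag(-1, 1), every rotation X_theta should be at Frobenius distance 2.
A = np.array([[-1.0, 0.0], [0.0, 1.0]])

dists = []
for theta in np.linspace(0.0, 2.0 * np.pi, 9):
    c, s = np.cos(theta), np.sin(theta)
    X = np.array([[c, -s], [s, c]])  # rotation by theta
    dists.append(np.linalg.norm(A - X, "fro"))

print(dists)  # every entry equals 2, i.e. squared distance 4
```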