
Let $x,x' \in \mathbb{R}^n$. If $|x| = |x'|$ there exists an orthogonal matrix $A$ such that $x' = Ax$ ($|\cdot|$ denotes the usual euclidean norm).

I think it is pictorially clear. However, I do not quite see the existence in terms of a explicit matrix. Is there an easy way to find this matrix $A$? I mean in $2$ dimensions it is pretty clear. Also if the construction is too complicated, is there a nice theorem in linear algebra which implies this?

  • How about a rotation matrix? – 2017-01-28

2 Answers


Because of the assumption $|x|=|x'|$, we may assume without loss of generality (setting aside the trivial case $x=x'=0$) that $|x|=|x'|=1$. Now extend $x$ to an orthonormal basis $e_1,\ldots,e_n$ with $e_1=x$, and extend $x'$ to an orthonormal basis $f_1,\ldots,f_n$ with $f_1=x'$. Then the matrix $A$ defined by $Ae_j=f_j$, $j=1,\ldots,n$, is orthogonal and satisfies $Ax=x'$.
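Here is a minimal numpy sketch of this construction (the function names are mine, not standard). Each vector is extended to an orthonormal basis via a QR factorization of $[x \mid I]$, whose first column spans the same line as $x$; then $A = F E^{\mathsf T}$ sends each $e_j$ to $f_j$:

```python
import numpy as np

def extend_to_onb(x):
    """Return an n x n matrix whose columns are an orthonormal basis
    with first column x / |x|.  QR of the n x (n+1) matrix [x, I]
    produces such a basis; the sign of the first column may need fixing,
    since QR only guarantees it is a scalar multiple of x."""
    Q, _ = np.linalg.qr(np.column_stack([x, np.eye(len(x))]))
    if np.dot(Q[:, 0], x) < 0:   # flip so the first column is +x/|x|
        Q[:, 0] *= -1
    return Q

def orthogonal_from_bases(x, xp):
    """Orthogonal A with A @ x = xp, assuming |x| == |xp| != 0."""
    E = extend_to_onb(x)    # columns e_1, ..., e_n, with e_1 = x/|x|
    F = extend_to_onb(xp)   # columns f_1, ..., f_n, with f_1 = xp/|xp|
    return F @ E.T          # A e_j = f_j for each j
```

Since $E$ and $F$ are orthogonal, so is $F E^{\mathsf T}$, and $A x = |x| \, A e_1 = |x| f_1 = x'$.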

  • You need (as I did) to special-case the (trivial) case where $\| x \| = 0$. Also: I think you mean "fits the bill", unless you want to suggest that $A$ is paying for your beer. :) – 2017-01-28

Sure -- Gram–Schmidt orthogonalization and the change of basis theorem.

Reduction: for any vector $v$ of norm $1$, there's an orthogonal matrix $M_v$ taking $e_1$ to $v$. I'll prove this in a moment.

Once you accept that, the general theorem is easy: to take $u$ to $w$ (with $\| u \| = \| w \|$), you let $h = u / \| u \|$ and $k = w / \| w \|$, and use the matrix $M_k \cdot (M_h)^{-1}$, which equals $M_k M_h^{\mathsf T}$ since $M_h$ is orthogonal. (If $\| u \| = \| w \| = 0$, any orthogonal matrix works.)

To prove the reduced claim:

Apply Gram–Schmidt to the ordered list $$v, e_1, e_2, \ldots, e_n,$$ discarding any vector that becomes zero after subtracting its projection onto the span of the earlier ones (exactly one of the $e_j$ gets discarded, since these $n+1$ vectors span an $n$-dimensional space); the result is an orthonormal list $$ v, a_2, \ldots, a_n. $$ The matrix $M$ whose columns are these vectors, in order, is orthogonal and takes $e_1$ to $v$.
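The whole recipe can be sketched in numpy (function names are mine): classical Gram–Schmidt on the list $v, e_1, \ldots, e_n$ builds $M_v$, and the composition $M_k M_h^{\mathsf T}$ sends $u$ to $w$.

```python
import numpy as np

def matrix_taking_e1_to(v, tol=1e-12):
    """Orthogonal matrix whose first column is the unit vector v,
    built by Gram-Schmidt on v, e_1, ..., e_n, discarding the one
    vector that becomes (numerically) zero after projection."""
    n = len(v)
    basis = [v]
    for j in range(n):
        w = np.zeros(n)
        w[j] = 1.0                        # the standard vector e_{j+1}
        for b in basis:                   # subtract projections onto
            w = w - np.dot(b, w) * b      # the earlier basis vectors
        norm = np.linalg.norm(w)
        if norm > tol:                    # discard if it became zero
            basis.append(w / norm)
    return np.column_stack(basis)         # columns v, a_2, ..., a_n

def orthogonal_taking(u, w):
    """Orthogonal matrix sending u to w, assuming |u| == |w| != 0."""
    Mh = matrix_taking_e1_to(u / np.linalg.norm(u))
    Mk = matrix_taking_e1_to(w / np.linalg.norm(w))
    return Mk @ Mh.T                      # (M_h)^{-1} = M_h^T
```

(Numerically, modified Gram–Schmidt or a Householder-based QR would be more stable, but this follows the proof above verbatim.)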

Notice that this implies something stronger than your claim: not only is there an orthogonal matrix taking $v$ to $w$, but there's one of determinant $+1$ (i.e., a rotation). If $\det M = -1$, just negate the last column $a_n$; this keeps the columns orthonormal and leaves the first column unchanged.