
In 3D, you can take any pure rotation matrix and find an axis-angle representation of the same transformation (although not necessarily unique). From that representation, you could create a new matrix that represents a fractional part of the original transformation by taking a fraction of the original angle, keeping the original axis, and creating a new matrix.
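
For the 3D case just described, here is a minimal sketch of that procedure (my own addition, not part of the question), assuming SciPy is available; the helper name `fractional_rotation_3d` and the fraction `a` are illustrative:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def fractional_rotation_3d(R_mat, a):
    """Return a rotation matrix performing a fraction `a` of the rotation R_mat."""
    rotvec = Rotation.from_matrix(R_mat).as_rotvec()    # axis * angle
    return Rotation.from_rotvec(a * rotvec).as_matrix() # same axis, scaled angle

# Half of a 90-degree rotation about z is a 45-degree rotation about z.
R90 = Rotation.from_euler('z', 90, degrees=True).as_matrix()
R45 = fractional_rotation_3d(R90, 0.5)
print(np.allclose(R45 @ R45, R90))  # True
```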

In higher-dimensional spaces, an axis-angle representation doesn't make sense, since a rotation actually happens within a plane rather than around an axis per se. Is it possible to find a rotation matrix that rotates the space by some fraction of the angle of another rotation matrix, without first finding the plane and angle of rotation?

  • It looks hard. Expressing $\cos\frac{\theta}{n}$ and $\sin\frac{\theta}{n}$ in terms of $\cos\,\theta$ and $\sin\,\theta$ makes for complicated expressions... (2011-05-09)
  • @J.M. Hmm... true enough. (2011-05-09)

2 Answers


Essentially you are asking to raise the rotation matrix to an arbitrary power. To do this you can use the fact that $X^a=\exp(a \log X)$ for any matrix $X$. To compute the matrix logarithm we use

$$\log X = \log (I - (I-X)) = -\sum_{n=1}^\infty \frac{(I-X)^n}{n}$$

Now if you can diagonalize $I-X$ (perhaps there is a proof that this is always possible for $X$ a rotation matrix?) to give $I-X=SDS^{-1}$ then you have

$$\log X = -S \left(\sum_{n=1}^\infty \frac{D^n}{n}\right) S^{-1}$$

which is fast to compute (again you'd need a proof that this converges when $X$ is a rotation matrix). Now you compute the matrix exponential in the same way. Letting $\log X=\hat{S}\hat{D}\hat{S}^{-1}$ for $\hat{D}$ diagonal,

$$X^a = \exp(a\log X) = \hat{S} \left(\sum_{n=0}^\infty \frac{a^n \hat{D}^n}{n!}\right) \hat{S}^{-1}$$

which again is fast to compute.
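
As a hedged illustration of this recipe (my addition, not part of the original answer), the following sketch computes $X^a=\exp(a\log X)$ with SciPy's `logm`/`expm` instead of spelling out the series explicitly; the helper name `fractional_power` is mine:

```python
import numpy as np
from scipy.linalg import expm, logm

def fractional_power(X, a):
    L = logm(X)              # principal matrix logarithm; real for a proper rotation
    return expm(a * L).real  # discard tiny imaginary round-off, if any
    # (scipy.linalg.fractional_matrix_power(X, a) does the same in one call)

# Quick check in the plane: the "half" of a 90-degree rotation squares back to it.
theta = np.pi / 2
X = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
Xhalf = fractional_power(X, 0.5)
print(np.allclose(Xhalf @ Xhalf, X))  # True
```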


Thinking off the top of my head now, it seems that since all rotations are in a plane, the eigenvalues of a rotation $X$ in $\mathbb{R}^n$ must be $e^{\pm \mathrm{i}\theta}$ for some $\theta$ (once each) and $1$ ($n - 2$ times). Therefore the matrix $I-X$ has eigenvalues $1-e^{\pm\mathrm{i}\theta}$ and $0$ ($n - 2$ times), so its diagonalisation $D$ has a particularly simple form.
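
A quick numerical check of this eigenvalue claim (my addition), for a rotation acting in a single coordinate plane of $\mathbb{R}^n$; the dimension $n=5$ and angle $\theta=0.7$ are arbitrary choices:

```python
import numpy as np

n, theta = 5, 0.7
X = np.eye(n)
X[:2, :2] = [[np.cos(theta), -np.sin(theta)],   # rotation by theta in the (e1, e2) plane
             [np.sin(theta),  np.cos(theta)]]
w = np.linalg.eigvals(X)
print(np.sort_complex(w))
# [e^{-0.7i}, e^{+0.7i}, 1, 1, 1] up to ordering and round-off
```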

  • Since $\mathbf I$ and $\mathbf X$ commute and are both diagonalizable, $\mathbf I-\mathbf X$ ought to be diagonalizable. On the other hand, an eigen-approach (or even just taking matrix functions) seems like a mosquito-nuking solution to me... (2011-05-09)
  • A problem is that sometimes the Mercator series for your first solution converges slowly. I use that matrix-logarithm method with convergence acceleration of the Mercator series by Euler summation. This seems to be a nice and stable method for all occurrences of rotation matrices. (However, the diagonalization may be even better, because then you can use only the scalar logarithms of the diagonal entries in $D$...) (2011-05-09)
  • Hmm, with a $3\times3$ rotation matrix I find complex eigenvalues in $D$. This is not well suited to fractional steps of rotation, because you then have to define fractional roots/logs of complex numbers. The matrix logarithm, however, uses only real values. See the example in the answer below. (2011-05-09)
  • @Gottfried Helms: But surely the ambiguity in fractional roots of complex numbers is necessary, to reflect the ambiguity in fractional rotations? E.g. half of a rotation by $\pi/2$ could be a rotation by $\pi/4$ or by $5\pi/4$. (2011-05-09)
  • @Rahul: Hmm, you may be correct; I didn't think that all the way through... (2011-05-09)
  • @J.M. I agree that this is a bit heavy-handed. I was assuming that the matrix was specified as a collection of $n^2$ numbers. I don't see what else can be done unless you're willing to give the matrix in parameterized form -- and if you're going to give it in parameterized form, you may as well take $X=\exp(\mathbf{i}\theta)$ for $\mathbf{i}$ a 2-vector spanning the plane of rotation (as in geometric algebra), in which case a fraction $a$ of the rotation is trivially $\exp(\mathbf{i}a\theta)$. (2011-05-09)

This is not an answer, but an explanation of my comment above.

I begin with an integer $3\times3$ matrix $M$. Example: $ \qquad \small M= \begin{array} {rrr} 75 & 46 & 170 \\ 113 & 193 & 43 \\ 23 & 38 & 90 \end{array} $

Then I get the rotation matrix $T$ which rotates $M$ to lower triangular shape by column rotations ($M\,T$ = lower triangular):

$ \qquad \small T = \begin{array} {rrr} 0.391811884067 & 0.332906301267 & -0.857704402508 \\ 0.240311288894 & 0.862849514418 & 0.444681009149 \\ 0.888106937218 & -0.380347354460 & 0.258073551572 \end{array} $
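
As a sketch of how such a $T$ can be reproduced outside Pari/GP (my addition, assuming NumPy), one can take the QR factorization of $M^T$ and fix the column signs so that $M\,T$ has a positive diagonal; for this $M$ the result is a proper rotation and should match the $T$ above up to rounding:

```python
import numpy as np

M = np.array([[ 75,  46, 170],
              [113, 193,  43],
              [ 23,  38,  90]], dtype=float)

Q, R = np.linalg.qr(M.T)        # M^T = Q R, hence M Q = R^T is lower triangular
T = Q * np.sign(np.diag(R))     # flip column signs so M T has a positive diagonal
print(np.round(T, 6))           # should reproduce the matrix T shown above
print(np.round(M @ T, 6))       # lower triangular, cf. the last block of the table below
```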

The Mercator series for the log of $T$ (with the Euler-summation acceleration mentioned in my comment above) converges sufficiently fast; I get

$ \qquad \small L= \begin{array} {rrr} 0 & 0.0628202506390 & -1.18442995060 \\ -0.0628202506390 & 0 & 0.559733048879 \\ 1.18442995060 & -0.559733048879 & 0 \end{array}$

using 200 terms, with all displayed digits correct.

Then the $0.2$-step of that rotation is $T02=\exp(0.2\cdot L)$:

$ \qquad \small T02 = \begin{array} {rrr} 0.972024543477 & 0.0256039075816 & -0.233479606807 \\ 0.000762973571737 & 0.993691347552 & 0.112146884361 \\ 0.234878063577 & -0.109187662843 & 0.965872843356 \end{array}$
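
For reference, a short sketch of these two steps (my addition), with SciPy's matrix functions standing in for the Euler-summed Mercator series; `logm` should reproduce the skew-symmetric $L$ above and `expm(0.2*L)` the matrix $T02$:

```python
import numpy as np
from scipy.linalg import expm, logm

T = np.array([[0.391811884067,  0.332906301267, -0.857704402508],
              [0.240311288894,  0.862849514418,  0.444681009149],
              [0.888106937218, -0.380347354460,  0.258073551572]])

L = logm(T).real        # real and skew-symmetric, matching L above
T02 = expm(0.2 * L)     # the 0.2-step of the rotation
print(np.round(L, 9))
print(np.round(T02, 9))
```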

Using the mateigen procedure in Pari/GP, I get a complex-valued matrix of eigenvectors and the following complex-valued diagonal matrix $D$:

$ \qquad \small D= \begin{array} {rrr} 1.00000000000 & . & . \\ . & 0.256367475028-0.966579390297 î & . \\ . & . & 0.256367475028+0.966579390297 î \end{array}$

(This finally provides the same result $T02$ if I ask Pari/GP for the $0.2$'th power of the scalar diagonal entries in $D$, however with spurious imaginary entries.)
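
The eigendecomposition route can be sketched the same way (my addition, with `numpy.linalg.eig` playing the role of Pari/GP's mateigen): take the $0.2$'th powers of the complex eigenvalues and drop the numerically tiny imaginary parts:

```python
import numpy as np

M = np.array([[75, 46, 170], [113, 193, 43], [23, 38, 90]], dtype=float)
Q, R = np.linalg.qr(M.T)
T = Q * np.sign(np.diag(R))                 # rotation with M T lower triangular, as above

w, V = np.linalg.eig(T)                     # complex eigenvalues, as in D above
T02_eig = (V @ np.diag(w ** 0.2) @ np.linalg.inv(V)).real
print(np.round(T02_eig, 6))                 # agrees with T02 above up to round-off
```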

The sequence of the 5 steps of the rotation (i.e. the matrices $M\cdot\exp(a\cdot L)$ for $a = 0, 0.2, 0.4, 0.6, 0.8, 1.0$) is then

$ \qquad \small \begin{array} {rrr} 75 & 46 & 170 \\ 113 & 193 & 43 \\ 23 & 38 & 90 \\ - & - & - \\ 112.866208353 & 29.0681923728 & 151.846169541 \\ 120.085784046 & 189.980602132 & 36.7936853768 \\ 43.5245832176 & 28.5222714255 & 85.8201065512 \\ - & - & - \\ 145.396237174 & 15.1948988538 & 123.572040788 \\ 125.513309167 & 187.839329344 & 28.8061724883 \\ 62.4859853141 & 20.0862368751 & 75.9278916310 \\ - & - & - \\ 170.364666021 & 5.32920901101 & 87.1118826728 \\ 128.911271501 & 186.722668818 & 19.5838972005 \\ 78.5870327743 & 13.2690161490 & 61.0000941817 \\ - & - & - \\ 186.063373076 & 0.146047172654 & 44.9599807576 \\ 130.047212143 & 186.707012713 & 9.75776703751 \\ 90.7262325637 & 8.53698394520 & 42.0579437196 \\ - & - & - \\ 191.418389921 & 0 & 0 \\ 128.843419956 & 187.793432084 & 0 \\ 98.0731266611 & 6.21386457561 & 20.3972967315 \end{array} $
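
Finally, a sketch (my addition, NumPy/SciPy instead of Pari/GP) that reproduces this table by applying the fractional rotations $\exp(a\cdot L)$ to $M$ for $a = 0, 0.2, \dots, 1.0$:

```python
import numpy as np
from scipy.linalg import expm, logm

M = np.array([[ 75,  46, 170],
              [113, 193,  43],
              [ 23,  38,  90]], dtype=float)

Q, R = np.linalg.qr(M.T)
T = Q * np.sign(np.diag(R))     # rotation with M T lower triangular, as above
L = logm(T).real                # matrix logarithm of T

for a in np.linspace(0.0, 1.0, 6):          # a = 0, 0.2, ..., 1.0
    print(f"a = {a:.1f}")
    print(np.round(M @ expm(a * L), 6))     # the a = 1.0 entry is M T itself
```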