
Consider the 2D case. Let $R$ be the rotation matrix with angle $\theta$: $R = \begin{bmatrix} \cos\theta & -\sin\theta\\ \sin\theta & \cos\theta \end{bmatrix}.$

Is it possible for a matrix $A$ to satisfy the identity $A = R A R^T$ for every $\theta$?

  • The identity matrix is always a good one to try in these situations. This is, of course, covered by @OlivierBégassat's answer. – 2012-07-21

3 Answers

9

Perhaps some context will be valuable. Since $R^T = R^{-1}$ for any rotation matrix, it is equivalent to ask for matrices satisfying $AR = RA$. These are precisely the matrices commuting with any rotation matrix. There are several ways to say this. For example, we can talk about the centralizer of the rotation matrices.

Argument 1: Here another way to say "centralizer" is as follows. Thinking of the rotation matrices as describing a representation of the circle group $S^1 = \{ z \in \mathbb{C} : |z| = 1 \}$ on $\mathbb{R}^2$, we are also asking about the intertwiners of this representation.

Since $S^1$ is abelian, every element commutes with every other element, so any other rotation matrix has this property. The collection of linear combinations of such things is an algebra in $\text{End}(\mathbb{R}^2)$ isomorphic to the complex numbers.

Now, this representation $\mathbb{R}^2$ is irreducible. Schur's lemma then implies that the collection $\text{End}_{S^1}(\mathbb{R}^2)$ of intertwiners is a division algebra over $\mathbb{R}$. This division algebra contains the copy of $\mathbb{C}$ above and must also commute with every element in it, so it is in fact a division algebra over $\mathbb{C}$, and since it is finite-dimensional the only such division algebra is $\mathbb{C}$ itself.

Thus the matrices with this property are precisely the linear combinations of rotation matrices.

Argument 2: Over $\mathbb{C}$, the rotation matrices all share a common set of eigenvectors, namely $\left[ \begin{array}{c} 1 \\ -i \end{array} \right]$ and $\left[ \begin{array}{c} 1 \\ i \end{array} \right]$, with eigenvalues $e^{i \theta}$ and $e^{-i \theta}$ respectively. Since these eigenvalues are different in general, any matrix commuting with all rotation matrices must share these eigenvectors, hence must be diagonal in the corresponding basis. But linear combinations of rotation matrices (in fact it suffices to take the identity and the $90^{\circ}$ rotation) already span all such matrices over $\mathbb{C}$, and moreover real linear combinations span the corresponding real matrices.
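As a quick numerical sanity check of these eigenpairs, here is a small NumPy sketch (the test angle $0.7$ is an arbitrary choice):

```python
import numpy as np

theta = 0.7  # arbitrary test angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# The claimed common eigenvectors of every rotation matrix
v_plus  = np.array([1, -1j])  # eigenvalue e^{i*theta}
v_minus = np.array([1,  1j])  # eigenvalue e^{-i*theta}

assert np.allclose(R @ v_plus,  np.exp(1j * theta) * v_plus)
assert np.allclose(R @ v_minus, np.exp(-1j * theta) * v_minus)
```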

Argument 3: We use the fact that

$\left[ \begin{array}{cc} \cos \theta & - \sin \theta \\ \sin \theta & \cos \theta \end{array} \right] = (\cos \theta) I + (\sin \theta) J$

where $J = \left[ \begin{array}{cc} 0 & -1 \\ 1 & 0 \end{array} \right]$. It follows that a matrix commutes with every rotation matrix if and only if it commutes with $J$. Explicitly writing out what this condition means, we get that these are precisely the matrices of the form

$\left[ \begin{array}{cc} a & -b \\ b & a \end{array} \right] = aI + bJ.$
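One can confirm both the decomposition $R(\theta) = (\cos\theta) I + (\sin\theta) J$ and that matrices of the form $aI + bJ$ commute with every rotation; a NumPy sketch (the coefficients $a, b$ and the sampled angles are arbitrary choices):

```python
import numpy as np

I = np.eye(2)
J = np.array([[0.0, -1.0],
              [1.0,  0.0]])

def rot(theta):
    # R(theta) = cos(theta) I + sin(theta) J
    return np.cos(theta) * I + np.sin(theta) * J

a, b = 2.0, -3.0   # arbitrary real coefficients
A = a * I + b * J  # the general form [[a, -b], [b, a]]

for theta in np.linspace(0.0, 2.0 * np.pi, 7):
    R = rot(theta)
    assert np.allclose(R @ A, A @ R)    # A commutes with R(theta)
    assert np.allclose(R @ A @ R.T, A)  # equivalently A = R A R^T
```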

Exercise: More generally, let $M$ be a matrix with distinct eigenvalues. Show that its centralizer $\{ A : AM = MA \}$ is spanned by $I, M, M^2, M^3, \ldots$, and find an example where $M$ does not have distinct eigenvalues and the centralizer contains other elements.
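The dimension count in the exercise can be probed numerically: the centralizer of $M$ is the null space of the linear map $A \mapsto AM - MA$, so its dimension is $n^2$ minus the rank of that map. A NumPy sketch (the helper name `centralizer_dim` and the test matrices are my own choices):

```python
import numpy as np

def centralizer_dim(M):
    """Dimension of {A : AM = MA}, computed as the nullity of A -> AM - MA."""
    n = M.shape[0]
    cols = []
    for i in range(n):
        for j in range(n):
            E = np.zeros((n, n))
            E[i, j] = 1.0
            cols.append((E @ M - M @ E).reshape(-1))  # commutator of a basis matrix
    K = np.array(cols).T  # matrix of the commutator map
    return n * n - np.linalg.matrix_rank(K)

# Distinct eigenvalues: centralizer is spanned by I and M (dimension n = 2)
assert centralizer_dim(np.diag([1.0, 2.0])) == 2

# Repeated eigenvalue: M = I commutes with everything (dimension n^2 = 4)
assert centralizer_dim(np.eye(2)) == 4
```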

  • As usual, you have an interesting approach, or two! (+1) – 2012-07-21
2

For two-by-two matrices it is easy enough to impose the condition $A = R A R^T$ on a generic $A$ and see what the constraints on $A$ are. Letting
$A = \left(\begin{array}{cc} a_{11} & a_{12} \\ a_{21} & a_{22} \end{array}\right),$
we find the conditions on the components of $A$ are
$a_{22} = a_{11}, \qquad a_{21} = -a_{12}.$
Thus, $A$ is of the form
$A = \left(\begin{array}{cc} a_{11} & a_{12} \\ -a_{12} & a_{11} \end{array}\right) = r \left(\begin{array}{cc} \frac{x}{r} & -\frac{y}{r} \\ \frac{y}{r} & \frac{x}{r} \end{array}\right),$
where $x = a_{11}$, $y = -a_{12}$, and $r = \sqrt{x^2 + y^2}$. Notice that $x/r$ and $y/r$ can be interpreted as the cosine and sine, respectively, of some angle. Thus, $A$ must be some constant times a rotation matrix.

I see this is the answer given in a comment by @Olivier Bégassat and in the answer by @Qiaochu Yuan.
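The two conditions can also be recovered without by-hand algebra: using the identity $\mathrm{vec}(R A R^T) = (R \otimes R)\,\mathrm{vec}(A)$, the constraint becomes a null-space computation. A NumPy sketch (the angle is an arbitrary generic choice):

```python
import numpy as np

theta = 0.9  # one generic angle suffices
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# A = R A R^T  <=>  (kron(R, R) - I) vec(A) = 0
K = np.kron(R, R) - np.eye(4)

# Rows of Vt with (near-)zero singular values span the solution space
_, s, Vt = np.linalg.svd(K)
null = Vt[s < 1e-10]
assert null.shape[0] == 2  # two free parameters, matching a_{11} and a_{12}

# Every solution satisfies a_{22} = a_{11} and a_{21} = -a_{12}
for v in null:
    A = v.reshape(2, 2)
    assert np.isclose(A[1, 1], A[0, 0])
    assert np.isclose(A[1, 0], -A[0, 1])
```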

0

Notice that $R^T = R^{-1}$, so $A = R A R^T$ means $AR = RA$; thus any matrix $A$ that shares the eigenvectors of $R$ will do.
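To make this concrete: if $P$ has the common eigenvectors of the rotations as its columns, then any matrix diagonal in that basis (with a conjugate pair of eigenvalues, so that it is real) commutes with $R$. A NumPy sketch with arbitrary values:

```python
import numpy as np

theta = 1.1  # arbitrary angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Columns: the common eigenvectors [1, -i] and [1, i] of every rotation
P = np.array([[1, 1],
              [-1j, 1j]])

# Diagonal in the eigenbasis; conjugate pair of eigenvalues makes A real
lam = np.diag([2.0 + 0.5j, 2.0 - 0.5j])
A = P @ lam @ np.linalg.inv(P)

assert np.allclose(A.imag, 0)     # A is a real matrix
assert np.allclose(A @ R, R @ A)  # and it commutes with R
```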