Perhaps some context will be valuable. Since $R^T = R^{-1}$ for any rotation matrix $R$, it is equivalent to ask for the matrices $A$ satisfying $AR = RA$. These are precisely the matrices commuting with every rotation matrix. There are several ways to phrase this. For example, we can talk about the centralizer of the set of rotation matrices.
Argument 1: Another way to say "centralizer" is as follows. Thinking of the rotation matrices as describing a representation of the circle group $S^1 = \{ z \in \mathbb{C} : |z| = 1 \}$ on $\mathbb{R}^2$, we are also asking about the intertwiners of this representation.
Since $S^1$ is abelian, every element commutes with every other element, so in particular every rotation matrix commutes with all rotation matrices. The linear combinations of rotation matrices therefore form a subalgebra of $\text{End}(\mathbb{R}^2)$, and this subalgebra is isomorphic to the complex numbers.
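The isomorphism with $\mathbb{C}$ can be checked numerically. Here is a minimal sketch in plain Python (the helper names `mat_mul` and `from_complex` are my own, not from the argument above): sending $a + bi$ to the matrix $aI + bJ$, where $J$ is the $90^{\circ}$ rotation, turns matrix multiplication into complex multiplication.

```python
def mat_mul(A, B):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]


def from_complex(z):
    """Represent a + bi as the real 2x2 matrix aI + bJ (illustrative helper)."""
    return [[z.real, -z.imag], [z.imag, z.real]]


# Multiplying matrices of the form aI + bJ reproduces complex multiplication.
z, w = 2 + 3j, -1 + 0.5j
assert mat_mul(from_complex(z), from_complex(w)) == from_complex(z * w)
```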
Now, this representation $\mathbb{R}^2$ is irreducible. Schur's lemma then implies that the collection $\text{End}_{S^1}(\mathbb{R}^2)$ of intertwiners is a division algebra over $\mathbb{R}$. This division algebra contains the copy of $\mathbb{C}$ above and must also commute with every element in it, so it is in fact a division algebra over $\mathbb{C}$, and since it is finite-dimensional the only such division algebra is $\mathbb{C}$ itself.
Thus the matrices with this property are precisely the linear combinations of rotation matrices.
Argument 2: Over $\mathbb{C}$, the rotation matrices all share a common set of eigenvectors, namely $\left[ \begin{array}{c} 1 \\ -i \end{array} \right]$ and $\left[ \begin{array}{c} 1 \\ i \end{array} \right]$, with eigenvalues $e^{i \theta}$ and $e^{-i \theta}$ respectively. Since these eigenvalues are distinct in general, any matrix commuting with all rotation matrices must share these eigenvectors, hence must be diagonal in the corresponding basis. But linear combinations of rotation matrices (in fact it suffices to take the identity and the $90^{\circ}$ rotation) already span all such matrices over $\mathbb{C}$, and moreover real linear combinations span the corresponding real matrices.
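The eigenvector claim is easy to verify numerically. This sketch (plain Python; `rotation` and `mat_vec` are illustrative helper names of my own) applies a rotation matrix to $(1, \mp i)$ and checks the eigenvalues $e^{\pm i \theta}$:

```python
import cmath
import math


def rotation(theta):
    """Rotation matrix through theta radians, as nested lists."""
    return [[math.cos(theta), -math.sin(theta)],
            [math.sin(theta), math.cos(theta)]]


def mat_vec(A, v):
    """Apply a 2x2 matrix to a (complex) vector."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]


theta = 0.7
R = rotation(theta)
# (1, -i) and (1, i) are eigenvectors with eigenvalues e^{i theta}, e^{-i theta}.
for v, lam in [([1, -1j], cmath.exp(1j * theta)),
               ([1, 1j], cmath.exp(-1j * theta))]:
    Rv = mat_vec(R, v)
    assert all(abs(Rv[k] - lam * v[k]) < 1e-12 for k in range(2))
```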
Argument 3: We use the fact that
$\left[ \begin{array}{cc} \cos \theta & - \sin \theta \\ \sin \theta & \cos \theta \end{array} \right] = (\cos \theta) I + (\sin \theta) J$
where $J = \left[ \begin{array}{cc} 0 & -1 \\\ 1 & 0 \end{array} \right]$. It follows that a matrix commutes with every rotation matrix if and only if it commutes with $J$. Explicitly writing out what this condition means, we get that these are precisely the matrices of the form
$\left[ \begin{array}{cc} a & -b \\ b & a \end{array} \right] = aI + b J.$
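As a quick sanity check (self-contained Python; the particular matrices and angles are arbitrary choices of mine), a matrix of this form commutes with rotations through several angles, while a matrix not of this form does not:

```python
import math


def mat_mul(A, B):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]


def rotation(theta):
    """Rotation matrix through theta radians."""
    return [[math.cos(theta), -math.sin(theta)],
            [math.sin(theta), math.cos(theta)]]


def close(A, B, tol=1e-12):
    """Entrywise comparison up to floating-point tolerance."""
    return all(abs(A[i][j] - B[i][j]) < tol for i in range(2) for j in range(2))


A = [[3.0, -5.0], [5.0, 3.0]]   # of the form aI + bJ, with a = 3, b = 5
B = [[1.0, 2.0], [0.0, 1.0]]    # not of that form: does not commute with J

for theta in (0.3, 1.1, 2.9):
    R = rotation(theta)
    assert close(mat_mul(A, R), mat_mul(R, A))       # aI + bJ commutes
    assert not close(mat_mul(B, R), mat_mul(R, B))   # generic B does not
```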
Exercise: More generally, let $M$ be a matrix with distinct eigenvalues. Show that its centralizer $\{ A : AM = MA \}$ is spanned by $I, M, M^2, M^3, \dots$, and find an example where $M$ does not have distinct eigenvalues and the centralizer contains other elements.
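For the first half of the exercise in the $2 \times 2$ case, here is a numerical sketch with the particular choice $M = \mathrm{diag}(1, 2)$ (my choice, purely for illustration): only the diagonal matrix units commute with $M$, so any commuting matrix is diagonal, and every diagonal matrix is a combination of $I$ and $M$.

```python
def mat_mul(A, B):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]


M = [[1, 0], [0, 2]]            # distinct eigenvalues 1 and 2


def commutes(A):
    return mat_mul(A, M) == mat_mul(M, A)


def unit(i, j):
    """Matrix unit E_ij: a 1 in position (i, j), zeros elsewhere."""
    return [[1 if (r, c) == (i, j) else 0 for c in range(2)] for r in range(2)]


# Only the diagonal matrix units commute with M, so any commuting A is diagonal.
assert [commutes(unit(i, j)) for i in range(2) for j in range(2)] == \
    [True, False, False, True]

# Every diagonal matrix diag(a, d) equals alpha*I + beta*M for suitable scalars,
# so the centralizer is spanned by I = M^0 and M.
a, d = 7, -4
alpha, beta = 2 * a - d, d - a
assert [[a, 0], [0, d]] == [[alpha + beta, 0], [0, alpha + 2 * beta]]
```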