
I tried to prove that a real antisymmetric matrix can be taken by an orthogonal transformation to the form

$$\begin{pmatrix}
0 & \lambda_1 & & & \\
-\lambda_1 & 0 & & & \\
& & 0 & \lambda_2 & \\
& & -\lambda_2 & 0 & \\
& & & & \ddots
\end{pmatrix}$$

where the eigenvalues are $\pm i\lambda_1, \pm i\lambda_2, \ldots$,

which is a statement I saw on Wikipedia: http://en.wikipedia.org/wiki/Antisymmetric_matrix
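
As a quick numerical sanity check of this normal form (a sketch only; the random test matrix and the use of SciPy's real Schur decomposition are my own choices):

```python
# Real Schur decomposition A = Z T Z^T with Z orthogonal, T quasi-triangular.
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = M - M.T                        # real antisymmetric test matrix

T, Z = schur(A, output='real')     # Z is real orthogonal
# T = Z^T A Z is again antisymmetric, so the quasi-triangular T is in
# fact block diagonal with 2x2 antisymmetric blocks (up to roundoff).
assert np.allclose(Z @ Z.T, np.eye(6))
assert np.allclose(Z.T @ A @ Z, T)
print(np.round(T, 3))                          # the block form
print(np.round(np.linalg.eigvals(A), 3))       # eigenvalues +/- i*lambda_k
```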

I also know that a real antisymmetric matrix is normal, so it can be diagonalized by a unitary transformation, and I found a unitary transformation taking the diagonal matrix to the required form.

So by composing the two transformations (diagonalization, then taking the diagonal matrix to the required form), I'll get a unitary transformation taking the real antisymmetric matrix to another real matrix.
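
For concreteness, here is the $2\times 2$ building block of that second step: a sketch of one unitary $W$ (my particular choice; others work) conjugating $\operatorname{diag}(i\lambda, -i\lambda)$ into the real block.

```python
# W D W^* = [[0, lam], [-lam, 0]] for D = diag(i*lam, -i*lam).
import numpy as np

lam = 2.5
D = np.diag([1j * lam, -1j * lam])
W = np.array([[1, 1],
              [1j, -1j]]) / np.sqrt(2)          # unitary, but not real

B = np.array([[0.0, lam],
              [-lam, 0.0]])

assert np.allclose(W @ W.conj().T, np.eye(2))   # W is unitary
assert np.allclose(W @ D @ W.conj().T, B)       # W D W^* is the real block
```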

My question is whether this transformation must be a real matrix; if so, I can deduce that the unitary transformation is in fact an orthogonal transformation.

So is this true?

Is a unitary transformation taking a real matrix to another real matrix necessarily an orthogonal transformation?

EDIT: After receiving a counterexample in the comments, I'm adding:

Alternatively, if it is not necessarily orthogonal, does there necessarily exist an orthogonal transformation taking one of the two matrices to the other?
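
For reference, a minimal counterexample of the kind alluded to above (my own example, not necessarily the one from the comments): $U = iI$ is unitary and fixes every real matrix, yet is not real, hence not orthogonal.

```python
import numpy as np

A = np.array([[0.0, 3.0],
              [-3.0, 0.0]])
U = 1j * np.eye(2)

assert np.allclose(U.conj().T @ U, np.eye(2))   # U is unitary
assert np.allclose(U.conj().T @ A @ U, A)       # U^* A U is real (it is A)
assert not np.allclose(U, U.real)               # but U itself is not real
```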

  • After the edit, the answer is yes: *unitarily equivalent real matrices are orthogonally equivalent.* But I struggle to find a reference. (2012-08-02)

1 Answer


Yes. Quoting Halmos's *Linear Algebra Problem Book* (Solution 160):

“If $A$ and $B$ are real, $U$ is unitary, and $U^*AU = B$, then there exists a real orthogonal $V$ such that $V^*AV = B$.

A surprisingly important tool in the proof is the observation that the unitary equivalence of $A$ and $B$ via $U$ implies the same result for $A^*$ and $B^*$. Indeed, the adjoint of the assumed equation is $U^*A^*U = B^*$.

Write $U$ in terms of its real and imaginary parts $U = E + i F$. It follows from $AU = UB$ that $AE = EB$ and $AF = FB$, and hence that $A(E+\lambda F) = (E+\lambda F)B$ for every scalar $\lambda$. If $\lambda$ is real and different from a finite number of troublesome scalars (the ones for which $\det(E+\lambda F) = 0$), the real matrix $S = E + \lambda F$ is invertible, and, of course, has the property that $AS=SB$.

Proceed in the same way from $U^*A^*U = B^*$: deduce that $A^*(E+\lambda F) = (E+\lambda F)B^*$ for all $\lambda$, and, in particular, for the ones for which $E+\lambda F$ is invertible, and infer that $A^*S = SB^*$ (and hence that $S^*A = BS^*$).

Let $S =VP$ be the polar decomposition of $S$ (that theorem works just as well in the real case as in the complex case, so that $V$ and $P$ are real.) Since $BP^2 = BS^*S = S^*AS = S^*SB = P^2B,$ so that $P^2$ commutes with $B$, it follows that $P$ commutes with $B$. Since $AVP = AS = SB = VPB = VBP$ and $P$ is invertible, it follows that $AV=VB$, and the proof is complete.”

Needless to say, that isn't the shortest path to proving the block reduction of antisymmetric matrices...
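
As a numerical walk-through of the quoted construction, here is a sketch (the test matrices, and the use of `scipy.linalg.polar` and `scipy.linalg.block_diag`, are my own choices, not Halmos's):

```python
import numpy as np
from scipy.linalg import polar, block_diag

rng = np.random.default_rng(1)

def blk(lam):                        # the 2x2 antisymmetric block
    return np.array([[0.0, lam], [-lam, 0.0]])

A = block_diag(blk(1.0), blk(2.0))   # real antisymmetric A

# Build a genuinely complex unitary U with U^* A U real: a unitary that
# commutes with A (assembled from eigenvectors of the 2x2 blocks),
# composed with a real orthogonal Q.
W = np.array([[1, 1], [1j, -1j]]) / np.sqrt(2)
def commuting_unitary(a, b):
    return W @ np.diag([np.exp(1j * a), np.exp(1j * b)]) @ W.conj().T
U_c = block_diag(commuting_unitary(0.7, 0.2), commuting_unitary(-0.4, 1.1))
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
U = U_c @ Q                          # unitary and not real

B = U.conj().T @ A @ U
assert np.allclose(B.imag, 0)        # B is real although U is not
B = B.real

# Halmos: U = E + iF; pick a real lambda with S = E + lambda*F invertible.
E, F = U.real, U.imag
lam = 1.0
while abs(np.linalg.det(E + lam * F)) < 1e-8:
    lam += 1.0                       # skip the finitely many bad values
S = E + lam * F
assert np.allclose(A @ S, S @ B)     # S intertwines A and B
assert np.allclose(S.T @ A, B @ S.T)

# Polar decomposition S = V P with V real orthogonal, P positive definite.
V, P = polar(S)
assert np.allclose(V @ V.T, np.eye(4))
assert np.allclose(V.T @ A @ V, B)   # the promised orthogonal equivalence
```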

  • Thank you. The only statement I'm not sure about: $P^2$ commutes with $B$ implies $P$ commutes with $B$. Is this a general statement, or just true for this $P$? (2012-08-02)