
An interesting statement given without proof in my textbook is that for every $A\in M_n(\mathbb{C})$ there exist unitary matrices $U,V\in U_n(\mathbb{C})$ such that $U^{-1}AV$ is diagonal.

After a few experiments I was shocked to find that this is indeed true: even if $A$ is not diagonalizable, you can still find some $U,V$ that make $U^{-1}AV$ diagonal.

But is there any way I can prove it?

  • Following up on "Long"'s comment: the _real_ case of this, without the assumption that $A$ is square, is the singular-value decomposition. The entries in the diagonal matrix are the singular values. The columns of the orthogonal matrices $U$ and $V$ are the singular vectors. These seem to have lots of applications in applied mathematics. (2012-09-19)
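
A quick numerical check in the spirit of this comment, using the singular-value decomposition: the matrix below is a hypothetical example (a non-diagonalizable $2\times 2$ Jordan block), and `numpy.linalg.svd` is one concrete way to produce such $U$ and $V$. This is only a sketch of the kind of experiment mentioned in the question, not a proof.

```python
import numpy as np

# A non-diagonalizable matrix: a 2x2 nilpotent Jordan block.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])

# SVD: A = U @ diag(s) @ Vh, with U and Vh unitary.
U, s, Vh = np.linalg.svd(A)
V = Vh.conj().T

# U^{-1} A V is the diagonal matrix of singular values, diag(1, 0) here.
D = U.conj().T @ A @ V
print(np.round(D, 12))
print(np.allclose(D, np.diag(s)))   # True
```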

2 Answers

4

It suffices to show that $AW$ is normal for some unitary matrix $W$, i.e. $(AW)^*(AW)=(AW)(AW)^*$, where $*$ denotes the conjugate transpose. By the spectral theorem for normal matrices we can then write $AW=UDU^*$ for some unitary $U$ and diagonal $D$, i.e. $D=U^{-1}A(WU)$, which is the required form with $V=WU$.

The condition $(AW)^*(AW)=(AW)(AW)^*$ simplifies to $W^*A^*AW=AA^*$, since $WW^*=I$. Now $A^*A$ and $AA^*$ are Hermitian, so by the spectral theorem each can be diagonalized by a unitary matrix: $X^*A^*AX=D$ and $Y^*AA^*Y=D'$ for some unitary $X,Y$ and diagonal $D,D'$. The entries of $D$ and $D'$ are the eigenvalues of $A^*A$ and $AA^*$, and it is well known that the eigenvalues of $AB$ and $BA$ coincide for any square matrices $A,B$; so, after reordering the columns of $Y$ if necessary so that the eigenvalues appear in the same order, we may take $D=D'$, i.e. $X^*A^*AX=Y^*AA^*Y$. Letting $W=XY^*$, which is unitary, we get $W^*A^*AW=Y(X^*A^*AX)Y^*=Y(Y^*AA^*Y)Y^*=AA^*$, as desired.
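
A small numerical sketch of this construction, assuming NumPy and a randomly chosen $A$ (the variable names `X`, `Y`, `W` mirror the answer; `np.linalg.eigh` returns eigenvalues in ascending order, which takes care of the reordering step automatically):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))

# Unitary diagonalizations of the Hermitian matrices A^*A and AA^*.
# eigh returns eigenvalues in ascending order, so D and D' match up.
d1, X = np.linalg.eigh(A.conj().T @ A)   # X^*(A^*A)X = diag(d1)
d2, Y = np.linalg.eigh(A @ A.conj().T)   # Y^*(AA^*)Y = diag(d2)
print(np.allclose(d1, d2))               # same eigenvalues: True

W = X @ Y.conj().T                       # unitary, product of unitaries
B = A @ W                                # AW should now be normal
print(np.allclose(B.conj().T @ B, B @ B.conj().T))   # True
```

From here the spectral theorem applied to the normal matrix $AW$ gives a unitary $U$ with $U^*(AW)U$ diagonal, and $V=WU$ answers the question.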

2

You can prove this by first showing that a square matrix can be made upper triangular using only row operations, and that such a sequence of row operations can be realized as left multiplication by a unitary matrix. Once this is done, take the transpose of the resulting upper triangular matrix and apply the same method to it, then use the identity $(AB)^{t} = B^{t}A^{t}$ together with the fact that right multiplication by a unitary matrix realizes a sequence of column operations. Alternatively, you can show directly that column operations can be used to transform the matrix into a lower triangular one.
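
A minimal NumPy sketch of the triangularization steps this answer describes, taking the QR factorization as one concrete unitary realization of the row operations (the names here are hypothetical, and the sketch only illustrates the two triangular reductions, not the full passage to a diagonal matrix):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

# Unitary "row operations": with A = Q R (QR factorization),
# Q^{-1} A = R is upper triangular and Q is unitary.
Q, R = np.linalg.qr(A)
print(np.allclose(Q.conj().T @ A, R))     # True
print(np.allclose(np.triu(R), R))         # R is upper triangular

# Transpose trick: triangularize R^t the same way, then transpose back.
# R^t = Q2 R2  =>  R = R2^t Q2^t, so right-multiplying R by the unitary
# matrix (Q2^t)^{-1} = conj(Q2) is a column operation giving the lower
# triangular matrix R2^t.
Q2, R2 = np.linalg.qr(R.T)
L = R @ np.conj(Q2)
print(np.allclose(L, R2.T))               # True
print(np.allclose(np.tril(L), L))         # L is lower triangular
```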