
Matrices $A$ in the special unitary group $SU(2)$ have determinant $\operatorname{det}(A) = 1$ and satisfy $AA^\dagger = I$.

I want to show that $A$ is of the form $\begin{pmatrix} a & -b^* \\ b & a^*\end{pmatrix}$ with complex numbers $a,b$ such that $|a|^2+|b|^2 = 1$.


To this end, we put $A:= \begin{pmatrix} r & s \\ t & u\end{pmatrix}$ and impose the two properties.

This yields \begin{align}\operatorname{det}(A) &= ru-st \\ &= 1 \ ,\end{align} and \begin{align} AA^\dagger &= \begin{pmatrix} r & s \\ t & u\end{pmatrix} \begin{pmatrix} r^* & t^* \\ s^* & u^* \end{pmatrix} \\&= \begin{pmatrix} |r|^2+|s|^2 & rt^* +su^* \\ tr^*+us^* & |t|^2 + |u|^2\end{pmatrix} \\ &= \begin{pmatrix} 1 & 0 \\ 0 & 1\end{pmatrix} \ .\\ \end{align} The latter gives rise to \begin{align} |r|^2+|s|^2 &= 1 \\ &= |t|^2+|u|^2 \ , \end{align} and \begin{align} tr^*+us^* &= 0 \\ &= rt^*+su^* \ . \end{align}
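Before pushing the algebra further, a quick numerical sanity check (an illustration, not part of the proof) confirms that any matrix of the claimed target form does satisfy both conditions. The sample values below are arbitrary, and NumPy is assumed available:

```python
import numpy as np

# Sanity check: a matrix of the claimed form (a, -b*; b, a*) with
# |a|^2 + |b|^2 = 1 satisfies both SU(2) conditions,
# det(A) = 1 and A A^dagger = I.
rng = np.random.default_rng(0)
z = rng.normal(size=4)
a = complex(z[0], z[1])
b = complex(z[2], z[3])
norm = np.sqrt(abs(a) ** 2 + abs(b) ** 2)
a, b = a / norm, b / norm          # enforce |a|^2 + |b|^2 = 1

A = np.array([[a, -b.conjugate()],
              [b,  a.conjugate()]])

print(np.isclose(np.linalg.det(A), 1.0))        # det(A) = 1
print(np.allclose(A @ A.conj().T, np.eye(2)))   # A A^dagger = I
```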


At this point, I don't know how to proceed. Any hints would be appreciated.


@Omnomnomnom's remark \begin{align} A A^\dagger &= \begin{pmatrix} |r|^2+|s|^2 & rt^* +su^* \\ tr^*+us^* & |t|^2 + |u|^2\end{pmatrix} \\ &= \begin{pmatrix} |r|^2+|t|^2 & sr^* +ut^* \\ rs^*+tu^* & |s|^2 + |u|^2\end{pmatrix} = A^\dagger A \ , \end{align} gives rise to

$$ |t|^2 = |s|^2 \\ |r|^2 = |u|^2 $$

and, comparing the off-diagonal entries of $AA^\dagger$ and $A^\dagger A$, $$ rt^* +su^* = sr^* +ut^* \ , \qquad tr^*+us^* = rs^*+tu^* \ . $$


At this point, I'm looking for a relation between $t,s$ and between $r,u$, respectively.
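The two modulus relations above hold for every $2\times 2$ unitary matrix, which can be checked numerically (an illustration under the assumption that NumPy's QR decomposition yields a unitary factor, which it does for a square input):

```python
import numpy as np

# Check: for any 2x2 unitary A = (r, s; t, u), comparing the diagonal
# entries of A A^dagger = I and A^dagger A = I gives |t| = |s| and |r| = |u|.
rng = np.random.default_rng(3)
M = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
Q, _ = np.linalg.qr(M)            # Q is a random 2x2 unitary matrix

r, s = Q[0, 0], Q[0, 1]
t, u = Q[1, 0], Q[1, 1]
print(np.isclose(abs(t), abs(s)))
print(np.isclose(abs(r), abs(u)))
```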

  • A handy trick is to use the fact that we must also have $A^\dagger A = I$. (2017-02-07)

3 Answers

3

The condition $A^{\ast}A=I$ says that $A$ has orthonormal columns.

Suppose the first column is $v=[\begin{smallmatrix}a\\b\end{smallmatrix}]$. It must have unit norm, so $|a|^2+|b|^2=1$. What can the second column be? It must be orthogonal to the first, which means it must be in the complex one-dimensional orthogonal complement. Thus, if $w$ is orthogonal to $v$, then the possibilities for the second column are $\lambda w$ for $\lambda\in\mathbb{C}$. Since $\det[v~\lambda w]=\lambda\det[v~w]$, only one value of $\lambda$ will make the determinant $1$, hence the second column is unique. So it suffices to check $w=[-b ~~ a]^{\ast}$ works, which is natural to check because in ${\rm SO}(2)$ the second column would be $[-b~~a]^T$.
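The construction in this answer can be sketched numerically (an illustration with arbitrary sample values): take a unit first column $v=(a,b)$, form $w=(-b^*,a^*)$, and observe that $w$ is orthogonal to $v$ and that $[v~w]$ already has determinant $1$, so the scalar $\lambda$ is simply $1$ here:

```python
import numpy as np

# Given a unit first column v = (a, b), the candidate second column
# w = (-b*, a*) is orthogonal to v, and [v w] has determinant 1.
rng = np.random.default_rng(1)
z = rng.normal(size=4)
a, b = complex(z[0], z[1]), complex(z[2], z[3])
n = np.sqrt(abs(a) ** 2 + abs(b) ** 2)
a, b = a / n, b / n               # unit norm: |a|^2 + |b|^2 = 1

v = np.array([a, b])
w = np.array([-b.conjugate(), a.conjugate()])

print(np.isclose(np.vdot(v, w), 0))        # w is orthogonal to v
A = np.column_stack([v, w])
print(np.isclose(np.linalg.det(A), 1.0))   # det [v w] = 1
```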

  • We are discussing matrices in $SU(2)$, which require $A=A^\dagger$, as opposed to $A=A^\top$ for $SO(2)$. (2017-02-08)
  • @MusséRedi You mean $A^{\dagger}A=I$. I am perfectly aware we are discussing ${\rm SU}(2)$ instead of ${\rm SO}(2)$. It is important to find inspiration and learn from previous experience wherever possible, hence my stating the connection to ${\rm SO}(2)$. (2017-02-08)
  • Could you elaborate on *if $w$ is orthogonal to $v$, then the possibilities for the second column are $\lambda w$ for $\lambda \in \mathbb{C}$*? (2017-02-08)
  • @MusséRedi The set of vectors orthogonal to $v$ has complex dimension one, which means every such vector is a scalar multiple of a given (nonzero) one. (2017-02-08)
2

Using @Omnomnomnom's suggestion $AA^\dagger =A^\dagger A$, we first obtain the relations \begin{align} AA^\dagger: r &= -\frac{su^*}{t^*}\ , \ u= -\frac{tr^*}{s^*} \\ A^\dagger A: r &= -\frac{tu^*}{s^*}\ , \ u= -\frac{sr^*}{t^*} \ . \end{align} Noticing the common factor $\frac{-t}{s^*}$ for $r_{A^\dagger A}$ and $u_{AA^\dagger}$, we put $x:=\frac{-t}{s^*}$.

This allows us to write $u = xr^*$.

Similarly, we have \begin{align} AA^\dagger: s &= -\frac{rt^*}{u^*}\ , \ t= -\frac{us^*}{r^*} \\ A^\dagger A: s &= -\frac{ut^*}{r^*}\ , \ t= -\frac{rs^*}{u^*} \ , \end{align} and we put $y:= \frac{-u}{r^*}$, which yields $s = yt^*$.

Hence, so far, we have $$ A = \begin{pmatrix}r & yt^* \\ t & xr^*\end{pmatrix} \ . $$

We now notice that, in fact, we have $$ y = -\frac{u}{r^*} = -\frac{(xr^*)}{r^*} = -x \ . $$

Our matrix now looks like $$ A = \begin{pmatrix}r & -xt^* \\ t & xr^*\end{pmatrix} \ . $$

Finally, we use $\operatorname{det}(A) = 1$ to show that $x=1$: \begin{align} 1 = \operatorname{det}(A) &= r(xr^*) - (-xt^*)t \\ &= x(|r|^2+|t|^2) \\ &= x \cdot 1 \ . \end{align}

We now conclude with $$ A = \begin{pmatrix}r & -t^* \\ t & r^*\end{pmatrix} \ . $$
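As a cross-check of this conclusion (an illustration, assuming NumPy; the Haar-style sampling via QR is just one convenient way to draw a random special unitary matrix), one can sample a random $SU(2)$ matrix and verify that its entries obey $u=r^*$ and $s=-t^*$:

```python
import numpy as np

# Draw a random 2x2 unitary via QR, fix the phases so it is Haar-like,
# rescale so det = 1, then verify the derived form (r, -t*; t, r*).
rng = np.random.default_rng(2)
M = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
Q, R = np.linalg.qr(M)
Q = Q @ np.diag(np.diag(R) / np.abs(np.diag(R)))  # unit-phase columns
Q = Q / np.sqrt(np.linalg.det(Q))                 # rescale so det(Q) = 1

r, s = Q[0, 0], Q[0, 1]
t, u = Q[1, 0], Q[1, 1]
print(np.isclose(u, r.conjugate()))               # u = r*
print(np.isclose(s, -t.conjugate()))              # s = -t*
```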

1

We have $tr^\ast=-us^\ast$ so $\left| r\right|^2 \left| t\right|^2 = \left| s\right|^2 \left| u\right|^2$ and $\left| r\right|^2 -\left| r\right|^2\left| u\right|^2 = \left| s\right|^2 \left| u\right|^2$ so $\left| r\right|^2 =\left| u\right|^2$. Hence $r,\,u$ have the same modulus, as do $s,\,t$.

If $tu\ne 0$, define $k:=\dfrac{r^\ast}{u}=-\dfrac{s^\ast}{t}$, so that $u=\dfrac{r^\ast}{k}$ and $t=-\dfrac{s^\ast}{k}$, whence $1=\det A = ru-st = \dfrac{r^\ast r+s^\ast s}{k}=\dfrac{1}{k}$ and $k=1$. Hence $u=r^\ast$, and $s^\ast=-t$ follows as well.

If $u=0$, then $st=-1$ with $\left| s\right|=\left| t\right|=1$, so $s^\ast=-t$; moreover $\left| r\right|=\left| u\right|=0$, so $u=r^\ast$.

If $t=0$, then $ru=1$, so $u=r^{-1}=r^\ast$ (since $\left| r\right|=\left| u\right|$ forces $\left| r\right|=1$), and $s^\ast=0=-t$ because $\left| s\right|=\left| t\right|$.
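The two degenerate cases above can also be checked concretely (an illustration with an arbitrary sample phase): with $t=0$ the matrix is diagonal, with $u=0$ it is anti-diagonal, and both still satisfy $u=r^*$ and $s=-t^*$:

```python
import numpy as np

theta = 0.7                        # arbitrary sample phase

# Case t = 0: A = diag(r, 1/r) with |r| = 1, so u = r^{-1} = r*.
r = np.exp(1j * theta)
A1 = np.array([[r, 0], [0, 1 / r]])
print(np.isclose(np.linalg.det(A1), 1.0))
print(np.isclose(A1[1, 1], A1[0, 0].conjugate()))   # u = r*

# Case u = 0: A = (0, s; t, 0) with st = -1 and |s| = |t| = 1.
t = np.exp(1j * theta)
A2 = np.array([[0, -1 / t], [t, 0]])
print(np.isclose(np.linalg.det(A2), 1.0))
print(np.isclose(A2[0, 1], -A2[1, 0].conjugate()))  # s = -t*
```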

  • How did you arrive at $|r|^2-|r|^2|u|^2=|s|^2|u|^2$? (2017-02-07)
  • @MusséRedi By replacing $\left| t\right|^2$ with $1-\left| u\right|^2$. (2017-02-07)
  • The claim in the question is correct. You cannot multiply the whole matrix by $e^{it}$ if $t$ is not a multiple of $2n\pi$, because that would ruin the determinant being $1$: in that case $\det A=e^{2it}$. (2017-02-07)
  • @J.G. How did you arrive at $|r|^2 = |u|^2$? (2017-02-08)
  • Using $|r|^2+|s|^2=1$. (2017-02-08)
  • @J.G. I'm interested in how you came about considering the cases $tu\neq0$, $u=0$ and $t=0$. How would this follow naturally? (2017-02-08)
  • @MusséRedi Rearranging $tr^\ast=-us^\ast$ by dividing by $tu$ is possible iff $tu\ne 0$. We then have to subdivide $tu=0$ into two cases. (2017-02-08)