
Suppose $A \in M_{2n}(\mathbb{R})$ and $J=\begin{pmatrix} 0 & E_n\\ -E_n&0 \end{pmatrix}$, where $E_n$ denotes the $n \times n$ identity matrix.

If $A$ satisfies $AJA^T=J$, how can one show that $\det(A)=1$?

My approach:

I tried to split $A$ into four submatrices, $A=\begin{pmatrix}A_1&A_2 \\A_3&A_4 \end{pmatrix}$, and I had to add the assumption that $A_1$ is invertible. By an elementary transformation: $\begin{pmatrix}A_1&A_2 \\ A_3&A_4\end{pmatrix}\rightarrow \begin{pmatrix}A_1&A_2 \\ 0&A_4-A_3A_1^{-1}A_2\end{pmatrix}$

we have $\det(A)=\det(A_1)\det(A_4-A_3A_1^{-1}A_2)$. From $\begin{pmatrix}A_1&A_2 \\ A_3&A_4\end{pmatrix}\begin{pmatrix}0&E_n \\ -E_n&0\end{pmatrix}\begin{pmatrix}A_1&A_2 \\ A_3&A_4\end{pmatrix}^T=\begin{pmatrix}0&E_n \\ -E_n&0\end{pmatrix}$ we get two equalities: $A_1A_2^T=A_2A_1^T$ and $A_1A_4^T-A_2A_3^T=E_n$.

Then $\det(A)=\det(A_1(A_4-A_3A_1^{-1}A_2)^T)=\det(A_1A_4^T-A_1A_2^T(A_1^T)^{-1}A_3^T)=\det(A_1A_4^T-A_2A_1^T(A_1^T)^{-1}A_3^T)=\det(A_1A_4^T-A_2A_3^T)=\det(E_n)=1$.
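As a numerical sanity check (not a proof), one can verify the symplectic condition and this Schur-complement factorization on a randomly generated symplectic matrix. The construction below is my own illustration: it multiplies elementary factors $\begin{pmatrix}E&S\\0&E\end{pmatrix}$, $\begin{pmatrix}E&0\\S'&E\end{pmatrix}$ ($S,S'$ symmetric) and $\operatorname{diag}(P,P^{-T})$, each of which satisfies $MJM^T=J$.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 3
E = np.eye(n)
Z = np.zeros((n, n))

# The standard symplectic form J = [[0, E_n], [-E_n, 0]].
J = np.block([[Z, E], [-E, Z]])

# Build a symplectic A as a product of elementary symplectic factors;
# each factor has determinant 1 and satisfies M J M^T = J.
S1 = rng.standard_normal((n, n)); S1 = 0.3 * (S1 + S1.T)  # symmetric
S2 = rng.standard_normal((n, n)); S2 = 0.3 * (S2 + S2.T)  # symmetric
P = rng.standard_normal((n, n)) + 3 * E                    # generically invertible
A = (np.block([[E, S1], [Z, E]])
     @ np.block([[E, Z], [S2, E]])
     @ np.block([[P, Z], [Z, np.linalg.inv(P).T]]))

assert np.allclose(A @ J @ A.T, J)        # A is symplectic

A1, A2 = A[:n, :n], A[:n, n:]
A3, A4 = A[n:, :n], A[n:, n:]
schur = A4 - A3 @ np.linalg.inv(A1) @ A2  # assumes A1 invertible

print(np.linalg.det(A))                          # approx 1.0
print(np.linalg.det(A1) * np.linalg.det(schur))  # approx 1.0
```

Note that $A_1$ is only generically invertible here, which is exactly the gap in the argument.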

But I have no idea how to handle the case when $A_1$ is not invertible...

Thanks

  • Another remark: you may assume that $A_1$ is invertible, since it can be approximated by invertible matrices. (2013-01-03)

6 Answers


First, taking the determinant of the condition $AJA^T = J$ gives $\det(A)\det(J)\det(A^T) = \det(J)$; since $\det J \neq 0$, this yields $\det(A)^2 = 1$, so $\det A = \pm 1$ if $A$ is real valued. The quickest way, if you know it, to show that the determinant is positive is via the Pfaffian of the expression $AJA^T = J$.
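To make the Pfaffian route concrete, here is a small numerical sketch I am adding (a naive recursive Pfaffian, usable only for small matrices; not taken from any answer here). It checks the key identity $\mathrm{Pf}(BMB^T)=\det(B)\,\mathrm{Pf}(M)$, which, applied to $AJA^T=J$ with $\mathrm{Pf}(J)\neq 0$, forces $\det(A)=+1$.

```python
import numpy as np

def pfaffian(m):
    """Naive recursive Pfaffian (expansion along the first row).
    Exponential time -- only for small antisymmetric matrices of even size."""
    k = m.shape[0]
    if k == 0:
        return 1.0
    total = 0.0
    for j in range(1, k):
        rest = [i for i in range(k) if i not in (0, j)]
        total += (-1) ** (j + 1) * m[0, j] * pfaffian(m[np.ix_(rest, rest)])
    return total

rng = np.random.default_rng(0)
n = 2
J = np.block([[np.zeros((n, n)), np.eye(n)],
              [-np.eye(n), np.zeros((n, n))]])

# Pf(J)^2 = det(J) = 1, so Pf(J) != 0.
assert abs(pfaffian(J) ** 2 - 1.0) < 1e-12

# Key identity: Pf(B M B^T) = det(B) Pf(M) for antisymmetric M, any B.
B = rng.standard_normal((2 * n, 2 * n))
assert np.isclose(pfaffian(B @ J @ B.T), np.linalg.det(B) * pfaffian(J))

# Hence A J A^T = J gives det(A) * Pf(J) = Pf(J), i.e. det(A) = +1.
```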


Let me first restate your question in a somewhat more abstract way. Let $V$ be a finite dimensional real vector space. A symplectic form is an alternating 2-form $\omega\in \Lambda^2(V^\vee)$ which is non-degenerate in the sense that $\omega(x,y)=0$ for all $y\in V$ implies that $x=0$. $V$ together with such a specified nondegenerate 2-form $\omega$ is called a symplectic space. It can be shown that $V$ must be of even dimension, say, $2n$.

A linear operator $T:V\to V$ is said to be a symplectic transformation if $\omega(x,y)=\omega(Tx,Ty)$ for all $x,y\in V$. This is the same as saying $T^*\omega=\omega$. What you want to show is that $T$ is orientation preserving. Now I claim that $\omega^n\neq 0$. This can be shown by choosing a basis $\{a_i,b_j|i,j=1,\ldots,n\}$ such that $\omega(a_i,b_j)=\delta_{ij}$ and $\omega(a_i,a_j)=\omega(b_i,b_j)=0$, for all $i,j=1,\ldots,n $. Then $\omega=\sum_ia_i^\vee\wedge b_i^\vee$, where $\{a_i^\vee,b_j^\vee\}$ is the dual basis. We can compute $\omega^n=n!a_1^\vee\wedge b_1^\vee\wedge\dots\wedge a_n^\vee \wedge b_n^\vee$, which is clearly nonzero.

Now let me digress to say a word about determinants. Let $W$ be an $n$-dimensional vector space and $f:W\to W$ be linear. Then $f$ induces a map $f_*:\Lambda^n(W)\to \Lambda^n(W)$. Since $\Lambda^n(W)$ is 1-dimensional, $f_*$ is multiplication by a number. This is just the determinant of $f$. And the dual map $f^*:\Lambda^n(W^\vee)\to \Lambda^n(W^\vee)$ is also multiplication by the determinant of $f$.

Since $T^*(\omega^n)=\omega^n$, we can see from the above argument that $\det(T)=1$. The key point here is that the symplectic form $\omega$ gives a canonical orientation of the space, via the top form $\omega^n$.
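To see the mechanism in the smallest case $n=1$ (a worked example I am adding for illustration): write $T$ in the basis $(a,b)$ as $Ta = p\,a + r\,b$, $Tb = q\,a + s\,b$, so that $T^*a^\vee = p\,a^\vee + q\,b^\vee$ and $T^*b^\vee = r\,a^\vee + s\,b^\vee$. Then

```latex
% n = 1: omega = a^vee wedge b^vee, T has matrix (p q; r s) in the basis (a, b)
T^*\omega
  = (p\,a^\vee + q\,b^\vee)\wedge(r\,a^\vee + s\,b^\vee)
  = (ps - qr)\,a^\vee\wedge b^\vee
  = \det(T)\,\omega ,
```

so $T^*\omega=\omega$ is exactly $\det(T)=1$ in this case; for general $n$ the same computation applied to $\omega^n$ gives the argument above.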

  • Correction: a symplectic form is an **alternating** two-form ... (2017-07-23)

The determinant is a continuous function, and the set of symplectic matrices with invertible $A_1$ is dense in the set of all symplectic matrices. So if you've proven that the determinant equals $1$ whenever $A_1$ is invertible, then it equals $1$ for all symplectic matrices.

  • Maybe I should ask this as a separate question so if you do remember the argument later you can get more recognition for it? It leads to a very nice and clean argument to prove the overall result, since the proof that the determinant is 1 when $A_1$ has non-zero determinant is very simple. (2016-07-22)

There is an easy proof for the real and complex cases which does not require the use of Pfaffians. This proof first appeared in a Chinese text. Please see http://arxiv.org/abs/1505.04240 for the reference.

I reproduce the proof for the real case here. The approach extends to complex symplectic matrices.

Taking the determinant on both sides of $A^T J A = J$ gives $\det(A^T J A) = \det(A^T) \det(J) \det(A) = \det(J)$, so $\det(A)^2 = 1$ and we immediately have $\det(A) = \pm 1$.

Then let us consider the matrix $A^TA + I.$ Since $A^TA$ is symmetric positive definite, its eigenvalues are real and positive, so the eigenvalues of $A^TA + I$ are all greater than $1$. Therefore its determinant, being the product of its eigenvalues, satisfies $\det(A^TA +I) > 1$.

Now as $\det(A) \ne 0$, $A$ is invertible. Using this we may write $ A^TA + I = A^T( A + A^{-T}) = A^T(A + JAJ^{-1}).$

Denote the four $N \times N$ subblocks of $A$ as follows, $ A = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}, \text{ where } A_{11},A_{12},A_{21},A_{22} \in \mathbb{R}^{N \times N}. $ Then we compute

$ A + JAJ^{-1} = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix} + \begin{bmatrix} O & I_N \\ -I_N & O \end{bmatrix} \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix} \begin{bmatrix} O & - I_N \\ I_N & O \end{bmatrix} = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix} + \begin{bmatrix} A_{22} & -A_{21} \\ -A_{12} & A_{11} \end{bmatrix} = \begin{bmatrix} A_{11}+ A_{22} & A_{12} - A_{21} \\ - A_{12}+ A_{21} & A_{11} + A_{22} \end{bmatrix}.$

Writing the blocks as $C := A_{11} + A_{22}$ and $D:= A_{12} - A_{21}$, we make use of a unitary transform:

$ A + JAJ^{-1} = \begin{bmatrix} C & D \\ -D & C \end{bmatrix} = \frac{1}{\sqrt{2}}\begin{bmatrix} I_N & I_N \\ iI_N & -iI_N \end{bmatrix} \begin{bmatrix} C + i D & O \\ O & C - i D \end{bmatrix} \frac{1}{\sqrt{2}} \begin{bmatrix} I_N & -iI_N \\ I_N & iI_N \end{bmatrix}. $

We plug this factorization into our identity. Note that $C,D$ are both real, which allows complex conjugation to commute with the determinant (as it is a polynomial of its entries):

$0 < 1 < \det(A^TA + I) = \det(A^T(A + JAJ^{-1})) \\ = \det(A) \det(C + i D) \det(C - iD) \\ = \det(A) \det(C + i D) \det\left(\overline{C + iD}\right)\\ = \det(A) \det(C + iD) \overline{\det(C + iD)} = \det(A) \left\lvert \det(C + iD)\right\rvert^2. $

Clearly, neither factor on the right-hand side can be zero, so we may conclude $\left\lvert \det(C + iD) \right\rvert^2 > 0$. Dividing this through on both sides, we have $\det(A) > 0$, and thus $\det(A) = 1$.
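The two algebraic identities above can be checked numerically; this is a sketch I am adding, not part of the original proof. Note the block identity $A + JAJ^{-1} = \begin{bmatrix} C & D \\ -D & C\end{bmatrix}$ holds for any real $2N \times 2N$ matrix, symplectic or not.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 3
Z = np.zeros((N, N))
J = np.block([[Z, np.eye(N)], [-np.eye(N), Z]])

# A + J A J^{-1} = [[C, D], [-D, C]] holds for ANY real 2N x 2N matrix A;
# only the inequality det(A^T A + I) > 1 uses that A is symplectic.
A = rng.standard_normal((2 * N, 2 * N))
A11, A12 = A[:N, :N], A[:N, N:]
A21, A22 = A[N:, :N], A[N:, N:]
C, D = A11 + A22, A12 - A21

lhs = A + J @ A @ np.linalg.inv(J)
rhs = np.block([[C, D], [-D, C]])
assert np.allclose(lhs, rhs)

# det [[C, D], [-D, C]] = det(C + iD) det(C - iD) = |det(C + iD)|^2 >= 0
assert np.isclose(np.linalg.det(rhs),
                  abs(np.linalg.det(C + 1j * D)) ** 2)
```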

  1. The most natural way to show [independent of the field $\mathbb{F}$ of characteristic $\neq 2$] that the symplectic group $Sp(2n,\mathbb{F})~:=~\{M\in {\rm Mat}_{2n\times 2n}(\mathbb{F})\mid MJM^T=J\} \tag{1}$ consists of matrices $M$ with unit determinant is to use the following Pfaffian property $ {\rm Pf}(J)~\stackrel{(1)}{=}~{\rm Pf}(MJM^T) =~{\rm Det}(M)~{\rm Pf}(J) \qquad \Longrightarrow\qquad {\rm Det}(M)~=~+1,\tag{2}$ as Willie Wong hints in his answer.

  2. An elementary proof [independent of the field $\mathbb{F}$ of characteristic $0$] of the Pfaffian property (2) is e.g. given in my Math.SE answer here.


Every symplectic matrix is the product of two symplectic matrices with invertible lower-left corner. See: M. de Gosson, Symplectic Geometry and Quantum Mechanics, Birkhäuser, Basel, series "Operator Theory: Advances and Applications" (subseries: "Advances in Partial Differential Equations"), Vol. 166 (2006).