
Let $T$ be an arbitrary operator on a finite dimensional inner product space $(V,\langle\,,\,\rangle)$. Set $R=(1/2)(T^*+T)$ and $S=(1/2)i(-T+T^*)$. Prove that if $T=R_1+iS_1$, where $R_1$, $S_1$ are self-adjoint, then $R_1=R$ and $S_1=S$.

($T^*$ is the adjoint of $T$)

  • Hint: $T^*=R_1^*-iS_1^*=R_1-iS_1$. Now solve for $R_1$ and $S_1$; the algebra is spelled out just below.
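
To spell out the hint: from $T=R_1+iS_1$ and $T^*=R_1-iS_1$, adding and subtracting gives
$$T+T^*=2R_1,\qquad T-T^*=2iS_1,$$
hence
$$R_1=\tfrac12(T+T^*)=R,\qquad S_1=\tfrac{1}{2i}(T-T^*)=\tfrac{i}{2}(T^*-T)=S.$$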

2 Answers


Taking the adjoint $d:T\mapsto T^*$ is an $\mathbf R$-linear involution on the space $\def\End{\operatorname{End}}\End_\mathbf C V$ of all complex-linear operators on $V$ (considered here as a real vector space); that is, $d$ is linear with respect to real scalars and satisfies $d\circ d=\operatorname{id}_{\End V}$. Now linear involutions are always* diagonalisable with no eigenvalues other than $1$ and $-1$ (because the polynomial $X^2-1$ that annihilates involutions has those values as simple roots). The eigenvectors for $\lambda=1$ are the self-adjoint operators, and the eigenvectors for $\lambda=-1$ are the anti-self-adjoint operators. Moreover, multiplication by $i$ interchanges the (real) eigenspaces of self-adjoint and anti-self-adjoint operators.
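In symbols, write $H=\{T\in\operatorname{End}_\mathbf C V: T^*=T\}$ for the $\lambda=1$ eigenspace and $A=\{T: T^*=-T\}$ for the $\lambda=-1$ eigenspace (notation introduced only for this remark). The interchange is the one-line computation
$$T\in H\ \Longrightarrow\ (iT)^*=-iT^*=-iT\ \Longrightarrow\ iT\in A,$$
and symmetrically $iA\subseteq H$, so multiplication by $i$ maps $H$ onto $A$ and back.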

Now since $d$ is diagonalisable, $\End_\mathbf C V$ is the sum of these two eigenspaces (actually, the easily verified fact that $T=R+iS$ for every $T$, with $R,S$ as in the question, already shows this), and a sum of eigenspaces is always a direct sum. This means that in such a decomposition the self-adjoint part $R$ and the anti-self-adjoint part $iS$ are unique, hence so is $S$. Q.E.D.

Added. I just wanted to give the general background, but since you asked for an explicit proof: it boils down to showing that the sum of the subspaces of self-adjoint and anti-self-adjoint operators is direct, i.e. that if $0=R_0+iS_0$ with $R_0,S_0$ self-adjoint, then $R_0=S_0=0$. But then $-R_0=iS_0$, where the left-hand side is self-adjoint and the right-hand side is anti-self-adjoint; an operator that is both can only be the zero operator.
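The last step spelled out: if an operator $A$ satisfies both $A^*=A$ and $A^*=-A$, then
$$A=A^*=-A,\qquad\text{so } 2A=0 \text{ and } A=0.$$
Applied to $A=-R_0=iS_0$ this gives $R_0=0$ and $iS_0=0$, hence $S_0=0$.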

*except in characteristic $2$


Assume $T=R_1+iS_1$ with $R_1$ and $S_1$ self-adjoint, so that $S_1=-iT+iR_1$. For $u,v\in V$ we have
$$\langle S_1(u),v\rangle=\langle -iT(u)+iR_1(u),v\rangle =-i\langle T(u),v\rangle +i\langle R_1(u),v\rangle.$$

On the other hand, since $S_1$ and $R_1$ are self-adjoint,
$$\langle S_1(u),v\rangle=\langle u,S_1(v)\rangle=\langle u,-iT(v)+iR_1(v)\rangle=i\langle u,T(v)\rangle-i\langle u,R_1(v)\rangle=i\langle T^*(u),v\rangle-i\langle R_1(u),v\rangle.$$
Subtracting the two expressions gives
$$0=-i\langle T(u),v\rangle +i\langle R_1(u),v\rangle-i\langle T^*(u),v\rangle+i\langle R_1(u),v\rangle=\langle -iT(u)-iT^*(u)+2iR_1(u),v\rangle=\langle i(2R_1-(T+T^*))(u),v\rangle.$$
Since $v$ is arbitrary, taking $v=i(2R_1-(T+T^*))(u)$ yields
$$0=\langle i(2R_1-(T+T^*))(u),i(2R_1-(T+T^*))(u)\rangle,$$
so $i(2R_1-(T+T^*))(u)=0$ for every $u$. Therefore
$$R_1=\frac{1}{2}(T+T^*)=R.$$

The proof that $S_1=S$ is similar.
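
For a concrete sanity check of these formulas, here is a minimal NumPy sketch (the matrix size and the random matrix are arbitrary choices) verifying that $R=\frac12(T+T^*)$ and $S=\frac{i}{2}(T^*-T)$ are self-adjoint and recover $T$ on $V=\mathbf C^n$ with the standard inner product:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# A random complex matrix plays the role of T on V = C^n.
T = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# The decomposition from the question: R = (T + T*)/2 and S = (i/2)(T* - T).
R = 0.5 * (T + T.conj().T)
S = 0.5j * (T.conj().T - T)

# Both parts are self-adjoint (Hermitian) ...
assert np.allclose(R, R.conj().T)
assert np.allclose(S, S.conj().T)
# ... and they recover T, i.e. T = R + iS.
assert np.allclose(T, R + 1j * S)
```

By the uniqueness statement proved above, any other self-adjoint pair $R_1,S_1$ with $T=R_1+iS_1$ must coincide with these matrices.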