3

Given $$ J=\begin{bmatrix} \frac{\pi}{2}&0&0\\ 1&\frac{\pi}{2}&0\\ 0&1&\frac{\pi}{2}\\ \end{bmatrix} $$

find $\sin(J) \text{ and } \cos(J)$

I know I need to find the spectral decomposition, but I am not sure what to do: the examples and exercises in my textbook all have more than one eigenvalue, whereas here there is only one.

  • 2
    $J=\frac{\pi}{2}I$?2012-10-01
  • 0
    You seem to be assuming you can diagonalize $J$. But $J$ isn't normal.2012-10-01
  • 0
    @PatrickLi Omg, I can't believe I wrote that. How do I go about answering this question? I know how to find Lagrange polynomials and the spectral decomposition when I have more than one eigenvalue.2012-10-01
  • 1
    If $AB=BA$ for two matrices, then $\cos(A+B)$ and $\sin(A+B)$ satisfy the usual formulas. Also, $\sin(\theta I) = \sin(\theta) I$ when $\theta$ is real, and the same for $\cos(\theta I)$. Finally, you can hand-compute $\sin(A)$ and $\cos(A)$ when $A^n=0$ for some $n$.2012-10-01
  • 0
    @ThomasAndrews I actually need to prove that property you just mentioned. Can you help me with that?2012-10-01
  • 0
    Which property? And what properties do you know already?2012-10-01
  • 0
    Sarah, if you need to prove from scratch that $\sin(A+B)=\sin A\cos B+ \sin B \cos A$ when $A$ and $B$ commute, it amounts to a fiddly computation with the power series.2012-10-01
  • 0
    @KevinCarlson so does this mean I cannot use the method of spectral decomposition? How do I solve this then?2012-10-01
  • 0
    It's not so bad to prove that $e^{A+B} = e^A e^B$ when $A$ and $B$ commute by manipulating the power series - indeed, the commutativity of $A$ and $B$ reduces this to the usual proof. Then the trig formulae follow from $e^{iA} = \cos(A) + i\sin(A)$.2012-10-01
  • 0
    @KevinCarlson To prove the compound angle rules, the question gives a hint: $$\sin(A)=\frac{1}{2i}\left(e^{iA}-e^{-iA}\right)$$2012-10-01
  • 0
    To solve this particular problem, do what Tom said by writing $J$ as the sum of $(\pi/2) I$ and a nilpotent matrix, then computing the sin of the nilpotent matrix from the power series. If by "the method of spectral decomposition" you mean diagonalization, no, $J$ is currently as close to diagonal as it's ever going to get, up to a transposition.2012-10-01
  • 0
    Ah, so have you proven the identity @SeanEberhard mentions? If not, you may have to.2012-10-01
  • 0
    @KevinCarlson A separate question requires me to prove that the compound angle formula for cos holds for matrices $A$ and $B$ if $AB=BA$. I suppose, then, that the same is true for the compound angle formula for sin? So I will have to break up $J$ into $\frac{\pi}{2}I_{3} + J_{3}(0)$?2012-10-01
  • 0
    @KevinCarlson I have posted another question regarding the proof of the compound angle formulae. Please have a look. Thanks again for all the help.2012-10-01

2 Answers

4

As was discussed in the comments, let $K=e_{21}+e_{32}$, the matrix with ones in the $(2,1)$ and $(3,2)$ entries and zeros elsewhere. Then $J=\frac{\pi}{2}I+K$, and certainly $\frac{\pi}{2}I$ and $K$ commute, so $$\sin J=\sin\left(\frac{\pi}{2}I+K\right)=\cos\left(\frac{\pi}{2}I\right)\sin K+ \sin\left(\frac{\pi}{2}I\right)\cos K.$$ The first term vanishes since $\cos(\pi/2)$ does, and $\sin\left(\frac{\pi}{2}I\right)=I$, so all we need is $\cos K$. As noted in the other answer, $K^n=0$ for $n\geq 3$, so we can compute it directly from the power series: $$\sin J=\cos K=I-\frac{1}{2}K^2.$$

This solution uses the angle-sum formula brought up in the comments, which in turn follows from $\exp(A+B)=\exp A\exp B$ when $A$ and $B$ commute.
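As a quick numerical sanity check (my own addition, not part of the argument above), one can compare against SciPy's matrix sine and cosine; the same angle-sum computation also gives $\cos J=-\sin K=-K$, which the second check confirms.

    import numpy as np
    from scipy.linalg import sinm, cosm  # matrix sine and cosine

    K = np.array([[0., 0., 0.],
                  [1., 0., 0.],
                  [0., 1., 0.]])          # nilpotent part, K^3 = 0
    J = (np.pi / 2) * np.eye(3) + K

    print(np.allclose(sinm(J), np.eye(3) - 0.5 * K @ K))  # sin J = I - K^2/2 -> True
    print(np.allclose(cosm(J), -K))                       # cos J = -sin K = -K -> True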

  • 0
    Shouldn't $\cos K = I-\frac{1}{2}K^2$?2012-10-01
  • 0
    Indeed, edited!2012-10-01
1

No: this is a so-called Jordan block, and not much more can be done to decompose it.

Let $A=\begin{bmatrix} 0&0&0\\1&0&0\\0&1&0 \end{bmatrix}$, and suppose we have a matrix $J= A+\alpha I$. Denote the elements of the standard basis of $\mathbb R^3$ by $e_1,e_2,e_3$. Since $A$ takes $e_1\mapsto e_2$, $e_2\mapsto e_3$, $e_3\mapsto 0$, we have $A^2=\begin{bmatrix} 0&0&0\\0&0&0\\1&0&0 \end{bmatrix}$ and already $A^3=0$.

Hence, $J^2 = A^2+2\alpha A+\alpha^2 I$ and $J^3=3\alpha A^2+3\alpha^2 A+\alpha^3 I$; in general, $J^n=\alpha^n I+n\alpha^{n-1}A+\binom{n}{2}\alpha^{n-2}A^2$ by the binomial theorem, since all higher powers of $A$ vanish.

Substituting this into a power series, we effectively get three separate scalar power series, one for each diagonal (the coefficients of $I$, $A$ and $A^2$). The 'Functions of matrices' part of the Wikipedia page describes the result: $$f(J)=f(\alpha)I+f'(\alpha)A+\frac{f''(\alpha)}{2}A^2,$$ so $f(J)$ also contains entries involving $f'(\alpha)$ and $f''(\alpha)$. [Here $\alpha=\frac\pi2$ and $f=\sin$.]
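Here is a minimal sketch of that formula in NumPy/SciPy (the helper name f_of_jordan_block and the comparison against scipy.linalg.sinm are my own additions, not part of the answer): it assembles $f(J)=\sum_k \frac{f^{(k)}(\alpha)}{k!}A^k$ from the derivatives of $f$ at $\alpha$ and checks it for $f=\sin$, $\alpha=\pi/2$.

    import numpy as np
    from math import factorial
    from scipy.linalg import sinm

    def f_of_jordan_block(f_derivs, n):
        """f(alpha*I + A) for the n-by-n lower Jordan block, where
        f_derivs[k] = f^(k)(alpha); uses f(J) = sum_k f^(k)(alpha)/k! * A^k."""
        A = np.diag(np.ones(n - 1), -1)   # nilpotent part: A^n = 0
        Ak, out = np.eye(n), np.zeros((n, n))
        for k in range(n):
            out += (f_derivs[k] / factorial(k)) * Ak
            Ak = Ak @ A
        return out

    alpha = np.pi / 2
    # derivatives of sin at pi/2: sin, cos, -sin  ->  1, 0, -1
    sinJ = f_of_jordan_block([1.0, 0.0, -1.0], 3)

    J = alpha * np.eye(3) + np.diag(np.ones(2), -1)
    print(np.allclose(sinJ, sinm(J)))     # True: sin J = I - A^2/2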