Let $\omega>0$. How does one compute the matrix exponential $e^A$, where $$A= \begin{bmatrix} 0 & \omega \\ -\omega & 0 \end{bmatrix}?$$ I am intrigued by the idea of a matrix appearing as an exponent. I don't see how a matrix can act as a power, but it seems like good material to learn. How do I compute it?
How to compute $e^A$?
- See also [here](https://en.wikipedia.org/wiki/Pauli_matrices#Exponential_of_a_Pauli_vector) – 2017-01-22
- I have added another representation of $\exp(A)$. – 2017-01-22
4 Answers
Take the definition:
$$\exp(A)=I+A+A^2/2!+A^3/3!+\cdots$$
and group even and odd terms:
$$\exp(A)=(I+A^2/2!+A^4/4!+\cdots) + (A/1!+A^3/3!+A^5/5!+ \cdots)$$
Using the fact that $A^2=-\omega^2I$,
$$\exp(A)=(I-\omega^2I/2!+\omega^4I/4!-\cdots)+(I-\omega^2I/3!+\omega^4I/5!-\cdots)A$$
$$\exp(A)=(1-\omega^2/2!+\omega^4/4!-\cdots)I+(1-\omega^2/3!+\omega^4/5!-\cdots)A$$
$$\exp(A)=\cos(\omega)I+\dfrac{\sin(\omega)}{\omega}A$$
But there is another important way to write it:
$$exp(A)=\pmatrix{\cos(\omega)&\sin(\omega)\\ -\sin(\omega)&\cos(\omega)}$$
which is the rotation with angle $-\omega$.
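As a quick numerical sanity check (assuming NumPy is available; the value of $\omega$ below is arbitrary), one can sum the power series directly and compare it with both closed forms:

```python
import numpy as np

# Arbitrary sample value of omega; A is the given antisymmetric matrix.
w = 0.7
A = np.array([[0.0, w], [-w, 0.0]])

# Partial sums of sum_n A^n / n!; 30 terms is far more than enough here.
expA = np.zeros((2, 2))
term = np.eye(2)  # current term A^n / n!, starting at A^0 / 0! = I
for n in range(1, 30):
    expA += term
    term = term @ A / n

closed_form = np.cos(w) * np.eye(2) + (np.sin(w) / w) * A
rotation = np.array([[np.cos(w), np.sin(w)], [-np.sin(w), np.cos(w)]])

assert np.allclose(expA, closed_form)
assert np.allclose(expA, rotation)
```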
It is a classical (and deep) result that the exponential of an antisymmetric matrix is an orthogonal matrix. Take a look at [this paper](http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.35.3205), which recalls what holds in dimensions 2 and 3 (the important Rodrigues formula) and then treats computations in higher dimensions.
$e^A$ is not really a power, but rather the exponential map. There is no such thing as "multiplying $e$ by itself $A$ times" when $A$ is a matrix. Instead, it's a notational shorthand for $$ I + A + \frac{A^2}2 + \frac{A^3}{6} + \frac{A^4}{24} + \cdots + \frac{A^n}{n!} + \cdots $$ In order to calculate this efficiently, I suggest you try to find the general form of $A^n$, and from there, the general form of each separate component of $e^A$, keeping in mind the power series definitions of $\sin$ and $\cos$.
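For this particular $A$, the pattern of powers is easy to verify numerically (a sketch assuming NumPy; the value of $\omega$ is arbitrary): since $A^2=-\omega^2 I$, even powers are multiples of $I$ and odd powers are multiples of $A$.

```python
import numpy as np

# Arbitrary sample value of omega.
w = 1.3
A = np.array([[0.0, w], [-w, 0.0]])
I = np.eye(2)

# A^2 = -w^2 I, hence A^(2k) = (-1)^k w^(2k) I and A^(2k+1) = (-1)^k w^(2k) A.
assert np.allclose(A @ A, -w**2 * I)
for k in range(5):
    assert np.allclose(np.linalg.matrix_power(A, 2 * k), (-1)**k * w**(2 * k) * I)
    assert np.allclose(np.linalg.matrix_power(A, 2 * k + 1), (-1)**k * w**(2 * k) * A)
```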
As mentioned in the answer by @Arthur, the definition of $e^A$ is the power series $$e^A:=\sum_{n\ge0}\frac{1}{n!}A^n,$$ where $A^0=I$. Now for some matrices (yours, for example) one can try to find some regularities in the sequence $A,A^2,A^3,...$ and work from there. However, for matrices of reasonable size, one can proceed as follows:
- If $A$ is diagonalizable, write it as $A=PDP^{-1}$ for a diagonal matrix $D$. Else (a bit harder), find its Jordan normal form $A = PJP^{-1}$.
- It is easy to compute $D^n$ (respectively, $J^n$), and thus $$A^n = (PDP^{-1})^n = PD^nP^{-1},$$ so that one can easily find $e^A$. In particular, for a diagonal matrix $D$ we have that $e^D$ is given by the diagonal matrix with the exponential of the entries of $D$ as elements, making things extremely simple.
Of course, all the hard work here is to diagonalize $A$ (resp. take it into its Jordan normal form). For large matrices, often some probabilistic methods are used to approximate solutions. It is a very interesting (and very active) field of research, but I don't know much about it.
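The diagonalization route above can be sketched numerically (assuming NumPy; `expm_via_eig` is a hypothetical helper name, and the value of $\omega$ is arbitrary). Note that `numpy.linalg.eig` returns complex eigenvalues and eigenvectors for this real matrix, but the final result is real up to rounding:

```python
import numpy as np

def expm_via_eig(M):
    """Compute e^M for a diagonalizable M via M = P D P^{-1}."""
    # Column k of P is the eigenvector for eigenvalue evals[k].
    evals, P = np.linalg.eig(M)
    return P @ np.diag(np.exp(evals)) @ np.linalg.inv(P)

# Arbitrary sample value of omega.
w = 0.4
A = np.array([[0.0, w], [-w, 0.0]])
expA = expm_via_eig(A)  # complex intermediates, real result up to rounding

expected = np.array([[np.cos(w), np.sin(w)], [-np.sin(w), np.cos(w)]])
assert np.allclose(expA, expected)
```

This only works when $A$ is diagonalizable; for defective matrices one would need the Jordan form instead, as noted above.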
The characteristic equation of $A$ is $t^2+\omega^2=0$, so the eigenvalues are $t=\pm \omega i$. The corresponding eigenvectors are $\begin{pmatrix}1\\i\\\end{pmatrix}$ and $\begin{pmatrix}i\\1\\\end{pmatrix}$, so construct the modal matrix $P =\begin{pmatrix}1&i\\i&1\\\end{pmatrix}$, with $P^{-1}=\frac{1}{2}\begin{pmatrix}1&-i\\-i&1\\\end{pmatrix}$.
Since $A=PDP^{-1}$, we have $e^A=Pe^DP^{-1}$
$\implies e^A=\begin{pmatrix}1&i\\i&1\\\end{pmatrix}\begin{pmatrix}e^{\omega i}&0\\0&e^{-\omega i}\\\end{pmatrix}\frac{1}{2}\begin{pmatrix}1&-i\\-i&1\\\end{pmatrix}$.
$\implies e^A=\begin{pmatrix}\cos(\omega)&\sin(\omega)\\-\sin(\omega)&\cos(\omega)\\\end{pmatrix}$
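The last step can be checked numerically with the explicit $P$, $P^{-1}$, and $e^D$ above (a sketch assuming NumPy; the value of $\omega$ is arbitrary):

```python
import numpy as np

# Arbitrary sample value of omega; P and Pinv as derived above.
w = 2.1
P = np.array([[1, 1j], [1j, 1]])
Pinv = 0.5 * np.array([[1, -1j], [-1j, 1]])
eD = np.diag([np.exp(1j * w), np.exp(-1j * w)])  # e^D for D = diag(iw, -iw)

expA = P @ eD @ Pinv  # imaginary parts cancel, leaving the rotation matrix
expected = np.array([[np.cos(w), np.sin(w)], [-np.sin(w), np.cos(w)]])
assert np.allclose(expA, expected)
```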