5

I am asked to find $e^{At}$, where

$A = \begin{bmatrix} 1 & -1 & 1 & 0\\ 1 & 1 & 0 & 1\\ 0 & 0 & 1 & -1\\ 0 & 0 & 1 & 1\\ \end{bmatrix}$.

So let me just find $e^{A}$ for now and I can generalize later.

I notice right away that I can write

$A = \begin{bmatrix} B & I_{2}\\ 0_{22} & B\\ \end{bmatrix}$, where

$B = \begin{bmatrix} 1 & -1\\ 1 & 1\\ \end{bmatrix}$.

I'm sort of making up a method here and I hope it works. Can someone tell me if this is correct?

I write:

$A = \mathrm{diag}(B,B) + \begin{bmatrix}0_{22} & I_{2}\\ 0_{22} & 0_{22}\end{bmatrix}$

Call $S = \mathrm{diag}(B,B)$, and $N = \begin{bmatrix}0_{22} & I_{2}\\ 0_{22} & 0_{22}\end{bmatrix}$

I note that $N^2 = 0_{44}$, so $e^{N} = \frac{N^{0}}{0!} + \frac{N}{1!} + \frac{N^2}{2!} + \cdots = I_{4} + N + 0_{44} + \cdots = I_{4} + N$,

and that $e^{S} = \mathrm{diag}(e^{B}, e^{B})$ and compute:

$e^{A} = e^{S + N} = e^{S}e^{N} = \mathrm{diag}(e^{B}, e^{B})\cdot[I_{4} + N]$

This reduces the problem to finding $e^B$, which is much easier.
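
As a quick sanity check (just a sketch, assuming NumPy and SciPy are available), the factorization $e^{A} = \mathrm{diag}(e^{B}, e^{B})\,(I_4 + N)$ agrees with a direct numerical computation of the exponential:

```
# Sketch: numerically check e^A = diag(e^B, e^B) (I_4 + N)
# (assumes NumPy and SciPy are available).
import numpy as np
from scipy.linalg import expm

B = np.array([[1.0, -1.0],
              [1.0,  1.0]])
Z = np.zeros((2, 2))
A = np.block([[B, np.eye(2)],
              [Z, B]])
N = np.block([[Z, np.eye(2)],
              [Z, Z]])

lhs = expm(A)
rhs = np.block([[expm(B), Z], [Z, expm(B)]]) @ (np.eye(4) + N)
print(np.allclose(lhs, rhs))  # expected: True
```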

Is my logic correct? I just started writing everything as a block matrix and proceeded as if nothing about the process of finding the exponential of a matrix would change. But I don't really know the theory behind this; I'm just guessing how it would work.

  • 2
    This looks correct to me. (2011-10-16)
  • 5
    @Kyle, I have just two comments: 1. When you expand the exponential, the LHS should be $e^N$ (not $e^B$), and 2. In your case, $S$ and $N$ commute; that's why $e^{S+N}=e^Se^N$ works. (2011-10-16)
  • 0
    Thanks for noticing that mistake. Also, this is probably a dumb question, but doesn't every matrix commute with a diagonal matrix (which would be why $S$ and $N$ are guaranteed to commute)? (2011-10-16)
  • 0
    In general a diagonal matrix doesn't commute with all matrices; for that, its diagonal entries must all be equal. (2011-10-16)
  • 0
    Noting that you can multiply block matrices by multiplying the blocks, we see that for any square matrix $B$, $\operatorname{diag}(B,B,B,\ldots)$ will commute with any matrix whose individual blocks commute with $B$ (just multiply each block by $B$). So $S$ commutes with $N$ precisely because $B$ commutes with $I_2$ and $0_{22}$. More generally, to say useful things about whether two block matrices commute, you want to start with the blocks in the first matrix commuting with the blocks in the second matrix. This is not a necessary condition, but the beginning of some sufficient ones; a quick numerical check of the commutation is sketched below. (2011-10-16)
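
To illustrate the commutation point from the last comment concretely, here is a tiny numerical check (a sketch, assuming NumPy): $S = \operatorname{diag}(B,B)$ commutes with $N$, while a diagonal matrix with distinct diagonal entries generally does not commute with $N$.

```
# Sketch (assumes NumPy): S = diag(B, B) commutes with N because the blocks
# of N (namely I_2 and 0_{22}) commute with B; a diagonal matrix with
# distinct entries does not commute with N in general.
import numpy as np

B = np.array([[1.0, -1.0],
              [1.0,  1.0]])
Z = np.zeros((2, 2))
S = np.block([[B, Z], [Z, B]])
N = np.block([[Z, np.eye(2)], [Z, Z]])

print(np.allclose(S @ N, N @ S))   # True

D = np.diag([1.0, 2.0, 3.0, 4.0])  # distinct diagonal entries
print(np.allclose(D @ N, N @ D))   # False
```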

2 Answers

3

A different, but rather specific, strategy would be to use the ring homomorphism $$a+bi\in\mathbb C \mapsto \pmatrix{a&-b \\ b&a}\in\mathbb R^{2\times 2}$$ applied blockwise to your decomposition: $B$ is the image of $1+i$, $I_2$ of $1$, and $0_{22}$ of $0$. Then your problem is equivalent to finding $$e^{t\pmatrix{1+i & 1\\ 0 & 1+i}}=e^{\pmatrix{t+ti & t\\ 0 & t+ti}}=e^{t+ti}e^{\pmatrix{0&t\\0&0}}=(e^{t+ti})\pmatrix{1&t\\0&1},$$ which unfolds to $$e^{tA} = \pmatrix{e^t\cos t & -e^t\sin t & t e^t \cos t & -t e^t \sin t \\ e^t \sin t & e^t \cos t & t e^t \sin t & t e^t \cos t \\ 0 & 0 & e^t\cos t & -e^t\sin t \\ 0&0& e^t\sin t & e^t\cos t }.$$
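
To gain confidence in this closed form, one can compare it with a numerically computed matrix exponential at a sample value of $t$ (a minimal sketch, assuming NumPy and SciPy):

```
# Sketch (assumes NumPy/SciPy): compare the closed-form e^{tA} above
# with scipy.linalg.expm(t * A) at a sample t.
import numpy as np
from scipy.linalg import expm

A = np.array([[1.0, -1.0, 1.0,  0.0],
              [1.0,  1.0, 0.0,  1.0],
              [0.0,  0.0, 1.0, -1.0],
              [0.0,  0.0, 1.0,  1.0]])

t = 0.3
c, s = np.exp(t) * np.cos(t), np.exp(t) * np.sin(t)
closed_form = np.array([[c, -s, t * c, -t * s],
                        [s,  c, t * s,  t * c],
                        [0,  0,     c,     -s],
                        [0,  0,     s,      c]])
print(np.allclose(expm(t * A), closed_form))  # expected: True
```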

3

Consider $M(t) = \exp(t A)$; as you noticed, it has the block upper triangular form $$ M(t) = \left(\begin{array}{cc} \exp(t B) & n(t) \\ 0_{2 \times 2} & \exp(t B) \end{array} \right). $$ Notice that $M^\prime(t) = A \cdot M(t)$, and reading off the top-right block gives the following differential equation for the matrix $n(t)$: $$ n^\prime(t) = \mathbb{I}_{2 \times 2} \cdot \exp(t B) + B \cdot n(t), $$ which translates into $$ \frac{\mathrm{d}}{\mathrm{d} t} \left( \exp(-t B) n(t) \right) = \mathbb{I}_{2 \times 2}. $$ Since $n(0) = 0_{2 \times 2}$, this is to say that $n(t) = t \exp(t B)$.
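
As a sanity check of $n(t) = t \exp(t B)$, one can compare the top-right $2 \times 2$ block of a numerically computed $\exp(t A)$ with $t \exp(t B)$ (a sketch, assuming NumPy and SciPy):

```
# Sketch (assumes NumPy/SciPy): the top-right 2x2 block of exp(tA)
# should equal t * exp(tB), i.e. n(t) = t exp(tB).
import numpy as np
from scipy.linalg import expm

B = np.array([[1.0, -1.0],
              [1.0,  1.0]])
A = np.block([[B, np.eye(2)],
              [np.zeros((2, 2)), B]])

t = 1.3
n_block = expm(t * A)[:2, 2:]                 # top-right block of exp(tA)
print(np.allclose(n_block, t * expm(t * B)))  # expected: True
```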