
Somewhere in my notes the following formula appears

$\int_0^1 e^{s R} \frac{d R}{dt} e^{(1-s)R} ds = \frac{d}{dt} e^{R}$

where $R$ depends on $t$, and has values in a Lie algebra [$\mathfrak{so}(3)$ is what I was dealing with at the time]. If I assume this formula is true, I can make great simplifications in my work, so it's very important to me. But I have two problems.
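For what it's worth, the formula is easy to sanity-check numerically. Here's a minimal sketch: the particular $R(t) = t\,A + t^2\,C$ in $\mathfrak{so}(3)$ is an arbitrary choice of mine, picked so that $R$ and $\dot R$ don't commute. It compares the two sides using `scipy.linalg.expm`, a central finite difference in $t$, and the midpoint rule in $s$.

```python
import numpy as np
from scipy.linalg import expm

# Arbitrary test case: R(t) = t*A + t^2*C with A, C in so(3)
# (skew-symmetric), chosen so that R(t) and dR/dt do not commute.
A = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 0.]])
C = np.array([[0., 0., 2.], [0., 0., -1.], [-2., 1., 0.]])

def R(t):
    return t * A + t**2 * C

def dR(t):
    return A + 2 * t * C

t0, h = 0.7, 1e-6

# Right-hand side: d/dt exp(R(t)) by a central finite difference.
rhs = (expm(R(t0 + h)) - expm(R(t0 - h))) / (2 * h)

# Left-hand side: integrate e^{sR} (dR/dt) e^{(1-s)R} over s in [0,1]
# with the midpoint rule on a fine grid.
n = 1000
lhs = sum(expm(s * R(t0)) @ dR(t0) @ expm((1 - s) * R(t0))
          for s in (np.arange(n) + 0.5) / n) / n

print(np.max(np.abs(lhs - rhs)))  # small (quadrature + difference error)
```

On this choice of $R$ the two sides agree to within the quadrature and finite-difference error, which is at least reassuring.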

First, I can't remember where I got this. I've looked carefully through my references, but can't find it anywhere. Any good references for this formula?

Second, (though a good reference should cure this problem) I can't figure out why it's true. When I use the "Hadamard" Lemma, and expand both sides, I get vaguely similar-looking formulas, but can't quite get to equality. Any ideas?

[On a related note, it seems like this use of integrals from 0 to 1 is pretty common in analysis of Lie groups/algebras. Any good (but fairly basic) references that discuss this strategy per se, and its usefulness?]

1 Answer


I've figured it out. A similar expression appears as Lemma 5.4 in Miller's "Symmetry Groups and Their Applications". The proof is easy: multiply both sides of the formula on the right by $e^{-R}$ to move one exponential factor over, then apply the 'make things a function of $s$' trick to the resulting right-hand side $\left( \frac{d}{dt} e^{R} \right) e^{-R}$.

Define
\begin{equation} B(s,t) = \left( \frac{d}{dt} e^{s R(t)} \right) e^{-s R(t)}. \end{equation}
Now we evaluate the derivative with respect to $s$, using the fact that $R(t)$ commutes with $e^{\pm s R(t)}$:
\begin{aligned} \frac{\partial} {\partial s} B(s,t) &= \left[ \frac{d}{dt} \left( R(t) e^{s R(t)} \right) \right] e^{-s R(t)} - \left( \frac{d}{dt} e^{s R(t)} \right) e^{-s R(t)} R(t) \\ &= \left[ \left( \frac{d}{dt} R(t) \right) e^{s R(t)} + R(t) \frac{d}{dt} \left( e^{s R(t)} \right) \right] e^{-s R(t)} - B(s,t) R(t) \\ &= \dot{R}(t) + R(t) B(s,t) - B(s,t) R(t) \\ &= \dot{R}(t) + \mathrm{ad}_{R(t)} B(s,t). \end{aligned}
Here, $\dot{R}(t)$ is shorthand for the derivative with respect to $t$, and $\mathrm{ad}_X Y$ is defined as $X\,Y - Y\,X$. Since $\dot{R}(t)$ does not depend on $s$, differentiating repeatedly gives, by induction,
\begin{equation} \frac{\partial^j} {\partial s^j} B(s,t) = \mathrm{ad}_{R(t)}^{j-1} \dot{R}(t) + \mathrm{ad}_{R(t)}^j B(s,t). \end{equation}
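This ODE for $B$ can itself be sanity-checked numerically. In the sketch below, the concrete $R(t) = t\,A + t^2\,C$ and the matrices are arbitrary choices of mine, and both derivatives are taken by central differences.

```python
import numpy as np
from scipy.linalg import expm

# Arbitrary test case: R(t) = t*A + t^2*C, with A and C skew-symmetric.
A = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 0.]])
C = np.array([[0., 0., 2.], [0., 0., -1.], [-2., 1., 0.]])

def R(t):
    return t * A + t**2 * C

def B(s, t, h=1e-6):
    # B(s,t) = (d/dt e^{s R(t)}) e^{-s R(t)}, with d/dt a central difference.
    ddt = (expm(s * R(t + h)) - expm(s * R(t - h))) / (2 * h)
    return ddt @ expm(-s * R(t))

s0, t0, hs = 0.4, 0.7, 1e-5

# Left side: dB/ds by a central difference in s.
dBds = (B(s0 + hs, t0) - B(s0 - hs, t0)) / (2 * hs)

# Right side: R'(t) + ad_R B = R' + R B - B R.
dR = A + 2 * t0 * C
Rt, B0 = R(t0), B(s0, t0)
rhs = dR + Rt @ B0 - B0 @ Rt

print(np.max(np.abs(dBds - rhs)))  # small (finite-difference error)
```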

Since $B(s,t)$ is an entire function of $s$, we can expand it as a power series in $s$, whose coefficients are the derivatives just calculated, evaluated at $s=0$. Note that $B(0,t) = 0$, so the $\mathrm{ad}_{R(t)}^j B(0,t)$ terms drop out and the series starts at $j=1$. The quantity we seek is then just $B(1,t)$:
\begin{aligned} \left( \frac{d}{dt} e^{R(t)} \right) e^{-R(t)} &= B(1,t) \\ &= \left. \sum_{j=1}^\infty \frac{s^j} {j!} \mathrm{ad}_{R(t)}^{j-1} \dot{R}(t) \right|_{s=1} \\ &= \int_0^1 \sum_{j=0}^\infty \frac{s^j} {j!} \mathrm{ad}_{R(t)}^{j} \dot{R}(t)\, ds \\ &= \int_0^1 e^{s R(t)}\, \frac{d R(t)}{dt}\, e^{-s R(t)}\, ds. \end{aligned}
The third equality holds because $\int_0^1 \frac{s^j}{j!}\, ds = \frac{1}{(j+1)!}$, and the last comes from the "Hadamard" lemma I mentioned above. Multiplying each side of this equation on the right by $e^{R(t)}$ results in the formula as posed in my question.
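The "Hadamard" lemma itself, $e^X Y e^{-X} = \sum_{j \ge 0} \mathrm{ad}_X^j Y / j!$, is just as easy to check numerically; the $3\times 3$ test matrices below are arbitrary picks of mine.

```python
import numpy as np
from scipy.linalg import expm

# Arbitrary test matrices (X skew-symmetric, Y generic).
X = np.array([[0., -0.3, 0.5], [0.3, 0., -0.2], [-0.5, 0.2, 0.]])
Y = np.array([[1., 2., 0.], [0., -1., 3.], [4., 0., 1.]])

# Left side of the Hadamard lemma: e^X Y e^{-X}.
conjugated = expm(X) @ Y @ expm(-X)

# Right side: truncated series sum_j ad_X^j Y / j!.
term = Y.copy()       # holds ad_X^j Y, unnormalized
series = Y.copy()     # j = 0 term
fact = 1.0
for j in range(1, 20):
    term = X @ term - term @ X   # apply ad_X once more
    fact *= j                    # fact = j!
    series = series + term / fact

print(np.max(np.abs(conjugated - series)))  # near machine precision
```

Since $\|X\|$ is small here, the series converges very fast and twenty terms are more than enough.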