
Suppose you multiply the values of $f$ between $a$ and $b$ at intervals of $\Delta x = (b-a)/n$ and then raise the product to the power $\Delta x$, and take the limit as $n\to\infty$. What you get would bear the same relation to products that integrals bear to sums, and there's a trivial reduction to ordinary integrals, in that what you get is $\displaystyle\exp\int_a^b\log f$, provided the addition and multiplication are reasonably like the ordinary ones on real numbers.
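The commutative reduction above is easy to sanity-check numerically. Here is a sketch (the choice $f(x) = 1+x$ on $[0,1]$ and the step count are just illustrative): the discrete product raised to the power $\Delta x$ should approach $\exp\int_0^1 \log(1+x)\,dx = 4/e$.

```python
import math

def product_integral(f, a, b, n):
    """Approximate the product integral of f over [a, b]:
    the limit of (prod_i f(x_i)) ** dx as n -> infinity,
    computed here at left endpoints x_i = a + i*dx."""
    dx = (b - a) / n
    # (prod f(x_i)) ** dx  ==  exp(dx * sum of log f(x_i))
    log_sum = sum(math.log(f(a + i * dx)) for i in range(n))
    return math.exp(dx * log_sum)

f = lambda x: 1 + x
approx = product_integral(f, 0.0, 1.0, 200_000)
exact = math.exp(2 * math.log(2) - 1)  # exp(int_0^1 log(1+x) dx) = 4/e
print(approx, exact)  # the two values agree to several decimal places
```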

But what if the things getting multiplied are matrices, so the multiplication is non-commutative? I think I've seen it asserted somewhere ("somewhere" is a horrible word sometimes, isn't it?) that that is when such "product integrals" are not trivially reducible to more familiar things.

Last time I checked, Wikipedia's article titled product integral didn't go into matrix products, but only products of real numbers. If what I think I heard somewhere is right, then that makes it not all that good an article.

So:

  • Is what I think I heard correct?
  • If so, where's the literature on this?
  • And what are such matrix-product integrals used for? Or, what is done with them?
  • [Somewhere](http://math.stackexchange.com/questions/177802/about-differentiation-under-the-product-integral-sign/177813#177813), you say... :) | For the first bullet, we may be able to check that the obvious formula doesn't work in noncommutative settings by doing some calculations with matrix Lie groups/algebras (and paths that are not too straightforward). On a related note, the BCH formula gauges to what extent noncommutativity matters for matrix exponentials. (2012-08-15)
  • @anon: Obviously that is _not_ where I heard it. If you read the post you linked to, you'll see that it deals only with the commutative case. Besides, I heard this innumerable aeons ago, not just this month. (2012-08-15)
  • My memory is fuzzy; apparently I only hinted that the formula doesn't always work in noncommutative settings, instead of stating it outright. Oh well. | At any rate, I'm too lazy to do the computations I refer to, so I'll let someone else do them. :) (2012-08-15)
  • I can only guess. Isn't it used in continuous-time Markov processes with time-dependent transition matrices? (Not that I know anything about it.) (2012-08-15)
  • At the end of your product-integral link there is a [link](http://en.wikipedia.org/wiki/Ordered_exponential) to 'ordered exponential', which in turn links to ['path ordering'](http://en.wikipedia.org/wiki/Path-ordering). The 'time-ordered' product of operators was proposed by Richard Feynman in the early fifties for QFT; he had earlier proposed the very fruitful concept of [path integrals](http://en.wikipedia.org/wiki/Path_integral_formulation), but this is a little different... (2012-08-15)
  • I have answered a similar question here: http://math.stackexchange.com/questions/468547/matrix-product-integrals/537571#537571 (2013-10-23)

1 Answer


I think you're talking about "time-ordered exponentials". These are used a lot in physics.

EDIT: Mathematically, the finite-dimensional case is just the fundamental matrix of a first-order linear ODE system: $\Psi(t)$ is the solution of $\Psi'(t) = A(t) \Psi(t)$ with $\Psi(0) = I$, where $A$ is a continuous function of time $t$ with values in the $n \times n$ matrices. One way to obtain it is as $$ \Psi(t) = \lim_{n \to \infty} e^{A((n-1)t/n)\,t/n}\, e^{A((n-2)t/n)\,t/n} \cdots e^{A(t/n)\,t/n}\, e^{A(0)\,t/n} $$ where later times appear on the left (note the step size is $t/n$). Of course physicists are prone to use it in infinite-dimensional settings with unbounded operators $A(t)$, where things can be trickier.
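This limit, and the failure of the naive commutative formula, can be checked numerically. In the sketch below, the family $A(t)$ and the step counts are illustrative choices, and `expm` is a naive truncated Taylor series (adequate only because every argument here has small norm). Since $[A(s), A(t)] \ne 0$ for $s \ne t$, the ordered product converges to something different from $\exp\int_0^1 A(s)\,ds$:

```python
import numpy as np

def expm(M, terms=25):
    """Matrix exponential via truncated Taylor series.
    Fine for the small-norm 2x2 matrices used below."""
    out = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k
        out = out + term
    return out

def A(s):
    # A family that does NOT commute at different times:
    # [A(s), A(t)] = diag(s - t, t - s) != 0 when s != t.
    return np.array([[0.0, s], [1.0, 0.0]])

def ordered_exponential(t, n):
    """Left-endpoint product  exp(A(t_{n-1}) dt) ... exp(A(t_0) dt),
    with later times multiplying on the LEFT."""
    dt = t / n
    P = np.eye(2)
    for k in range(n):
        P = expm(A(k * dt) * dt) @ P
    return P

t = 1.0
P = ordered_exponential(t, 2000)
ref = ordered_exponential(t, 20000)  # finer mesh as a reference value
# The "commutative" answer: exp of int_0^1 A(s) ds = [[0, 1/2], [1, 0]]
naive = expm(np.array([[0.0, t**2 / 2], [1.0, 0.0]]))

print(np.max(np.abs(P - ref)))    # small: the ordered product converges
print(np.max(np.abs(P - naive)))  # not small: exp(int A) is the wrong limit
```

The gap between `P` and `naive` is exactly what the Magnus/BCH corrections measure; it would vanish if all the $A(t)$ commuted.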

You might look at H. Araki, "Expansional in Banach algebras", Annales scientifiques de l'École Normale Supérieure (4) 6 (1973), no. 1, pp. 67–84. http://archive.numdam.org/ARCHIVE/ASENS/ASENS_1973_4_6_1/ASENS_1973_4_6_1_67_0/ASENS_1973_4_6_1_67_0.pdf

  • +1, but I wonder if this can be made more self-contained, either by saying what those are or by saying in which book (if any) I'll find them explained. (2012-08-15)