
I recently studied these two concepts: topological entropy $h(T)$ for a continuous map $T:(X,d)\to (X,d)$ and measure-theoretic entropy $h_{\mu}(T)$ for a measure-preserving transformation $T$. The former captures the complexity of a topological dynamical system by considering the exponential growth of the expansion of $T$. The latter, measure-theoretic entropy, is a quantitative description of the uncertainty of a system.

Could anyone explain the relation between these two definitions? I know there is a variational principle stating that one is an upper bound for the other. But what I want is an explanation that also relates the quantities they characterize.

1 Answer


The variational principle says that:

The supremum of the metric entropies over all invariant probability measures is equal to the topological entropy.
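
In symbols, writing $\mathcal{M}(X,T)$ for the set of $T$-invariant Borel probability measures on $X$ (a notational choice made here for the sketch):

$$h(T) \;=\; \sup_{\mu \in \mathcal{M}(X,T)} h_{\mu}(T).$$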

First, a minor correction: one should not say that topological entropy has to do with the "expansion of $T$". The notion is purely topological: it is an invariant of topological conjugacy, and in fact it depends only on the topology of the space, not on the metric. Instead, we should see topological entropy as a measure of how points get separated under iteration of the dynamics.
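
For concreteness, when a metric is available, one standard way to quantify this separation is the Bowen–Dinaburg formulation (stated here for $X$ compact, as a sketch): call $E \subset X$ an $(n,\varepsilon)$-separated set if for any two distinct $x, y \in E$ there is some $0 \le i < n$ with $d(T^i x, T^i y) \ge \varepsilon$, and let $s(n,\varepsilon)$ denote the maximal cardinality of such a set. Then

$$h(T) \;=\; \lim_{\varepsilon \to 0}\, \limsup_{n \to \infty}\, \frac{1}{n} \log s(n,\varepsilon),$$

and one can check that the value does not depend on which metric inducing the topology is used.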

From a similar point of view, metric entropy measures how typical points for a given measure get separated under iteration of the dynamics. Indeed, there is an equivalent description of metric entropy in terms of how many dynamical balls, centered at finitely many points, are needed to cover a set of sufficiently large measure.
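
As a sketch of that description (Katok's entropy formula; the precise hypotheses, $X$ compact and $\mu$ ergodic, are in the reference cited in the comments below): for $\delta \in (0,1)$, let $N_{\mu}(n,\varepsilon,\delta)$ be the minimal number of dynamical balls $B_n(x,\varepsilon) = \{ y \in X : d(T^i x, T^i y) < \varepsilon \text{ for } 0 \le i < n \}$ needed to cover a set of measure at least $1-\delta$. Then, for every such $\delta$,

$$h_{\mu}(T) \;=\; \lim_{\varepsilon \to 0}\, \limsup_{n \to \infty}\, \frac{1}{n} \log N_{\mu}(n,\varepsilon,\delta).$$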

Summing up:

The variational principle tells us that the "speed" at which points get separated under iteration of the dynamics is obtained by maximizing over all invariant measures the "speed" at which typical points for an invariant measure get separated under iteration of the dynamics.

I write "speed" because I already noted that the we do not need a distance and so it needs to be described, in general, in terms of covers.

If you prefer, we can instead say that the maximum of information is obtained at the topological level, and that it is the supremum of the information obtained at the measure-theoretic level over all invariant probability measures. Also, if you prefer, we can replace "information" by "uncertainty"; whether one term is better than the other is really a matter of philosophy (mathematicians may prefer the first, while physicists may be more inclined toward the second).

  • Thanks! What do you mean by "typical points"? And could you kindly tell me where I can find the statement of the equivalent description of the metric entropy discussed in your answer? (2017-01-06)
  • Since I gave an informal description, one could say that "typical" means a Lebesgue density point for the measure (or a point of some set of large measure given by Lusin's theorem). The original reference for the equivalent description is: A. Katok, *Lyapunov exponents, entropy and periodic orbits for diffeomorphisms*, Inst. Hautes Études Sci. Publ. Math. **51** (1980), 137–173. (2017-01-06)
  • Link: http://archive.numdam.org/ARCHIVE/PMIHES/PMIHES_1980__51_/PMIHES_1980__51__137_0/PMIHES_1980__51__137_0.pdf (2017-01-06)