
I took an introductory class on dynamical systems this semester. In class we've seen a great deal about the entropy of a dynamical system $T : X \to X$ on a metric space $(X, d)$. In the topological case the entropy is defined to be $$h(T) := \lim_{\epsilon \to 0 } \lim_{k \to \infty} \frac{1}{k} \log (\text{sep}(k, \epsilon, T)),$$ where $\text{sep}(k, \epsilon, T)$ denotes the maximal cardinality of a subset $B \subset X$ such that for all distinct $x, y \in B$ there is an $i \in \{0, \dots, k - 1\}$ with $d(T^i(x), T^i(y)) > \epsilon$.
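(To make the definition concrete to myself, I tried the following rough numerical sketch — my own illustration, with all names and parameters chosen by me — estimating $\text{sep}(k, \epsilon, T)$ for the doubling map $T(x) = 2x \bmod 1$ on the circle, whose topological entropy is known to be $\log 2$. The greedy grid search only lower-bounds the maximal cardinality, and the rate $\frac{1}{k}\log \text{sep}(k,\epsilon,T)$ approaches $\log 2$ only slowly as $k$ grows.)

```python
# Rough numerical sketch: estimate sep(k, eps, T) for the doubling map
# T(x) = 2x mod 1 on the circle, whose topological entropy is log 2.
import math

def T(x):
    return (2.0 * x) % 1.0

def circle_dist(x, y):
    d = abs(x - y)
    return min(d, 1.0 - d)

def orbit(x, k):
    # The first k points x, T(x), ..., T^{k-1}(x) of the orbit of x.
    out = []
    for _ in range(k):
        out.append(x)
        x = T(x)
    return out

def separated(ox, oy, eps):
    # Two orbits are (k, eps)-separated if they are more than eps apart
    # at some time i = 0, ..., k-1.
    return any(circle_dist(a, b) > eps for a, b in zip(ox, oy))

def sep_estimate(k, eps, n_grid=2000):
    # Greedily collect a (k, eps)-separated set of grid points; this
    # lower-bounds the maximal cardinality sep(k, eps, T).
    kept = []
    for j in range(n_grid):
        ox = orbit(j / n_grid, k)
        # Check the most recently kept (hence nearest) orbits first.
        if all(separated(ox, oy, eps) for oy in reversed(kept)):
            kept.append(ox)
    return len(kept)

eps = 0.1
for k in (4, 6, 8):
    s = sep_estimate(k, eps)
    print(k, s, math.log(s) / k)  # rates decrease toward log 2 ≈ 0.693
```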

In class it was said that the entropy measures how "chaotic" or "random" a map is. However, we never really saw any theorem or application that made clear why people care about entropy. It seems a rather arbitrary definition to me.

Can anyone explain to me why the topological entropy of a dynamical system is interesting?

Thanks!

  • 1
As a side note, entropy historically played an important role in the development of the theory as a topological invariant. Indeed, for some time dynamical systems people could not prove that the shifts on 2 and 3 symbols are not topologically conjugate (which was perceived as an embarrassingly simple question). The answer came with the notion of entropy: the shift on 2 symbols has entropy $\log 2$, the one on 3 symbols has entropy $\log 3$. Problem solved! (2017-01-16)
  • 4
@Glougloubarbaki There was indeed such a historical role of entropy, but it was played by the *metric entropy*, not the *topological entropy* (indeed by the young Sinai, while a student of Kolmogorov). Note that the shifts on $2$ and $3$ symbols have respectively $2$ and $3$ fixed points, for example, and so are not topologically conjugate. ;) (2017-01-16)
  • 0
@JohnB erm... that's a good point! I guess I should think a bit more before I post stuff I vaguely remember reading about... (2017-01-17)

2 Answers

7

Topological entropy is one of many invariants of topological conjugacy.

Like all such invariants, it has two main applications:

$1)$ to distinguish dynamics that are not topologically conjugate (if $h(T)\ne h(S)$, then $T$ and $S$ are not topologically conjugate);

$2)$ to detect complicated behavior (if $h(T)\ne0$, then $T$ has "complicated behavior").

The second aspect is extremely technical and is one of the difficult topics in finite-dimensional dynamical systems theory. Let me state one rigorous result (among others), which may convey both the interest of the notion and its technical nature:

If $T$ is a $C^1$ diffeomorphism of a compact manifold and $h(T)>0$, then there exists an ergodic $T$-invariant probability measure with at least one positive Lyapunov exponent.
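(To give a feel for "positive Lyapunov exponent", here is a small numerical sketch of my own, not part of the theorem above: for the logistic map $f(x) = 4x(1-x)$, a standard example, both the entropy and the Lyapunov exponent of the natural invariant measure equal $\log 2$, and the exponent can be estimated as the time average of $\log|f'|$ along a typical orbit.)

```python
# Numerical sketch (illustration only): estimate the Lyapunov exponent
# of the logistic map f(x) = 4x(1-x), for which the exponent is log 2.
import math

def f(x):
    return 4.0 * x * (1.0 - x)

def df(x):
    return 4.0 - 8.0 * x

def lyapunov(x0, n=20000, transient=100):
    # Time average of log|f'| along the orbit; for typical x0 this
    # converges to the Lyapunov exponent of the invariant measure.
    x = x0
    for _ in range(transient):
        x = f(x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(df(x)))
        x = f(x)
    return total / n

print(lyapunov(0.1234))  # ≈ log 2 ≈ 0.693
```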

2

This is a bit more informal, but the following interpretation was helpful to me as a student:


Think of $\epsilon$ as the resolution with which you can view the dynamics: that is, if two points are closer than $\epsilon$, then you cannot distinguish them. Then, \begin{align*} \operatorname{sep}(k, \epsilon, T) \end{align*} can be thought of as the number of "distinguishable trajectories of length $k$". Positivity of the entropy $h(T)$ implies that for $\epsilon > 0$ sufficiently small, this quantity grows at a positive exponential rate.

What is the implication? Well, suppose you're modelling the dynamics of $T$ on a computer and want to capture all trajectories of length $n$. If $h(T) > 0$, then the amount of data required to do so grows exponentially in $n$, at a rate $\sim e^{n h(T)}$. This is why one often refers to entropy as capturing complexity of the system.
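(A toy count of my own making, not from the answer above, in the simplest case: for the full shift on two symbols, trajectories of length $n$ are distinguishable at fine enough resolution exactly when their first $n$ symbols differ, so there are $2^n = e^{n \log 2}$ of them. The growth rate matches the entropy $\log 2$, i.e. one new bit of data per time step.)

```python
# Toy count for the full shift on two symbols: distinguishable
# trajectories of length n correspond to binary words of length n,
# so their number is 2^n = e^{n log 2}.
import math
from itertools import product

def num_words(n):
    # Enumerate all binary words of length n.
    return len(list(product("01", repeat=n)))

for n in (1, 4, 8):
    c = num_words(n)
    print(n, c, math.log(c) / n)  # growth rate is exactly log 2
```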