$p_i$ is the probability of occurrence of the unique symbol $i$. In the case of the tent map, the Lyapunov exponent (LE) is the log of the map's slope, whose absolute value is $2$ almost everywhere, so $\lambda = \log 2$.

For the probabilistic Bernoulli map, I tried to find the derivative like this: $f'(x) = 1/p_1$ for $0 \le x < p_1$ (and $f'(x) = 1/p_2$ on the remaining interval).

In K. Feltekh, D. Fournier-Prunaret, and S. Belghith, "Analytical expressions for power spectral density issued from one-dimensional continuous piecewise linear maps with three slopes," Signal Process. 94, 149–157 (2014),

the analytical expression for the LE of the piecewise linear map with three slopes studied there is $\lambda = (1-p)\ln(2/(1-p)) + p\ln(1/p)$, where $p \in (0,1)$ is a constant.
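As a quick algebraic check (my own rearrangement, not taken from the paper), the quoted expression separates into a $\ln 2$ term plus the binary Shannon entropy:
$$
\lambda = (1-p)\ln\frac{2}{1-p} + p\ln\frac{1}{p}
= (1-p)\ln 2 - (1-p)\ln(1-p) - p\ln p
= (1-p)\ln 2 + H(p),
$$
where $H(p) = -p\ln p - (1-p)\ln(1-p)$ is the binary entropy in nats.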

How can I apply the above result to calculate the LE for the map in Eq. (6)? Since the tent map is conjugate to the Bernoulli map, would the probabilistic Bernoulli map have the same LE as the tent map, namely $\log 2$?

  • You need to correct the derivative in the second interval, right? (2017-02-07)

1 Answer

The Lyapunov exponent is $$ \sum_{i=1}^m p_i \log (f'|_{I_i})=\sum_{i=1}^m p_i \log (1/p_i)=\text{entropy}. $$

More precisely, this is the averaged Lyapunov exponent. Let me explain. You want to compute
$$
\int_0^1 \lambda(x)\,d\mu(x),
$$
where $\mu$ is the Bernoulli measure and
$$
\lambda(x)=\lim_{n\to\infty}\frac1n\log |(f^n)'(x)|
$$
when the limit exists. But since $\mu$ is invariant, you get
$$
\begin{split}
\int_0^1 \lambda(x)\,d\mu(x)
&=\lim_{n\to\infty}\frac1n\int_0^1\log |(f^n)'(x)|\,d\mu(x)\\
&=\lim_{n\to\infty}\frac1n\sum_{j=0}^{n-1}\int_0^1\log |f'(f^j(x))|\,d\mu(x)\\
&=\int_0^1\log |f'(x)|\,d\mu(x)\\
&=\sum_{i=1}^m p_i \log (f'|_{I_i}).
\end{split}
$$
Indeed, since $\mu$ is $f$-invariant we have
$$
\int_0^1\log |f'(f^j(x))|\,d\mu(x)
=\int_0^1\log |f'(x)|\,d\mu(x)
$$
for any integer $j\ge0$.
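To make the identity concrete, here is a minimal numerical sketch: assuming a map with full linear branches of slope $1/p_i$ on intervals $I_i$ of length $p_i$ (the specific values of $p_i$ below are made up for illustration), the orbit average of $\log|f'|$ should match $\sum_i p_i\log(1/p_i)$.

```python
import math
import random

# Sketch: piecewise linear map with full branches of slope 1/p_i on
# intervals I_i of length p_i (Lebesgue measure is invariant for this map).
p = [0.3, 0.7]          # hypothetical branch probabilities (assumption)

def f(x):
    # branch over I_1 = [0, 0.3): slope 1/0.3; over I_2 = [0.3, 1): slope 1/0.7
    if x < p[0]:
        return x / p[0]
    return (x - p[0]) / p[1]

random.seed(0)
x = random.random()
n = 200_000
acc = 0.0
for _ in range(n):
    slope = 1 / p[0] if x < p[0] else 1 / p[1]
    acc += math.log(slope)   # accumulate log |f'(x_k)| along the orbit
    x = f(x)

lyap_est = acc / n                                  # averaged Lyapunov exponent
entropy = sum(pi * math.log(1 / pi) for pi in p)    # Shannon entropy, in nats
print(lyap_est, entropy)
```

With these (made-up) branch probabilities the two numbers agree to within the Monte-Carlo error of the orbit average, illustrating the displayed identity.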

  • Thanks for your answer. So, if I have understood correctly, the Lyapunov exponent can be computed analytically by just calculating the Shannon entropy? Is my understanding correct? Could you provide a reference where the relationship LE = entropy can be cited? (2017-02-07)
  • The answer is *yes* for the **averaged** Lyapunov exponent, but *no* for the Lyapunov exponent $\lambda(x)$. I regret if I disappoint you, but unfortunately the paper you refer to is a collection of trivial computations, which can be done because it is such a specific and simple problem (I did look at the paper out of curiosity...). No reference for this can be given, because it is false in general that LE = Kolmogorov–Sinai entropy, in almost every system! (2017-02-07)
  • Instead, if I may suggest, writing down my computations above would be more appropriate. (2017-02-07)
  • Pesin's identity says that the Kolmogorov entropy is bounded by the sum of Lyapunov exponents, but I don't know if Kolmogorov entropy = Shannon entropy; i.e., if I compute the Shannon source entropy, can I then say LE = Shannon entropy = Kolmogorov entropy? Any references and your insights on this? (2017-02-07)
  • Pesin's identity is much more delicate than what you say; in particular, it requires an invariant measure equivalent to volume and a dynamics at least $C^{1+\alpha}$ (sorry, but it doesn't say what you claim either; again, you need to average). Shannon's entropy is like the pre-history of the KS entropy (information theory experienced many developments, but not its entropy; no novelty there). That's why we only refer to the Shannon entropy in *very* basic cases, like in that paper. (2017-02-07)
  • My suggestion is that you ask separate questions when the discussion diverges from the original question; I will be happy to try to reply if I can. Again, at the level that you ask, you can write what I wrote, and there is no harm in referring to the entropy as the Shannon entropy; indeed, they coincide in this case. (2017-02-07)
  • @SKM I regret the delay, but it was a complicated day. I looked at your question, but it is really a bit beyond what I am used to. I will comment there. (2017-02-09)