
Let $(x_n)_{n\geq0}$ be the sequence with $x_0$ a real number, defined by the recursion $ x_{n+1} = x_n + e^{-x_n}. $ Compute the limit:

$\lim_{n\to\infty} \frac{x_n}{\ln {n}}$

Luckily, I've found another post here with a very similar question. If you know other ways to solve it and want to share them, I'd be very grateful.

2 Answers


Heuristically, this difference equation resembles the differential equation $x'(t) = e^{-x}$, which has solutions $x(t) = \ln(t + C)$, so I would expect the answer to be $1$.

In fact, let $x_n = \ln(n + s_n)$, and for $n \le t < n+1$ let $y(t) = \ln(t+s_n)$. Then $y(t) \ge y(n) = x_n$, with strict inequality for $t > n$, so
$$\begin{aligned} \ln(n+1+s_n) - \ln(n+s_n) &= \int_n^{n+1} \frac{dt}{t+s_n} = \int_n^{n+1} e^{-y(t)}\, dt \\ &< \int_n^{n+1} e^{-x_n}\, dt = e^{-x_n} = x_{n+1} - x_n, \end{aligned}$$
i.e. $x_{n+1} - \ln(n+1+s_n) > x_n - \ln(n+s_n) = 0$, or $s_{n+1} > s_n$. Thus $x_n > \ln(n+s_0)$. But then $x_{n+1} - x_n = e^{-x_n} < \dfrac{1}{n + s_0}$, so $\displaystyle x_n < x_0 + \sum_{j=0}^{n-1} \frac{1}{j+s_0}$. Since both the upper and lower bounds are $\ln(n) + O(1)$, we conclude that $x_n = \ln(n) + O(1)$.
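As a quick numerical sanity check (a Python sketch, not part of the proof; the starting value $x_0 = 0$ is an arbitrary choice), one can iterate the recurrence and watch $x_n - \ln n$ stay bounded:

```python
import math

# Iterate x_{n+1} = x_n + exp(-x_n) and record x_n - ln(n) at a few checkpoints.
x = 0.0  # x_0; any real starting value gives the same asymptotics
diffs = []
for n in range(1, 100001):
    x = x + math.exp(-x)          # now x holds x_n
    if n in (10, 100, 1000, 10000, 100000):
        diffs.append(x - math.log(n))
print(diffs)  # small and shrinking: consistent with x_n = ln(n) + O(1)
```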

  • it seems that once you solve that differential equation, everything is clear. Thanks! (2012-08-10)

We examine the asymptotic behavior of $(x_n)$ step by step.

Step 1. We first show that $x_n \to \infty$. Note that for any $x < 0$ there exists $y > 0$ such that $x + e^{-x} = y + e^{-y}$ (the map $t \mapsto t + e^{-t}$ sends both $(-\infty, 0)$ and $(0, \infty)$ onto $(1, \infty)$). Thus we may assume $x_0 \geq 0$. Then $(x_n)$ is monotonically increasing, hence it tends to a limit $\ell \in [0, \infty]$. Taking limits in the recursion, we get $\ell = \ell + e^{-\ell}$, which is possible only if $\ell = \infty$.

Step 2. Let $y_n = e^{-x_n}$. Then $(y_n)$ decreases monotonically to $0$. Also, we have

$ y_{n+1} = \exp\left(-x_{n+1}\right) = \exp\left(-x_{n}-e^{-x_n}\right) = y_n e^{-y_n}.$

Then we have

$ y_{n+1}^{-1} - y_n^{-1} = \frac{e^{y_n} - 1}{y_n} \xrightarrow[]{n\to\infty} 1. $

Thus by the Stolz–Cesàro theorem, we have

$ \lim_{n\to\infty} \frac{y_n^{-1}}{n} = 1.$

This already shows that $\lim_{n\to\infty} \frac{x_n}{\log n} = 1$.
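This can be observed numerically as well (a sketch; the start $x_0 = 0.5$, hence $y_0 = e^{-0.5}$, is an arbitrary choice): iterating the recursion for $y_n$, the products $n\,y_n$ approach $1$, consistent with $\lim_{n\to\infty} \frac{y_n^{-1}}{n} = 1$.

```python
import math

# Check Step 2 numerically: with y_n = exp(-x_n), iterate y_{n+1} = y_n * exp(-y_n)
# and watch n * y_n approach 1.
y = math.exp(-0.5)  # y_0 = exp(-x_0) for the arbitrary start x_0 = 0.5
ratios = []
for n in range(1, 200001):
    y = y * math.exp(-y)          # now y holds y_n
    if n in (100, 10000, 200000):
        ratios.append(n * y)
print(ratios)  # tends to 1
```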

Step 3. Since

$ \frac{y_{n+1}^{-1} - y_n^{-1} - 1}{y_n} = \frac{e^{y_n} - 1 - y_n}{y_n^2} \xrightarrow[]{n\to\infty} \frac{1}{2}, $

we have

$ \frac{\left(y_{n+1}^{-1} - (n+1)\right) - \left(y_n^{-1} - n\right)}{\log(n+1) - \log n} \sim \frac{y_{n+1}^{-1} - y_n^{-1} - 1}{y_n} \xrightarrow[]{n\to\infty} \frac{1}{2}, $

where we used the fact that $y_n \sim \frac{1}{n} \sim \log(n+1) - \log n$. Thus by the Stolz–Cesàro theorem again, we have

$ \lim_{n\to\infty} \frac{y_n^{-1} - n}{\log n} = \frac{1}{2},$

which in particular implies that

$ y_n = \frac{1}{n} + O\left(\frac{\log n}{n^2}\right). $
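The key limit driving Step 3, $\frac{y_{n+1}^{-1} - y_n^{-1} - 1}{y_n} \to \frac{1}{2}$, can also be checked numerically (a sketch; $x_0 = 0$, i.e. $y_0 = 1$, is an arbitrary choice):

```python
import math

# Check the Step 3 limit: (1/y_{n+1} - 1/y_n - 1) / y_n -> 1/2.
y = 1.0  # y_0 = exp(-x_0) with the arbitrary start x_0 = 0
vals = []
for n in range(100001):
    y_next = y * math.exp(-y)     # y_{n+1} = y_n * exp(-y_n)
    if n in (10, 1000, 100000):
        vals.append((1.0 / y_next - 1.0 / y - 1.0) / y)
    y = y_next
print(vals)  # tends to 1/2
```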

Step 4. Finally, we have

$ \frac{y_{n+1}^{-1} - y_n^{-1} - 1 - \frac{1}{2}y_n}{y_n^2} = \frac{e^{y_n} - 1 - y_n - \frac{1}{2}y_n^2}{y_n^3} \xrightarrow[]{n\to\infty} \frac{1}{6}.$

Let $s_n = y_1 + \cdots + y_n$. Since $y_n^2 \sim \frac{1}{n^2}$,

$ \frac{\left(y_{n+1}^{-1}-(n+1)-\frac{1}{2}s_n\right) - \left(y_n^{-1}-n-\frac{1}{2}s_{n-1}\right)}{\frac{1}{n^2}} \xrightarrow[]{n\to\infty} \frac{1}{6},$

or simply

$ \left(y_{n+1}^{-1}-(n+1)-\frac{1}{2}s_n\right) - \left(y_n^{-1}-n-\frac{1}{2}s_{n-1}\right) = O\left(\frac{1}{n^2}\right).$

Thus summing,

$\begin{align*}y_n^{-1} &= n + \frac{1}{2}s_{n-1} + O(1) = n + \frac{1}{2}s_{n} + O(1) \\ &= n + \frac{1}{2}\sum_{k=1}^{n} \left( \frac{1}{k} + O\left(\frac{\log k}{k^2}\right)\right)+O(1) \\ &= n + \frac{1}{2}\log n + O(1), \end{align*}$

which finally implies the asymptotic formula

$ y_n = \frac{1}{n} - \frac{1}{2}\frac{\log n}{n^2} + O\left(\frac{1}{n^2}\right), $

or in terms of $x_n$,

$ x_n = \log n + \frac{1}{2}\frac{\log n}{n} + O\left(\frac{1}{n}\right). $
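This expansion admits a quick numerical sanity check (a sketch; $x_0 = 0$ is an arbitrary choice, and the limiting constant below depends on $x_0$): scaling the remainder $x_n - \log n - \frac{\log n}{2n}$ by $n$ should give a bounded quantity, consistent with the $O\left(\frac{1}{n}\right)$ term.

```python
import math

# Check x_n = log n + (log n)/(2n) + O(1/n): the scaled remainder
# n * (x_n - log n - (log n)/(2n)) should stay bounded as n grows.
x = 0.0  # x_0 (the limiting constant of the remainder depends on this choice)
rem = []
for n in range(1, 1000001):
    x += math.exp(-x)             # now x holds x_n
    if n in (100, 10000, 1000000):
        rem.append(n * (x - math.log(n) - math.log(n) / (2 * n)))
print(rem)  # roughly constant, i.e. the remainder really is O(1/n)
```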

  • Having given this last result, it seems that another good question is to compute $\lim_{n\to\infty} ({x_{n}- \log n}) \frac{n}{\log n}$ :-) (2012-08-10)