We examine the asymptotic behavior of $(x_n)$ step by step.
Step 1. We first show that $x_n \to \infty$. Since $e^{t} > t$ for every real $t$, we have $e^{-x} > -x$ and hence $x + e^{-x} > 0$ for all $x$; in particular $x_1 = x_0 + e^{-x_0} > 0$, so after discarding the first term we may assume $x_0 \geq 0$. Since $e^{-x_n} > 0$, the sequence $(x_n)$ is monotone increasing and thus tends to a limit $\ell \in [0, \infty]$. Letting $n \to \infty$ in the recursion $x_{n+1} = x_n + e^{-x_n}$ gives $\ell = \ell + e^{-\ell}$, which is possible only if $\ell = \infty$.
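As a quick numerical illustration of this one-step positivity (a sketch, assuming the recursion $x_{n+1} = x_n + e^{-x_n}$; the sample starting values are arbitrary):

```python
import math

# One step of x -> x + exp(-x) lands in (0, infinity) for every real x,
# since exp(-x) > -x (a consequence of exp(t) > t for all real t).
for x0 in (-10.0, -2.0, -0.5, 0.0, 3.0):
    x1 = x0 + math.exp(-x0)
    print(f"{x0:6.1f} -> {x1:.4f}")
```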
Step 2. Let $y_n = e^{-x_n}$. Since $(x_n)$ increases to $\infty$, the sequence $(y_n)$ decreases monotonically to $0$. Moreover, we have
$ y_{n+1} = \exp\left(-x_{n+1}\right) = \exp\left(-x_{n}-e^{-x_n}\right) = y_n e^{-y_n}.$
Then we have
$ y_{n+1}^{-1} - y_n^{-1} = \frac{e^{y_n} - 1}{y_n} \xrightarrow[]{n\to\infty} 1. $
Thus, by the Cesàro–Stolz theorem, we have
$ \lim_{n\to\infty} \frac{y_n^{-1}}{n} = 1.$
Since $x_n = \log\left(y_n^{-1}\right)$, this already shows that $\lim_{n\to\infty} \frac{x_n}{\log n} = 1$.
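The first-order asymptotics can be checked numerically; a minimal sketch, assuming the recursion $x_{n+1} = x_n + e^{-x_n}$ (the iteration count and starting value below are arbitrary choices):

```python
import math

def x_iter(n, x0=0.0):
    """Return x_n for the recursion x_{k+1} = x_k + exp(-x_k)."""
    x = x0
    for _ in range(n):
        x += math.exp(-x)
    return x

n = 10**5
xn = x_iter(n)
print(xn / math.log(n))   # should be close to 1
print(math.exp(xn) / n)   # y_n^{-1} / n, also close to 1
```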
Step 3. Since
$ \frac{y_{n+1}^{-1} - y_n^{-1} - 1}{y_n} = \frac{e^{y_n} - 1 - y_n}{y_n^2} \xrightarrow[]{n\to\infty} \frac{1}{2}, $
we have
$ \frac{\left(y_{n+1}^{-1} - (n+1)\right) - \left(y_n^{-1} - n\right)}{\log(n+1) - \log n} \sim \frac{y_{n+1}^{-1} - y_n^{-1} - 1}{y_n} \xrightarrow[]{n\to\infty} \frac{1}{2}, $
where we used that $y_n \sim \frac{1}{n} \sim \log(n+1) - \log n$. Thus, by the Cesàro–Stolz theorem again, we have
$ \lim_{n\to\infty} \frac{y_n^{-1} - n}{\log n} = \frac{1}{2},$
which in particular implies that
$ y_n = \frac{1}{n} + O\left(\frac{\log n}{n^2}\right). $
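The $\frac{1}{2}\log n$ correction can also be observed numerically. A sketch, again assuming the recursion $x_{k+1} = x_k + e^{-x_k}$: comparing $n$ with $2n$ cancels the unknown $O(1)$ constant in $y_n^{-1} - n = \frac{1}{2}\log n + O(1)$ (the scale $n = 10^5$ is an arbitrary choice):

```python
import math

def y_inv(n, x0=0.0):
    """Return y_n^{-1} = exp(x_n) for the recursion x_{k+1} = x_k + exp(-x_k)."""
    x = x0
    for _ in range(n):
        x += math.exp(-x)
    return math.exp(x)

# y_n^{-1} - n = (1/2) log n + O(1), so the O(1) constant drops out of the
# difference below, which should tend to (1/2) log 2.
n = 10**5
d = (y_inv(2 * n) - 2 * n) - (y_inv(n) - n)
print(d / math.log(2))  # close to 1/2
```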
Step 4. Finally, we have
$ \frac{y_{n+1}^{-1} - y_n^{-1} - 1 - \frac{1}{2}y_n}{y_n^2} = \frac{e^{y_n} - 1 - y_n - \frac{1}{2}y_n^2}{y_n^3} \xrightarrow[]{n\to\infty} \frac{1}{6}.$
Let $s_n = y_1 + \cdots + y_n$. Since $y_n^2 \sim \frac{1}{n^2}$,
$ \frac{\left(y_{n+1}^{-1}-(n+1)-\frac{1}{2}s_n\right) - \left(y_n^{-1}-n-\frac{1}{2}s_{n-1}\right)}{\frac{1}{n^2}} \xrightarrow[]{n\to\infty} \frac{1}{6},$
or simply
$ \left(y_{n+1}^{-1}-(n+1)-\frac{1}{2}s_n\right) - \left(y_n^{-1}-n-\frac{1}{2}s_{n-1}\right) = O\left(\frac{1}{n^2}\right).$
Since $\sum_n n^{-2}$ converges, summing these telescoping differences yields
$\begin{aligned}y_n^{-1} &= n + \frac{1}{2}s_{n-1} + O(1) = n + \frac{1}{2}s_{n} + O(1) \\ &= n + \frac{1}{2}\sum_{k=1}^{n} \left( \frac{1}{k} + O\left(\frac{\log k}{k^2}\right)\right)+O(1) \\ &= n + \frac{1}{2}\log n + O(1), \end{aligned}$
which finally implies the asymptotic formula
$ y_n = \frac{1}{n} - \frac{1}{2}\frac{\log n}{n^2} + O\left(\frac{1}{n^2}\right), $
or in terms of $x_n$,
$ x_n = \log n + \frac{1}{2}\frac{\log n}{n} + O\left(\frac{1}{n}\right). $
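Finally, the full expansion admits a numerical sanity check; a sketch under the same assumed recursion, starting from $x_0 = 0$ (an arbitrary choice): the scaled residual $n\left(x_n - \log n - \frac{\log n}{2n}\right)$ should stay bounded as $n$ grows.

```python
import math

def x_iter(n, x0=0.0):
    """Return x_n for the recursion x_{k+1} = x_k + exp(-x_k)."""
    x = x0
    for _ in range(n):
        x += math.exp(-x)
    return x

# x_n = log n + (log n)/(2n) + O(1/n): the scaled residual stays bounded.
for n in (10**3, 10**4, 10**5):
    r = n * (x_iter(n) - math.log(n) - math.log(n) / (2 * n))
    print(n, round(r, 4))
```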