I'm doing some exercises in a book on asymptotic analysis. While I think I found a solution to this problem, I'm not entirely sure if it's correct, and I want to make sure that I know what's going on.
The exercise is to show that $\int_1^x (1+t^{-1})^t\,dt = ex - \frac{1}{2}e \log{x} + O(1)$, for $x > 1$, with the hint to first show that $e^{-1}(1+t^{-1})^t = 1 -\frac{1}{2}t^{-1} + O(t^{-2})$, $(t > 1)$. Getting the first statement from the second is simple (I sketch it at the end of this post). For the hint itself: using the fact that $\log(1+t^{-1}) = t^{-1} - \frac{1}{2}t^{-2} + O(t^{-3})$ for $t>1$, we have that
$e^{-1}(1+t^{-1})^t = \exp\left(t\log(1+t^{-1})-1\right)=\exp\left(t\left(t^{-1} - \frac{1}{2}t^{-2} + O(t^{-3})\right)-1\right)=\exp\left(-\frac{1}{2}t^{-1}+O(t^{-2})\right)$
At this point, I have to use what feels like a really handwavy argument: I claim that $\exp\left(-\frac{1}{2}t^{-1}+O(t^{-2})\right) = 1 -\frac{1}{2}t^{-1} + O(t^{-2})$, on the grounds that applying a monotone function (which $\exp$ is, for the given values of $t$) to both sides of the previous relation should preserve it.
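If it helps to state my reasoning more precisely: since the exponent $-\frac{1}{2}t^{-1}+O(t^{-2})$ is bounded for $t > 1$ and tends to $0$, I suspect the cleaner route (my own guess, not the book's argument) is the Taylor bound
$\exp(u) = 1 + u + O(u^2) \quad (u \to 0),$
applied with $u = -\frac{1}{2}t^{-1}+O(t^{-2})$, which gives
$\exp\left(-\frac{1}{2}t^{-1}+O(t^{-2})\right) = 1 - \frac{1}{2}t^{-1} + O(t^{-2}),$
since $u = O(t^{-1})$ and hence $u^2 = O(t^{-2})$.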
Thus, my question is whether my derivation is correct, especially the last part. In addition, I'm curious why the last part is necessary: it feels like throwing away a lot of information (although I might just not yet be used to how much information to throw away in asymptotic proofs), and my only guess as to why is that it produces something that's easy to integrate. Is there some other reason?
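For reference, here is the sketch, promised above, of getting the integral estimate from the hint (the bookkeeping of constants into the $O(1)$ term is my own):
$\int_1^x (1+t^{-1})^t\,dt = e\int_1^x \left(1 - \frac{1}{2}t^{-1} + O(t^{-2})\right)dt = e(x-1) - \frac{1}{2}e\log{x} + O(1) = ex - \frac{1}{2}e\log{x} + O(1),$
where the $O(t^{-2})$ term integrates to $O(1)$ because $\int_1^\infty t^{-2}\,dt$ converges.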