
Suppose $g$ is a function so that $g(x)\to 0$ as $x \to \infty$. Show that $e^{g(x)} = 1+O(g(x))$.

Perhaps we can use Taylor's theorem to write $e^{g(x)}=1+g(x)+\frac{g(x)^2}{2!}+\frac{g(x)^3}{3!}+\dots$. Now, this is not exactly $1+O(g(x))$, but since $g \to 0$ it looks like the higher-order terms go to zero faster than $g(x)$, so we can probably write this as $1+O(g(x))$. How do we formally show this to get the stated result?
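One way to make the tail estimate precise, sketched from the series above (the constant $e-1$ is just one convenient choice, not the sharpest):

```latex
For $|u| \le 1$, each term of the tail can be bounded crudely:
\[
\left| e^{u} - 1 \right|
= \left| \sum_{k \ge 1} \frac{u^{k}}{k!} \right|
\le |u| \sum_{k \ge 1} \frac{|u|^{k-1}}{k!}
\le |u| \sum_{k \ge 1} \frac{1}{k!}
= (e-1)\,|u| .
\]
Since $g(x) \to 0$, we have $|g(x)| \le 1$ for all sufficiently large $x$, so
$\left| e^{g(x)} - 1 \right| \le (e-1)\,|g(x)|$, i.e.\ $e^{g(x)} = 1 + O(g(x))$.
```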

2 Answers


By applying a Taylor series expansion one has, as $u \to 0$, $$ e^u=1+O(u) $$ which gives, as $x \to \infty$, $$ e^{g(x)} = 1+O(g(x)) $$ since $g(x)\to 0$.
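To spell out the first step: one way to justify $e^u = 1+O(u)$ as $u \to 0$ is Taylor's theorem with the Lagrange remainder (a sketch; any fixed bound on $e^{\theta}$ near $0$ would do):

```latex
For each $u$ there is a $\theta$ between $0$ and $u$ with
\[
e^{u} = 1 + e^{\theta} u .
\]
If $|u| \le 1$ then $|\theta| \le 1$, so $e^{\theta} \le e$ and hence
\[
\left| e^{u} - 1 \right| \le e\,|u| \qquad \text{for } |u| \le 1,
\]
which is exactly $e^{u} = 1 + O(u)$ as $u \to 0$.
```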

  • Hi, can you please see my edits on the problem? Why is it clear that we can ignore the higher-order terms? What is the formal justification? (2017-02-27)

Instead of a full Taylor expansion it is enough to know that

  • $ \frac{d}{dy} e^y = 1 $ at $y=0$.
  • $ \frac{d}{dy} e^y $ is continuous at $y=0$.

Then find a sufficiently small interval of $y$ values around $0$ such that $\frac{d}{dy}e^y$ lies in $[\frac12,\frac32]$ on this interval, and apply the Mean Value Theorem to conclude that $e^y$ lies between $1+\frac12 y$ and $1+\frac32 y$ there.

Then, when $x$ is large enough, $y=g(x)$ is in this interval.
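Concretely, one admissible choice of interval (the endpoint $\ln\frac32$ below is just one convenient option) makes the argument fully explicit:

```latex
Take $\delta = \ln\tfrac{3}{2}$. For $|y| \le \delta$ the derivative satisfies
\[
\tfrac{1}{2} \le \tfrac{2}{3} = e^{-\delta} \le \frac{d}{dy}\,e^{y} = e^{y} \le e^{\delta} = \tfrac{3}{2},
\]
so the Mean Value Theorem gives, for some $c$ between $0$ and $y$,
\[
e^{y} - 1 = e^{c}\, y , \qquad \text{hence} \qquad \left| e^{y} - 1 \right| \le \tfrac{3}{2}\,|y| .
\]
Since $g(x) \to 0$, there is an $X$ with $|g(x)| \le \delta$ for all $x \ge X$, and then
$\left| e^{g(x)} - 1 \right| \le \tfrac{3}{2}\,|g(x)|$, i.e.\ $e^{g(x)} = 1 + O(g(x))$.
```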