16

I am in Calc 1 and we have just learned the epsilon-delta definition of a limit, and I wanted (on my own) to try to use this method to prove $\displaystyle \lim_{x\to 0}\frac{e^x-1}{x} = 1$ (one of the standard equivalences), along with $\displaystyle \lim_{x\to 0}\frac {\sin(x)}{x} = 1$, which our professor just told us "was so."

I do not know how to put the happy little math symbols on this website, so I'm going to upload a picture of my work. Now, I understand how to apply the epsilon-delta definition of the limit for some easy problems, even for some complicated functions where the numbers simply "fall out," but what do I do with $|f(x)-L|<\epsilon$ after I've turned it into $|(e^x-1-x)/x| < \epsilon$?

I understand that I basically need to get $|(e^x-1-x)/x|$ to become equivalent to $|x|$ but how do I do this? Is this factorable?

And if this kind of easy problem is difficult for me, does this mean that I don't have what it takes to become a math major? I really love this kind of problem-solving, but sometimes I just don't get the answer. Thanks!

http://tinypic.com/r/wiae6f/7

The above is my problem.

  • 6
    I hope that you can see from the answers that your difficulty has absolutely nothing to do with whether you have what it takes to be a math major. "I really love this kind of problem-solving" does answer the question of whether you have what it takes. 2011-06-02

7 Answers

8

Another approach, harder to handle rigorously than any of the ones suggested so far, is to do it the way Euler did, essentially by defining $e$ as $\lim_{n\to\infty}\left(1+\frac{1}{n}\right)^n.$ (But then we would in particular want to prove that the limit exists, which is not easy.)

Now imagine that $n$ is large, and let $h=1/n$. Then $e^h$ should be about $1+1/n$, and the rest follows. But the details, such as making precise the weaselly "should be about $1+1/n$," are not easy. In particular, we would have to define precisely the general exponential function.

So unless we fill in a lot of detail, the above idea involves quite vigorous hand waving, diametrically opposite to the epsilon-delta approach. However, the idea has useful intuitive content.
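For intuition, one can at least watch both pieces of this heuristic converge numerically. A minimal sketch (Python; purely illustrative, not part of the argument above):

```python
import math

# Illustrative only: with h = 1/n, compare (1 + 1/n)^n against e and
# the difference quotient (e^h - 1)/h against 1, to see Euler's heuristic.
for n in [10, 100, 1_000, 10_000, 100_000]:
    h = 1.0 / n
    euler = (1 + h) ** n                 # tends to e as n grows
    quotient = (math.exp(h) - 1) / h     # tends to 1 as h -> 0
    print(f"n={n:>7}  (1+1/n)^n={euler:.6f}  (e^h-1)/h={quotient:.6f}")
```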

4

Suppose that, somehow, we know that the derivative of $e^x$ is itself, that is, $\frac{d}{dx}e^x=e^x$. (This could follow from the power series definition.)

Then, by the definition of the derivative, $e^x=\lim_{h\rightarrow 0} \frac{e^{x+h}-e^x}{h}$, and after dividing by $e^x$ we get $\lim_{h\rightarrow 0} \frac{e^{h}-1}{h}=1.$

Again it really depends on the definition you are starting from.
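A quick numerical sanity check of that identity (Python; a sketch assuming ordinary floating point, not a replacement for choosing a definition): the scaled difference quotient $(e^{x+h}-e^x)/(h\,e^x)$ equals $(e^h-1)/h$ for every base point $x$, and both approach $1$.

```python
import math

# (e^(x+h) - e^x) / (h * e^x) simplifies algebraically to (e^h - 1) / h,
# independent of x; both tend to 1 as h -> 0.
x = 1.7  # arbitrary base point
for h in [1e-1, 1e-2, 1e-3, 1e-4]:
    scaled = (math.exp(x + h) - math.exp(x)) / (h * math.exp(x))
    direct = (math.exp(h) - 1) / h
    print(f"h={h:.0e}  scaled quotient={scaled:.8f}  (e^h-1)/h={direct:.8f}")
```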

  • 2
    It would seem that this begs the question. At the very least, it's dissatisfying to appeal to the derivative to answer a question more primitive than these concepts. (Even given the power series for $e^x$, we shouldn't know anything about the derivative.) 2011-06-02
2

Are you allowed to use $e^x = 1 + x + x^2/2! + \cdots$? If so, then show that $1+x \leq e^x \leq 1 + x + x^2$ for all $x$ in a neighborhood of $0$.

EDIT (elaborating): Assuming the definition $e^x = \sum\nolimits_{n = 0}^\infty {\frac{{x^n }}{{n!}}}$, you can show that $1+x \leq e^x \leq 1 + x + x^2$ holds for all $x$ in some $\delta$-neighborhood of $0$ very simply, as follows. On the one hand,
$$ e^x -1 - x = x^2\bigg(\frac{1}{2!} + \frac{x}{3!} + \frac{x^2}{4!} + \cdots \bigg), $$
and the first inequality holds as soon as the expression in parentheses is positive; on the other hand,
$$ e^x -1 - x - x^2 = -x^2\bigg(\frac{1}{2!} - \frac{x}{3!} - \frac{x^2}{4!} - \cdots \bigg), $$
and the second inequality holds as soon as that expression in parentheses is positive as well. Indeed, note that for any $r > 0$ (as small as we wish),
$$ \sup_{|x| \le r} \bigg(\bigg|\frac{x}{3!}\bigg| + \bigg|\frac{x^2}{4!}\bigg| + \cdots \bigg) = \frac{r}{3!} + \frac{r^2}{4!} + \cdots \le r + r^2 + \cdots = \frac{r}{1-r}, $$
which is less than $\frac{1}{2!}$ once $r < \frac{1}{3}$; hence for $|x| \le r < \frac{1}{3}$ both expressions in parentheses are positive and both inequalities hold.
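A small numerical check of the sandwich on a neighborhood of $0$ (Python; illustrative only, with an arbitrarily chosen radius $0.3 < 1/3$):

```python
import math

# Check 1 + x <= e^x <= 1 + x + x^2 for |x| <= 0.3, and watch the
# squeezed quotient (e^x - 1)/x settle toward 1 as x shrinks.
for x in [-0.3, -0.1, -0.01, 0.01, 0.1, 0.3]:
    assert 1 + x <= math.exp(x) <= 1 + x + x * x
    print(f"x={x:+.2f}  (e^x - 1)/x = {(math.exp(x) - 1) / x:.6f}")
```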

  • 0
    @Arthur Colle: You can ignore my last comment; just consider the first sentence in my answer. 2011-06-02
1

I suggest changing the limit as $x$ tends to zero into a limit where some function of $x$ tends to infinity, through a mathematical manipulation. For example, if we set $1/h = e^x-1$, then $e^x = 1+1/h$.

Remember that we want to find $\lim_{x\to 0} (e^x-1)/x$. If we now use logs to help with our manipulation, $\ln (e^x)= \ln(1+ 1/h)$, that is, $x = \ln(1+1/h)$.

Now, as $x$ tends to zero, $h$ tends to infinity, since the substitution essentially has an inverse effect.

Now, substituting into our original limit, we get

$$\lim_{h\to \infty}\frac{1/h}{\ln(1+1/h)} = \lim_{h\to \infty}\frac{1}{\ln\big((1+1/h)^h\big)},$$

and since by definition $(1+1/h)^h \to e$ as $h \to \infty$, taking logs gives $\ln\big((1+1/h)^h\big) \to \ln(e) = 1$.

Therefore the limit is simply $1/1 = 1$, and so $\lim_{x\to0} (e^x-1)/x$ must also be one. QED

I'm no math whiz and this may not withstand rigorous mathematical scrutiny, but hopefully it is a step in the right direction. Cheers
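A rough numerical look at the substituted form (Python; a check of the heuristic above, not a proof): $1/\big(h\ln(1+1/h)\big) = 1/\ln\big((1+1/h)^h\big)$ should drift toward $1/\ln e = 1$ as $h$ grows.

```python
import math

# As h -> infinity, (1 + 1/h)^h -> e, so 1 / ln((1 + 1/h)^h) -> 1 / ln(e) = 1.
for h in [10, 100, 1_000, 10_000, 100_000]:
    value = 1.0 / (h * math.log(1.0 + 1.0 / h))   # equals 1 / ln((1+1/h)^h)
    print(f"h={h:>7}  1/(h*ln(1+1/h)) = {value:.6f}")
```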

  • 0
    leo, getting closer, still not there. 2012-10-18
0

When I teach these topics, I note that $(2^x-1)/x$ seems to be about $0.7$ for small values of $x$, while $(3^x-1)/x$ seems to be about $1.1$ for small values of $x$, so it stands to reason that there's some number between $2$ and $3$, let's call it $e$, such that $(e^x-1)/x$ tends to $1$ as $x\to0$. Not very rigorous, I know, and no deltas-and-epsilons, but as others have mentioned you have to start with $\it some$ definition of $e$ to even ask the question.
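Those two numbers are easy to reproduce in a few lines (Python; illustrative only):

```python
import math

# (a^x - 1)/x for small x: roughly 0.69 for a = 2 and 1.10 for a = 3,
# suggesting some base between 2 and 3 for which the limit is exactly 1.
x = 1e-6
for a in (2.0, 3.0, math.e):
    print(f"a = {a:.5f}   (a^x - 1)/x ≈ {(a ** x - 1) / x:.6f}")
```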

  • 0
    That's essentially equivalent to Euler's definition of $e$, by the way :) 2012-10-16
0

The only way I see this question being attackable is to know what the derivative of $e^x$ is. In many ways $e^x$ is defined by this property. In this case I can offer two solutions.

1) Suppose you know that $\frac{d}{dx} e^x = e^x$. Then we prove that $1 + x \leq e^x \leq 1 + x + x^2$ for $x$ sufficiently close to $0$. To see this, define $f(x) = e^x - 1 - x$. Since $f(0) = 0$, it suffices to show that $0$ is a local minimum of $f$. If $x \geq 0$, then $f'(x) = e^x - 1$ is nonnegative, so $f$ is increasing for $x \geq 0$. On the other hand, if $x < 0$, then $f'$ is negative, so $f$ is decreasing to the left of zero. Hence $0$ is a local minimum for $f$, and so $f$ is nonnegative on a small neighborhood of $0$. Now, let $g(x) = e^x - 1 - x - x^2$. Again, $g(0) = 0$, so it will suffice to show that $0$ is a local maximum for $g$. We have $g'(x) = e^x - 1 - 2x$, and it is not so clear whether this is positive or negative for $x$ close to $0$, so we take the derivative again to get $g''(x) = e^x - 2$, which is negative for $x < \ln 2$. Hence $g$ is concave down in a neighborhood of $0$; since $g'(0) = 0$, this implies that $0$ is a local maximum, so $g(x) \leq 0$ in a neighborhood of $0$. Combining the results for $f, g$ we get $1 + x \leq e^x \leq 1 + x + x^2$ near $0$. This yields $\lim_{x \rightarrow 0} x/x \leq \lim_{x \rightarrow 0} (e^x - 1)/x \leq \lim_{x \rightarrow 0} (x + x^2)/x$ (for $x < 0$ the inequalities reverse when dividing by $x$, but both bounding quotients still tend to $1$), so by the squeeze theorem the desired limit is $1$.

2) If you know L'Hospital's Rule, then the result follows immediately since $\lim_{x \rightarrow 0} (e^x - 1)/x = \lim_{x \rightarrow 0} e^x/1 = e^0 = 1$.
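For part 2), a tiny numerical illustration (Python; it assumes, as the answer does, that you already trust $\frac{d}{dx}e^x=e^x$): the original quotient and the quotient of derivatives head to the same value.

```python
import math

# Compare (e^x - 1)/x with the ratio of derivatives e^x / 1 near x = 0;
# both tend to e^0 = 1, which is what L'Hospital's rule predicts here.
for x in [0.5, 0.1, 0.01, 0.001]:
    original = (math.exp(x) - 1) / x
    derivative_ratio = math.exp(x) / 1.0
    print(f"x={x:<6}  (e^x - 1)/x = {original:.6f}   e^x/1 = {derivative_ratio:.6f}")
```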

  • 1
    It is worth mentioning that for L'Hopital's rule you also need to know the derivative of $e^x$. A bit of care is needed. For example, using L'Hopital's rule to prove that $\frac{\sin(x)}{x}\rightarrow 1$ as $x\rightarrow 0$ is not always feasible (unless we start from the power series for $\sin(x)$), because we need to know the derivative of $\sin(x)$, and to compute that derivative in the first place you usually use this very limit. 2011-06-02
0

I don't know if you also wanted information on $\lim_{x\to0}\sin x/x$, but the standard way to make that one plausible is with a diagram. Draw the circle of radius $1$ centered at the origin, locate the point $P=(\cos x,\sin x)$, then $\sin x$ is the distance from $P$ to the $x$-axis if you go straight down, while $x$ is the distance from $P$ to the $x$-axis if you go along the circle (I guess I'm assuming $P$ is in the first quadrant). It is plausible that as $x$ approaches zero the ratio of these two lengths approaches $1$.
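If you want numbers to go with the picture, here is a minimal sketch (Python; illustrative): compare the straight-down distance $\sin x$ with the along-the-circle distance $x$ for points in the first quadrant.

```python
import math

# For P = (cos x, sin x) on the unit circle: sin x is the vertical drop to
# the x-axis, x is the arc length to the x-axis; their ratio tends to 1.
for x in [1.0, 0.5, 0.1, 0.01]:
    print(f"x={x:<5}  sin(x)={math.sin(x):.6f}  ratio sin(x)/x={math.sin(x)/x:.6f}")
```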