
Possible Duplicates:
Finding the limit of $n/\sqrt[n]{n!}$
How come such different methods result in the same number, $e$?

I've seen this formula several thousand times: $$e=\lim_{x\to \infty} \left(1+\frac{1}{x}\right)^x $$

I know that it was discovered by Bernoulli when he was working with compound interest problems, but I haven't seen the proof anywhere. Does anyone know how to rigorously demonstrate this relationship?

EDIT: Sorry for my lack of knowledge in this, I'll try to state the question more clearly. How do we prove the following?

$$ \lim_{x\to \infty} \left(1+\frac{1}{x}\right)^x = \sum_{k=0}^{\infty}\frac{1}{k!}$$
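Not a proof, but a quick numerical sanity check (a minimal Python sketch; the helper names `compound` and `series` are mine) shows the two sides closing in on the same value:

```python
import math

# Left-hand side: the compound-interest expression (1 + 1/n)^n.
def compound(n):
    return (1 + 1 / n) ** n

# Right-hand side: the partial sum of 1/k! up to k = K.
def series(K):
    return sum(1 / math.factorial(k) for k in range(K + 1))

for n in (10, 1000, 100000):
    print(n, compound(n))

print("series(20):", series(20))
print("math.e:    ", math.e)
```

The product side converges slowly (the error is roughly $e/2n$), while the factorial series reaches machine precision after about twenty terms.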

  • 4
    What definition of $e$ are you starting with? (2011-07-28)
  • 0
    You can show that $\displaystyle\sum_{i=0}^{\infty}\dfrac{1}{i!}$ and your sequence converge to the same limit... (2011-07-28)
  • 0
    Thanks, I updated my question. (2011-07-28)
  • 0
    Related: http://planetmath.org/?op=getobj&from=objects&id=10170 (2011-07-28)
  • 0
    possible duplicate of [Finding the limit of $n/\sqrt[n]{n!}$](http://math.stackexchange.com/q/28476) (2011-07-28)
  • 0
    Seems significantly different from the "exact duplicate question"; that one has not much connection with the series for $e$. (2011-07-29)
  • 0
    I tend to agree that it is a little harsh to consider this an exact duplicate, though there is certainly much common ground. It also occurs to me that the way Bernoulli proved it, and what might be considered rigorous by modern standards, are likely to be different. (2011-07-30)

6 Answers

2

We know that $\ln\left(\left(1+\frac{1}{n}\right)^n\right)= n \ln \left(1 + \frac{1}{n}\right)$

Now suppose that $x = \frac{1}{n}$. Thus,

$n \ln \left(1 + \frac{1}{n}\right) = \displaystyle \left(\frac{1}{x} \ln\left( 1 + x\right)\right) = \displaystyle \left(\frac{1}{x} (\ln\left( 1 + x\right) - \ln 1)\right)$

Now if we send $x$ to $0$, we obtain $\displaystyle\lim_{x \rightarrow 0}\left(\frac{\ln\left( 1 + x\right) - \ln 1}{x}\right)$, which is precisely the derivative of $\ln$ at $1$, and $\ln'(1) = \frac{1}{1} = 1$.

Thus, since $x = \frac{1}{n} \to 0^{+}$ as $n \to \infty$, we know that $\displaystyle\lim_{n \rightarrow \infty}\left(\ln\left(\left(1+\frac{1}{n}\right)^n\right)\right) = \lim_{x \rightarrow 0}\left(\frac{\ln\left( 1 + x\right) - \ln 1}{x}\right) = 1$

Now, using the fact that $e^{\ln x} = x$ and that the exponential function is continuous (which lets us move the limit inside the exponent), we know that $\lim_{n \rightarrow \infty}\left(\left(1 + \frac{1}{n}\right)^n\right) = e^{\ln\left(\lim_{n \rightarrow \infty}\left(\left(1 + \frac{1}{n}\right)^n\right)\right)} = e^{\lim_{n \rightarrow \infty}\left(\ln\left(\left(1 + \frac{1}{n}\right)^n\right)\right)} = e^{1} = e$.
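The key step above, that $n \ln(1+\frac{1}{n})$ is a difference quotient for $\ln$ at $1$, can be checked numerically (a sketch; `log_quotient` is my name for the quotient):

```python
import math

# n * ln(1 + 1/n) rewritten as the difference quotient
# (ln(1 + x) - ln 1) / x with x = 1/n; it should approach ln'(1) = 1.
def log_quotient(n):
    x = 1 / n
    return (math.log(1 + x) - math.log(1)) / x

for n in (10, 1000, 100000):
    print(n, log_quotient(n), math.exp(log_quotient(n)))
```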

9

$$\mathrm {Log}\left(\displaystyle\lim_{x\rightarrow\infty} \left(1 + \frac{1}{x}\right)^{x}\right) = \displaystyle\lim_{x\rightarrow 0}\text{ } \mathrm {Log} \left((1 + x)^{\frac{1}{x}}\right) = \lim_{x\rightarrow 0} \frac{\mathrm {Log}(1+x)}x = \lim_{x\rightarrow 0} \text{ }\displaystyle\sum_{i=0}^{\infty} \frac{(-1)^i x^i}{i+1} = 1.$$
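The series step can be checked numerically: for $|x| < 1$, $\frac{\log(1+x)}{x} = \sum_{i=0}^{\infty} \frac{(-1)^i x^i}{i+1}$ (the Mercator series divided by $x$), and as $x \to 0$ only the $i = 0$ term, which is $1$, survives. A sketch (the function name is mine):

```python
import math

# Partial sum of log(1+x)/x = sum_{i>=0} (-1)^i x^i / (i+1), valid for |x| < 1.
def log_ratio_series(x, terms=50):
    return sum((-1) ** i * x ** i / (i + 1) for i in range(terms))

# Compare the partial sums against the closed form log(1+x)/x.
for x in (0.5, 0.1, 0.001):
    print(x, log_ratio_series(x), math.log(1 + x) / x)
```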

  • 3
    Rigorous is a dirty word (2011-07-28)
  • 0
    or towards the end $\lim_{x \to 0} \log(1+x)/x = \lim_{x \to 0} \frac{1}{1+x} / 1 = 1$ (2011-07-28)
  • 2
    @Henry: You write it as if you were applying de l'Hôpital. You can simply plug in the definition of the derivative and use the fact that $\log 1 = 0$. (2011-07-28)
  • 0
    Sorry for the vagueness of my question, I updated it. (2011-07-28)
6

It does rather matter how you want to define $e$ in the first place. One way to define $e$ is to prove that the sequence whose $n$-th term is $(1 + \frac{1}{n})^n$ is increasing, but bounded above, and therefore converges to its least upper bound, which may be defined as $e$. More generally, we may define $e^x$ as $\lim_{n \to \infty} (1 + \frac{x}{n})^n$ for any real $x$ (and the limit always exists). Then it's easy to verify from this definition that $e^{x+y} = e^{x}e^{y}$ for all $x,y \in \mathbb{R}$. With this approach the Bernoulli representation of $e$ is almost a non-issue.

The very definition $(1 + \frac{1}{x})^{x}$ for non-integral $x$ (as $\exp(x \log(1 + \frac{1}{x}))$), presupposes that $e$ (and the natural logarithm) have already been defined.

Another way to define the function $e^x$ from first principles, adopted, for example, in Spivak's "Calculus", is as the inverse function of the logarithm, where $\log(x)$ is defined as $\int_{1}^{x}\frac{1}{t}\, dt$ for $x > 0$. Then the fundamental theorem of Calculus gives $\log'(x) = \frac{1}{x}$ for $x > 0$, and if we define the exponential function as the inverse of the logarithm function, it is its own derivative. Since this derivative is always positive, the exponential function is increasing everywhere. The mean value theorem, applied to $\log$ on $[1, 1+\frac{1}{x}]$, tells us that $x\log(1 + \frac{1}{x}) = \frac{1}{\theta}$ for some $\theta \in (1,1+\frac{1}{x})$ when $x > 0$. As $x \to \infty$, we see that $\theta \to 1$. Since $e^{x}$ is differentiable everywhere, it is certainly continuous, so that as $x \to \infty$, $\exp(x \log(1 + \frac{1}{x})) \to \exp(1) = e.$

NOTE ADDED: Since the question has been rephrased taking $e = \sum_{i=0}^{\infty} \frac{1}{i!}$ after the above was written, I add that the second approach here delivers this as well, since the fact that the exponential function is its own derivative shows that its Maclaurin series is the expected $e^{x} = \sum_{n=0}^{\infty} \frac{x^n}{n!}$, and that this converges for all real $x$ using the standard form for the remainder in Taylor's theorem (as, e.g., in Spivak's book).
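The first definition above, $e^x = \lim_{n\to\infty}(1+\frac{x}{n})^n$, and the functional equation $e^{x+y} = e^x e^y$ can both be probed numerically with a large fixed $n$ (a sketch; `exp_limit` is my name, and $n = 10^6$ is an arbitrary truncation):

```python
import math

# e^x approximated by its limit definition, (1 + x/n)^n with n large.
def exp_limit(x, n=10**6):
    return (1 + x / n) ** n

for x in (1.0, 2.0, -1.0):
    print(x, exp_limit(x), math.exp(x))

# The functional equation: e^(0.3) * e^(0.7) should match e^1, approximately.
print(exp_limit(0.3) * exp_limit(0.7), exp_limit(1.0))
```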

4

Take a look at http://www.whim.org/nebula/math/eseries.html. On that page, it is shown that $\left(1+\frac{1}{n}\right)^n\le e$. Then, it is shown, using the binomial theorem, that $e-\left(1+\frac{1}{n}\right)^n$ can be made as small as desired by choosing $n$ large enough.

  • 3
    It is usually best if you provide more meat to an answer than just a link. (2011-07-28)
  • 1
    @Mariano Suárez-Alvarez: This answer is now fully fleshed out and generalized in [my answer to "Combinatorial proof"](http://math.stackexchange.com/questions/54448/combinatorial-proof/54499#54499). (2011-08-01)
4

Well, the problem with $e$ is that there are many different ways of defining it. But this is another way.

Suppose the limit exists, and call it $L$.

$\log L = \lim x \log \left( \dfrac{x + 1}{x} \right) = \lim \dfrac{ \log \frac{x + 1}{x}}{\frac{1}{x}} = \lim \dfrac{ \frac{x}{x+1} \cdot (-x^{-2} \cdot (x + 1) + x^{-1})}{ -x^{-2} } = \lim \dfrac{ \frac{x}{x+1} \cdot \frac{-1}{x^2} }{\frac{-1}{x^2}} = \lim \dfrac{x}{x+1} = 1$

So $L = e^1 = e$.

For a different approach.
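The l'Hôpital step above can be illustrated numerically: the original ratio $\log\frac{x+1}{x} \big/ \frac{1}{x}$ and the ratio of derivatives, which simplifies to $\frac{x}{x+1}$, both tend to $1$ (a sketch with function names of my own choosing):

```python
import math

# The 0/0-style ratio before differentiating: log((x+1)/x) / (1/x).
def original_ratio(x):
    return math.log((x + 1) / x) / (1 / x)

# The ratio of derivatives simplifies to x / (x + 1).
def derivative_ratio(x):
    return x / (x + 1)

for x in (10, 1000, 100000):
    print(x, original_ratio(x), derivative_ratio(x))
```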

  • 0
    Replace $x$ by $1/x$ and plug in the definition of the derivative in your second equation. (2011-07-28)
3

From the binomial theorem

$$\left(1+\frac{1}{n}\right)^n = \sum_{k=0}^n {n \choose k} \frac{1}{n^k} = \sum_{k=0}^n \frac{n}{n}\frac{n-1}{n}\frac{n-2}{n}\cdots\frac{n-k+1}{n}\frac{1}{k!}$$

but as $n \to \infty$, the term with index $k$ increases towards a limit of $\frac{1}{k!}$, and the number of terms to be summed increases, so

$$\left(1+\frac{1}{n}\right)^n \to \sum_{k=0}^\infty \frac{1}{k!}.$$
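The term-by-term behaviour can be watched numerically: for fixed $k$, the $k$-th binomial term $\binom{n}{k}/n^k$ climbs toward $\frac{1}{k!}$ as $n$ grows (a sketch; requires Python 3.8+ for `math.comb`):

```python
import math

# The k-th term of the binomial expansion of (1 + 1/n)^n:
# C(n, k) / n^k = (n/n)((n-1)/n)...((n-k+1)/n) * (1/k!).
def binom_term(n, k):
    return math.comb(n, k) / n ** k

# For fixed k, each column increases with n toward 1/k!.
for n in (10, 100, 10000):
    print(n, [round(binom_term(n, k), 6) for k in range(5)])

print("1/k! limits:", [round(1 / math.factorial(k), 6) for k in range(5)])
```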

  • 1
    -1; this is not a complete argument. It is false in general that if $a_{n,k}$ is a sequence of sequences such that $s_n = \sum_k a_{n,k}$ and $\lim_{n \to \infty} a_{n,k} = b_k$, then $\lim_{n \to \infty} s_n = \sum_k b_k$. An easy counterexample is to take $a_{n,k} = 0$ if $n \neq k$ and $1$ otherwise. More is needed. (You can fix this argument with monotone convergence for sequences, but it would probably be easier to split the sum into a principal part and a tail.) (2011-07-29)
  • 3
    No, it is not complete, but it is salvageable. Each term of the sequence is less than $\sum_{k=0}^{\infty} \frac{1}{k!}$. On the other hand, for any given $N$, we can make the sum of the first $N+1$ terms of the binomial expansion for $(1+\frac{1}{n})^n$ as close as we like to $\sum_{k=0}^{N} \frac{1}{k!}$ by making $n$ large enough. Hence the supremum of $\{(1 +\frac{1}{n})^n\}$ is $\sum_{k=0}^{\infty} \frac{1}{k!}$, so this is the limit of the increasing sequence. (2011-07-29)
  • 0
    @Qiaochu Yuan: Your counter-example does not have the property that mine does: each term in the sum *increases* towards a limit, and the number of terms to be summed increases. (2011-07-29)
  • 0
    @Henry: yes, I know, but probably the OP doesn't know that this makes a difference. (2011-07-29)
  • 2
    @Qiaochu I respectfully disagree. To me, the OP's question says "Can someone explain the mystery of why this sequence of products represents the same number as this infinite sum?" Henry's answer is the most direct explanation, and the OP found it useful and accepted it. It is also my favorite, and I upvoted it. (2011-07-29)
  • 0
    Of course, Henry sidesteps the issue of convergence. But to me, this is no more serious than not proving that $\lim_{x\to \infty} (1+{1/x})^x$ exists, that $\sum_{k=0}^{\infty}1/{k!}$ converges, or that the log function is continuous. We take such things for granted, understanding that more details would be needed in formal writing. (2011-07-29)