197

I'm supposed to calculate:

$$\lim_{n\to\infty} e^{-n} \sum_{k=0}^{n} \frac{n^k}{k!}$$

By using W|A (Wolfram|Alpha), I can guess that the limit is $\frac{1}{2}$, which is a pretty interesting and nice result. I wonder in which ways we may approach it.
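For reference, a quick numerical check agrees with this guess. This is a minimal sketch of my own (not part of the original question), assuming Python with the `mpmath` package; the helper name `partial_exp_sum` is mine:

```python
# Minimal numerical check of e^{-n} * sum_{k=0}^{n} n^k / k! for growing n.
from mpmath import mp, mpf, exp, factorial

mp.dps = 30  # working precision (digits)

def partial_exp_sum(n):
    """Evaluate e^{-n} * sum_{k=0}^{n} n^k / k! in high precision."""
    s = sum(mpf(n)**k / factorial(k) for k in range(n + 1))
    return exp(-n) * s

for n in (10, 100, 1000):
    print(n, partial_exp_sum(n))
# The printed values decrease toward 1/2, consistent with the guessed limit.
```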

  • 4
    Related: http://math.stackexchange.com/questions/121099/limit-using-poisson-distribution (2012-06-19)
  • 0
    [This question](http://math.stackexchange.com/q/493435/8348) was merged into the present one. (2013-10-01)
  • 0
    [Older question, perhaps merge...] Possible duplicate of [Partial sums of exponential series](http://math.stackexchange.com/questions/136996/partial-sums-of-exponential-series) (2015-01-23)
  • 0
    Also known as **Dobiński's formula**. (2017-06-14)
  • 1
    What is W|A in the question? It says "by using W|A". (2018-10-28)

9 Answers

1

On this page there is already a nice collection of proofs.

I will add another proof, which also uses Stirling's formula.

$\displaystyle e^{-n}\sum\limits_{k=0}^n\frac{n^k}{k!} = e^{-n}\sum\limits_{k=0}^n\frac{k^k (n-k)^{n-k}}{k!(n-k)!} \hspace{4cm}$ e.g. here

Applying Stirling's formula $k!=\sqrt{2\pi k}\,(k/e)^k\bigl(1+\mathcal{O}(1/k)\bigr)$ to both factorials, and dropping the $k=0$ and $k=n$ terms (each equals $e^{-n}n^n/n!=\mathcal{O}(1/\sqrt{n})$, so they do not affect the limit), the limit equals

$\displaystyle \lim\limits_{n\to\infty} e^{-n}\sum\limits_{k=1}^{n-1}\frac{e^k e^{n-k}}{\sqrt{2\pi k (1+\mathcal{O}(1/k))}\sqrt{2\pi (n-k)(1+\mathcal{O}(1/(n-k)))}} $

$\displaystyle = \lim\limits_{n\to\infty} \frac{1}{2\pi}\frac{1}{n}\sum\limits_{k=1}^{n-1}\frac{1}{\sqrt{\frac{k}{n}\left(1-\frac{k}{n}\right)}} =\frac{1}{2\pi} \int\limits_0^1\frac{dx}{\sqrt{x(1-x)}}=\frac{\Gamma(\frac{1}{2})^2}{2\pi~\Gamma(1)} = \frac{1}{2}$
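A sanity check of mine (not part of the original answer): the identity in the first display and the Riemann-sum limit can both be tested numerically. A minimal Python sketch, with helper names of my own choosing:

```python
# Check the identity sum n^k/k! = sum k^k (n-k)^{n-k} / (k! (n-k)!) and the Riemann sum.
from math import factorial, sqrt, pi

def lhs(n):
    return sum(n**k / factorial(k) for k in range(n + 1))

def rhs(n):
    # Python's convention 0**0 == 1 handles the endpoint terms.
    return sum(k**k * (n - k)**(n - k) / (factorial(k) * factorial(n - k))
               for k in range(n + 1))

print(lhs(20), rhs(20))   # the two sums agree

def riemann(n):
    # (1/(2*pi)) * (1/n) * sum_{k=1}^{n-1} 1/sqrt((k/n)(1 - k/n)), as in the last display
    return sum(1 / sqrt((k / n) * (1 - k / n)) for k in range(1, n)) / (2 * pi * n)

print(riemann(10_000))    # tends to 1/2, slowly, because of the endpoint singularities
```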

  • 0
    (+1) Interesting idea. (2018-11-26)
  • 0
    @user1357113: That's kind of you, thanks. :) (2018-11-27)
128

The probabilistic way:

This is $P[N_n\leqslant n]$ where $N_n$ is a random variable with Poisson distribution of parameter $n$. Hence each $N_n$ is distributed like $X_1+\cdots+X_n$ where the random variables $(X_k)$ are independent and identically distributed with Poisson distribution of parameter $1$.

By the central limit theorem, $Y_n=\frac1{\sqrt{n}}(X_1+\cdots+X_n-n)$ converges in distribution to a standard normal random variable $Z$, in particular, $P[Y_n\leqslant 0]\to P[Z\leqslant0]$.

Finally, $P[Z\leqslant0]=\frac12$ and $[N_n\leqslant n]=[Y_n\leqslant 0]$ hence $P[N_n\leqslant n]\to\frac12$, QED.
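A one-line numerical illustration of the probabilistic statement (my addition, not needed for the proof), assuming SciPy is available; the quantity in question is exactly the Poisson CDF evaluated at $n$:

```python
# P[N_n <= n] for N_n ~ Poisson(n); this equals e^{-n} sum_{k=0}^{n} n^k / k!.
from scipy.stats import poisson

for n in (10, 100, 10_000, 1_000_000):
    print(n, poisson.cdf(n, mu=n))
# Roughly 0.583, 0.527, 0.503, 0.5003: drifting toward 1/2, as the CLT predicts.
```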


The analytical way, completing your try:

The starting point, taken from your attempt, is to find $\lim\limits_{n\to\infty}I_n$, where $$ I_n=\frac{e^{-n}}{n!}\int_{0}^n (n-t)^ne^t\,\mathrm dt.$$

To begin with, let $u(t)=(1-t)e^t$; then $I_n=\dfrac{e^{-n}n^n}{n!}nJ_n$ with $$ J_n=\int_{0}^1 u(t)^n\mathrm dt. $$

Now, $u(t)\leqslant\mathrm e^{-t^2/2}$, hence $$ J_n\leqslant\int_0^1\mathrm e^{-nt^2/2}\mathrm dt\leqslant\int_0^\infty\mathrm e^{-nt^2/2}\mathrm dt=\sqrt{\frac{\pi}{2n}}. $$

Likewise, the function $t\mapsto u(t)\mathrm e^{t^2/2}$ is decreasing on $t\geqslant0$, hence $u(t)\geqslant c_n\mathrm e^{-t^2/2}$ on $t\leqslant1/n^{1/4}$, with $c_n=u(1/n^{1/4})\mathrm e^{-1/(2\sqrt{n})}$, hence $$ J_n\geqslant c_n\int_0^{1/n^{1/4}}\mathrm e^{-nt^2/2}\mathrm dt=\frac{c_n}{\sqrt{n}}\int_0^{n^{1/4}}\mathrm e^{-t^2/2}\mathrm dt=\frac{c_n}{\sqrt{n}}\sqrt{\frac{\pi}{2}}(1+o(1)). $$

Since $c_n\to1$, all this proves that $\sqrt{n}J_n\to\sqrt{\frac\pi2}$. Stirling's formula shows that the prefactor $\frac{e^{-n}n^n}{n!}$ is equivalent to $\frac1{\sqrt{2\pi n}}$. Regrouping everything, one sees that $I_n\sim\frac1{\sqrt{2\pi n}}\cdot n\cdot\sqrt{\frac\pi{2n}}=\frac12$.
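Here is a quick numerical check of the two ingredients above (my own sketch, not part of the answer), assuming SciPy: $\sqrt{n}\,J_n\to\sqrt{\pi/2}$, the prefactor is $\sim\frac1{\sqrt{2\pi n}}$, and hence $I_n\to\frac12$.

```python
# Verify sqrt(n)*J_n -> sqrt(pi/2), the Stirling prefactor, and I_n -> 1/2.
from math import exp, log, sqrt, pi, lgamma
from scipy.integrate import quad

def J(n):
    # J_n = int_0^1 ((1 - t) e^t)^n dt; the integrand lies in [0, 1]
    return quad(lambda t: ((1 - t) * exp(t))**n, 0, 1)[0]

for n in (10, 100, 1000):
    prefactor = exp(n * log(n) - n - lgamma(n + 1))       # e^{-n} n^n / n!
    print(n, sqrt(n) * J(n), prefactor * sqrt(2 * pi * n), prefactor * n * J(n))
# Columns: sqrt(n) J_n -> sqrt(pi/2) ~ 1.2533, prefactor ratio -> 1, I_n -> 1/2.
```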

Moral: The probabilistic way is shorter, easier, more illuminating, and more fun.

Caveat: My advice in these matters is, clearly, horribly biased.

123

Edited. I justified the application of the dominated convergence theorem.

By a simple calculation,

$$ \begin{align*} e^{-n}\sum_{k=0}^{n} \frac{n^k}{k!} &= \frac{e^{-n}}{n!} \sum_{k=0}^{n}\binom{n}{k} n^k (n-k)! \\ (1) \cdots \quad &= \frac{e^{-n}}{n!} \sum_{k=0}^{n}\binom{n}{k} n^k \int_{0}^{\infty} t^{n-k}e^{-t} \, dt\\ &= \frac{e^{-n}}{n!} \int_{0}^{\infty} (n+t)^{n}e^{-t} \, dt \\ (2) \cdots \quad &= \frac{1}{n!} \int_{n}^{\infty} t^{n}e^{-t} \, dt \\ &= 1 - \frac{1}{n!} \int_{0}^{n} t^{n}e^{-t} \, dt \\ (3) \cdots \quad &= 1 - \frac{\sqrt{n} (n/e)^n}{n!} \int_{0}^{\sqrt{n}} \left(1 - \frac{u}{\sqrt{n}} \right)^{n}e^{\sqrt{n}u} \, du. \end{align*}$$

We remark that

  1. In $\text{(1)}$, we utilized the famous formula $ n! = \int_{0}^{\infty} t^n e^{-t} \, dt$.
  2. In $\text{(2)}$, the substitution $t + n \mapsto t$ is used.
  3. In $\text{(3)}$, the substitution $t = n - \sqrt{n}u$ is used.

Then, in view of Stirling's formula, it suffices to show that

$$\int_{0}^{\sqrt{n}} \left(1 - \frac{u}{\sqrt{n}} \right)^{n}e^{\sqrt{n}u} \, du \xrightarrow{n\to\infty} \sqrt{\frac{\pi}{2}}.$$

The idea is to introduce the function

$$ g_n (u) = \left(1 - \frac{u}{\sqrt{n}} \right)^{n}e^{\sqrt{n}u} \mathbf{1}_{(0, \sqrt{n})}(u) $$

and pass to the pointwise limit of the integrand as $n \to \infty$. This is justified once we find a dominating function for the sequence $(g_n)$. Notice that if $0 < u < \sqrt{n}$, then

$$ \log g_n (u) = n \log \left(1 - \frac{u}{\sqrt{n}} \right) + \sqrt{n} u = -\frac{u^2}{2} - \frac{u^3}{3\sqrt{n}} - \frac{u^4}{4n} - \cdots \leq -\frac{u^2}{2}. $$

From this we have $g_n (u) \leq e^{-u^2 /2}$ for all $n$, and $g_n (u) \to e^{-u^2 / 2}$ as $n \to \infty$. Therefore, by the dominated convergence theorem and the Gaussian integral,

$$ \int_{0}^{\sqrt{n}} \left(1 - \frac{u}{\sqrt{n}} \right)^{n}e^{\sqrt{n}u} \, du = \int_{0}^{\infty} g_n (u) \, du \xrightarrow{n\to\infty} \int_{0}^{\infty} e^{-u^2/2} \, du = \sqrt{\frac{\pi}{2}}. $$
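A quick numerical illustration of this last limit (my addition, assuming SciPy; written to exponentiate a logarithm so that no overflow occurs near $u=\sqrt{n}$):

```python
# Illustrate int_0^sqrt(n) (1 - u/sqrt(n))^n e^{sqrt(n) u} du -> sqrt(pi/2).
from math import exp, log1p, sqrt, pi
from scipy.integrate import quad

def g(u, n):
    x = u / sqrt(n)
    if x >= 1.0:
        return 0.0
    # log g_n(u) = n log(1 - u/sqrt(n)) + sqrt(n) u  <=  -u^2/2, so exp never overflows
    return exp(n * log1p(-x) + sqrt(n) * u)

for n in (10, 100, 1000):
    print(n, quad(g, 0, sqrt(n), args=(n,))[0], sqrt(pi / 2))
# The integrals increase toward sqrt(pi/2) ~ 1.2533, as claimed.
```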

  • 2
    Your second equation is closely related to [this question](http://math.stackexchange.com/questions/71447/large-n-asymptotic-of-int-0-infty-left-1-x-n-rightn-1-exp-x) (which I answered). (2012-06-19)
47

Integration by parts yields $$ \frac{1}{k!}\int_x^\infty e^{-t}\,t^k\,\mathrm{d}t=\frac{1}{k!}x^ke^{-x}+\frac{1}{(k-1)!}\int_x^\infty e^{-t}\,t^{k-1}\,\mathrm{d}t\tag{1} $$ Iterating $(1)$ gives $$ \frac{1}{n!}\int_x^\infty e^{-t}\,t^n\,\mathrm{d}t=e^{-x}\sum_{k=0}^n\frac{x^k}{k!}\tag{2} $$ Thus, we get $$ e^{-n}\sum_{k=0}^n\frac{n^k}{k!}=\frac{1}{n!}\int_n^\infty e^{-t}\,t^n\,\mathrm{d}t\tag{3} $$ Now, I will reproduce part of the argument I give here, which develops a full asymptotic expansion. Additionally, I include some error estimates that were previously missing. $$ \begin{align} \int_n^\infty e^{-t}\,t^n\,\mathrm{d}t &=n^{n+1}e^{-n}\int_0^\infty e^{-ns}\,(s+1)^n\,\mathrm{d}s\\ &=n^{n+1}e^{-n}\int_0^\infty e^{-n(s-\log(1+s))}\,\mathrm{d}s\\ &=n^{n+1}e^{-n}\int_0^\infty e^{-nu^2/2}\,s'\,\mathrm{d}u\tag{4} \end{align} $$ where $t=n(s+1)$ and $u^2/2=s-\log(1+s)$.

Note that $\frac{ss'}{1+s}=u$; thus, when $s\ge1$, we have $s'\le2u$ (and $u\ge3/4$). This leads to the bound $$ \begin{align} \int_{s\ge1} e^{-nu^2/2}\,s'\,\mathrm{d}u &\le\int_{3/4}^\infty e^{-nu^2/2}\,2u\,\mathrm{d}u\\ &=\frac2ne^{-\frac{9}{32}n}\tag{5} \end{align} $$ $(5)$ also shows that $$ \int_{s\ge1}e^{-nu^2/2}\,\mathrm{d}u\le\frac2ne^{-\frac{9}{32}n}\tag{6} $$

For $|s|<1$, we get $$ u^2/2=s-\log(1+s)=s^2/2-s^3/3+s^4/4-\dots\tag{7} $$ We can invert the series to get $s'=1+\frac23u+O(u^2)$. Therefore, $$ \begin{align} \int_0^\infty e^{-nu^2/2}\,s'\,\mathrm{d}u &=\int_{s\in[0,1]} e^{-nu^2/2}\,s'\,\mathrm{d}u+\color{red}{\int_{s>1} e^{-nu^2/2}\,s'\,\mathrm{d}u}\\ &=\int_0^\infty\left(1+\frac23u\right)e^{-nu^2/2}\,\mathrm{d}u-\color{darkorange}{\int_{s>1}\left(1+\frac23u\right)e^{-nu^2/2}\,\mathrm{d}u}\\ &+\int_0^\infty e^{-nu^2/2}\,O(u^2)\,\mathrm{d}u-\color{darkorange}{\int_{s>1} e^{-nu^2/2}\,O(u^2)\,\mathrm{d}u}\\ &+\color{red}{\int_{s>1} e^{-nu^2/2}\,s'\,\mathrm{d}u}\\ &=\sqrt{\frac{\pi}{2n}}+\frac2{3n}+O\left(n^{-3/2}\right)\tag{8} \end{align} $$ The red and orange integrals decrease exponentially by $(5)$ and $(6)$.

Plugging $(8)$ into $(4)$ yields $$ \int_n^\infty e^{-t}\,t^n\,\mathrm{d}t=\left(\sqrt{\frac{\pi n}{2}}+\frac23\right)\,n^ne^{-n}+O(n^{n-1/2}e^{-n})\tag{9} $$ The argument above can be used to prove Stirling's approximation, which says that $$ n!=\sqrt{2\pi n}\,n^ne^{-n}+O(n^{n-1/2}e^{-n})\tag{10} $$ Combining $(9)$ and $(10)$ yields $$ \begin{align} e^{-n}\sum_{k=0}^n\frac{n^k}{k!} &=\frac{1}{n!}\int_n^\infty e^{-t}\,t^n\,\mathrm{d}t\\ &=\frac12+\frac{2/3}{\sqrt{2\pi n}}+O(n^{-1})\tag{11} \end{align} $$
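Expansion $(11)$ is easy to test numerically. A minimal sketch of mine (not part of the answer), using SciPy's regularized upper incomplete gamma function, which by $(3)$ equals the left-hand side of $(11)$:

```python
# e^{-n} sum_{k<=n} n^k/k! equals gammaincc(n+1, n); compare with 1/2 + (2/3)/sqrt(2 pi n).
from math import sqrt, pi
from scipy.special import gammaincc

for n in (10, 100, 1000, 100_000):
    exact = gammaincc(n + 1, n)                 # regularized upper incomplete gamma Q(n+1, n)
    approx = 0.5 + (2 / 3) / sqrt(2 * pi * n)
    print(n, exact, approx, exact - approx)
# The difference shrinks faster than the 1/sqrt(n) correction, consistent with (11).
```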

  • 1
    Would the downvoter care to comment ([he asked, expecting the answer "no"](http://www.montypython.net/scripts/cheese.php))? (2018-05-12)
  • 0
    hey rob, sadly the link in your answer is dead :( (2018-06-10)
  • 0
    Yeah too many downvoters here :/ I upvoted this. Very nice answer. (2018-06-29)
27

If you'd like to see a formal solution using calculus methods, check this article: http://www.emis.de/journals/AMAPN/vol15/voros.pdf

  • 3
    I wish you had copied the article here. It seemed small enough. References have a habit of pointing to NULL over time. (2015-08-22)
23

The sum is related to the partial exponential sum, and thus to the incomplete gamma function, $$\begin{eqnarray*} e^{-n} \sum_{k=0}^{n} \frac{n^k}{k!} &=& e^{-n} e_n(n) \\ &=& \frac{\Gamma(n+1,n)}{\Gamma(n+1)}, \end{eqnarray*}$$ since $e_n(x) = \sum_{k=0}^n x^k/k! = e^x \Gamma(n+1,x)/\Gamma(n+1)$.

But $$\begin{eqnarray*} \Gamma(n+1,n) &=& \sqrt{2\pi}\, n^{n+1/2}e^{-n}\left(\frac{1}{2} + \frac{1}{3}\sqrt{\frac{2}{n\pi}} + O\left(\frac{1}{n}\right) \right). \end{eqnarray*}$$ The first term in the asymptotic expansion for $\Gamma(n+1,n)$ can be found by applying the saddle point method to $$\Gamma(n+1,n) = \int_n^\infty dt\, t^n e^{-t}.$$ The higher-order terms are in principle straightforward to compute.

Using Stirling's approximation, we find $$e^{-n} \sum_{k=0}^{n} \frac{n^k}{k!} = \frac{1}{2} + \frac{1}{3}\sqrt{\frac{2}{n\pi}} + O\left(\frac{1}{n}\right).$$ Thus, the limit is $1/2$, as found by @sos440 and @robjohn. This limit is a special case of DLMF 8.11.13.

I just noticed a comment that suggests this be done using high school level math. If this is a standard exercise at your high school, maybe they covered the incomplete gamma function! ;-)
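As a cross-check of the quoted expansion of $\Gamma(n+1,n)$, here is a high-precision comparison of mine (not part of the answer), assuming Python's mpmath:

```python
# Compare Gamma(n+1, n) with sqrt(2*pi) n^{n+1/2} e^{-n} (1/2 + (1/3) sqrt(2/(n*pi))).
from mpmath import mp, mpf, gammainc, sqrt, pi, exp, power

mp.dps = 50  # plenty of precision for the huge factors involved

for n in (10, 100, 1000):
    n = mpf(n)
    exact = gammainc(n + 1, a=n)                      # upper incomplete gamma
    asym = sqrt(2 * pi) * power(n, n + mpf('0.5')) * exp(-n) * (mpf(1) / 2 + sqrt(2 / (n * pi)) / 3)
    print(n, exact / asym)
# The ratio tends to 1, with an error of relative size O(1/n).
```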

  • 2
    (+1) I derived this asymptotic expansion [here](http://www.whim.org/nebula/math/gammaratio.html) in answer to a question on sci.math. I computed a few terms past $\frac12$: $$ \frac12+\frac{1}{\sqrt{2\pi n}}\left(\frac23-\frac{23}{270n}+\frac{23}{3024n^2}+\dots\right) $$ (2012-06-20)
  • 0
    @robjohn: Thanks for the link, I'll have a look. By the way, I voted up your nice solution a couple of hours ago. I like your short and sweet derivation of (3). (2012-06-20)
  • 0
    @robjohn As steven gregory says above, "I wish you had copied the article here. It seemed small enough. References have a habit of pointing to NULL over time." Please, could you upload it somewhere again? I'm curious about it. (2017-07-04)
23

\begin{align} \lim_{n \to \infty}\left[e^{-n}\sum_{k = 0}^{n}\frac{n^{k}}{k!}\right] &= \lim_{n \to \infty}\left[e^{-n}\sum_{k = 0}^{n}\exp\left(k\ln n - \ln k!\right)\right] \\[3mm] &= \lim_{n \to \infty}\left\{e^{-n}\sum_{k = 0}^{n}\exp\left(n\ln n - \ln n! - \frac{1}{2n}\left(k - n\right)^{2}\right)\right\} \\[3mm] &= \lim_{n \to \infty}\left\{e^{-n}\,\frac{n^{n}}{n!}\sum_{k = 0}^{n}\exp\left(-\frac{1}{2n}\left(k - n\right)^{2}\right)\right\} \\[3mm] &= \lim_{n \to \infty}\left\{\frac{e^{-n}n^{n}}{n!}\int_{0}^{n}\exp\left(-\frac{1}{2n}\left(k - n\right)^{2}\right)\,\mathrm{d}k\right\} \\[3mm] &= \lim_{n \to \infty}\left[\frac{e^{-n}n^{n}}{n!}\int_{-n}^{0}\exp\left(-\frac{k^{2}}{2n}\right)\,\mathrm{d}k\right] = \lim_{n \to \infty}\left[\frac{e^{-n}n^{n}}{n!}\,\sqrt{2n}\int_{-\sqrt{n/2}}^{0}\exp\left(-k^{2}\right)\,\mathrm{d}k\right] \\[3mm] &= \lim_{n \to \infty}\left[\frac{\sqrt{2}\,n^{n + 1/2}e^{-n}}{n!}\int_{-\infty}^{0}\exp\left(-k^{2}\right)\,\mathrm{d}k\right] = \lim_{n \to \infty}\left[\frac{\sqrt{2}\,n^{n + 1/2}e^{-n}}{n!}\,\frac{\sqrt{\pi}}{2}\right] \\[3mm] &= \frac{1}{2}\,\lim_{n \to \infty}\left[\frac{\sqrt{2\pi}\,n^{n + 1/2}e^{-n}}{n!}\right] = \frac{1}{2} \end{align}
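Not a justification of the Laplace-type replacement questioned in the comments below, but a numerical comparison of mine (not part of the answer) between the exact sum and the Gaussian surrogate used in the second and third lines; helper names are my own:

```python
# Compare e^{-n} sum_k n^k/k! with e^{-n} (n^n/n!) sum_k exp(-(k-n)^2/(2n)).
from math import exp, log, lgamma

def exact(n):
    # e^{-n} sum_{k=0}^{n} n^k / k!, computed via logarithms to avoid overflow
    return sum(exp(k * log(n) - n - lgamma(k + 1)) for k in range(n + 1))

def gaussian_surrogate(n):
    prefactor = exp(n * log(n) - n - lgamma(n + 1))   # e^{-n} n^n / n!
    return prefactor * sum(exp(-(k - n)**2 / (2 * n)) for k in range(n + 1))

for n in (10, 100, 1000, 10_000):
    print(n, exact(n), gaussian_surrogate(n))
# Both columns tend to 1/2, although they differ at order 1/sqrt(n).
```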

  • 16
    The second equal sign is a complete mystery. The passage from a sum to an integral also needs justification, but it probably holds. (2014-03-31)
  • 9
    Would you care to answer @Did's question regarding the second equal sign? I am miffed as well. Thank you, Felix Marin. (2015-05-11)
  • 4
    @Did's question has to be answered; otherwise this seems to be an unsuitable answer. (2015-10-04)
  • 3
    Yeah, for me too; can you please answer @Did's question? (2016-03-01)
  • 0
    Felix, I'm just curious ... How did you arrive at that second equality? I don't see it. (2017-05-16)
  • 0
    @MarkViola Note that $n^{k}/k!$ has a "sharp" maximum at $k = n$. Roughly speaking, it's a Gaussian. But the sum is just up to $n$. So, it was pretty obvious that the final result must be $1/2$. When I "went from the sum to the integral" I closely followed the *Laplace method for sums* as explained in *Analytic Combinatorics* by Philippe Flajolet and Robert Sedgewick, Cambridge University Press, 2009 (see page 761). It's clearer whenever we switch the origin with $k \mapsto n - k$. I didn't do it. (2017-05-19)
  • 1
    Is this supposed to justify the various unsubstantiated claims this post relies on? It does not. (2018-02-08)
18

I'll give you two hints:

1) Poisson distributions;

2) Central limit theorem

I am not aware of any other technique to solve the problem, so any other answer is appreciated.

  • 3
    In what country's education system is this a high school exercise?! I ask sincerely because I'm interested in that, and my experience is that even some of the most mathematically advanced education systems don't reach exercises like the one you're proposing...not even close. Of course, I don't know all the education systems (not even close, again), but the expression within the sum seems to be pretty tough...it resembles the series for the exponential function, but that $n$ repeating in the numerator and upper limit...are you sure the expression is accurate? (2012-06-19)
  • 2
    This problem is definitely not at high-school level. Without the hints I gave, I doubt most university students (even graduates) would solve it in reasonable time. (2012-06-19)
  • 0
    The solution using the Poisson distribution was also given here: [Limit using Poisson distribution](http://math.stackexchange.com/questions/121099/limit-using-poisson-distribution) (2012-06-19)
  • 0
    We actually did this exercise in our probability course. (2018-11-26)
12

I do not know how much this will help you.

For a given $n$, the result is $\dfrac{\Gamma(n+1,n)}{n\,\Gamma(n)}$, which tends to $\dfrac12$ as $n\to\infty$.
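For what it's worth, since $n\,\Gamma(n)=\Gamma(n+1)$, this closed form is exactly SciPy's regularized `gammaincc(n+1, n)`; a quick check of mine (not from the answer):

```python
# Gamma(n+1, n) / (n * Gamma(n)) = Q(n+1, n), the regularized upper incomplete gamma.
from scipy.special import gammaincc

for n in (10, 1000, 100_000):
    print(n, gammaincc(n + 1, n))   # approaches 1/2 from above as n grows
```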