12

Show using the Poisson distribution that

$$\lim_{n \to +\infty} e^{-n} \sum_{k=1}^{n}\frac{n^k}{k!} = \frac {1}{2}$$

  • 1
    Second hint, to supplement the Poisson hint: central limit theorem. (Is this homework?) (2012-03-16)
  • 0
    It is not homework, just personal interest. I picked up the problem here: http://www.mymathforum.com/viewtopic.php?f=24&t=28627. (2012-03-16)
  • 1
    @wnvl : You should be less formal when you ask questions here and show a little what you've tried or where you are stuck (or admit that you don't know where to start, if that is the case). We're humans too, you know =P (2012-03-16)
  • 0
    The same question was asked here: http://www.sosmath.com/CBB/viewtopic.php?t=28258 (2012-05-25)
  • 0
    The Poisson distribution has the properties that, if the mean is an integer, (a) the median is equal to the mean and (b) the modal values are the mean and one less than the mean. Property (a) implies that the sum in this question is at least $\frac12$ and that without its final term the sum would be less than $\frac12$, with the difference reducing towards $0$ as $n$ increases; this is spelled out just below this list. (2017-12-04)
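
To spell out the last comment's claim (a sketch, assuming the median property stated there): writing $X_n$ for a Poisson variable with mean $n$, property (a) gives $$ e^{-n} \sum_{k=0}^{n-1} \frac{n^k}{k!} < \frac12 \le e^{-n} \sum_{k=0}^{n} \frac{n^k}{k!}, $$ and the two sums differ by $$ e^{-n} \frac{n^n}{n!} \sim \frac{1}{\sqrt{2\pi n}} \to 0 $$ by Stirling's formula, so both are squeezed towards $\frac12$.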

1 Answer

22

By the definition of the Poisson distribution, if the expected number of occurrences of some event in a given interval is $\lambda$, the probability that exactly $k$ such events happen is $$ \frac {\lambda^k e^{-\lambda}}{k!}. $$ Let $\lambda = n$. Then the probability that the Poisson variable $X_n$ with parameter $n$ takes a value between $0$ and $n$ is $$ \mathbb P(X_n \le n) = e^{-n} \sum_{k=0}^n \frac{n^k}{k!}. $$ (This sum starts at $k = 0$ while the one in the question starts at $k = 1$; the two differ by $e^{-n} \to 0$, so they have the same limit.)

If $Y_i \sim \mathrm{Poi}(1)$ and the random variables $Y_i$ are independent, then $\sum\limits_{i=1}^n Y_i \sim \mathrm{Poi}(n)$, i.e. this sum has the same distribution as $X_n$, hence the probability we are looking for is $$ \mathbb P\left( \frac{Y_1 + \dots + Y_n - n}{\sqrt n} \le 0 \right) = \mathbb P( Y_1 + \dots + Y_n \le n) = \mathbb P(X_n \le n). $$

By the central limit theorem, the variable $\frac {Y_1 + \dots + Y_n - n}{\sqrt n}$ converges in distribution to the Gaussian distribution $\mathscr N(0, 1)$, since each $Y_i$ has mean $1$ and variance $1$. The point is that, because the Gaussian has mean $0$ and we only ask whether it is less than or equal to $0$, the variance does not matter: the result is $\frac 12$. Therefore, $$ \lim_{n \to \infty} e^{-n} \sum_{k=0}^{n} \frac{n^k}{k!} = \lim_{n \to \infty} \mathbb P(X_n \le n) = \lim_{n \to \infty} \mathbb P \left( \frac{Y_1 + \dots + Y_n - n}{\sqrt n} \le 0 \right) = \mathbb P(\mathscr N(0, 1) \le 0) = \frac 12. $$
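
As a quick numerical sanity check (my addition, not part of the original answer; it assumes Python with `scipy` available), one can evaluate $\mathbb P(X_n \le n)$ for growing $n$ and watch it drift towards $\frac12$; by the sandwich written out under the question's comments, the error is $O(1/\sqrt n)$, so the convergence is slow:

```python
# Numerical check: for X_n ~ Poisson(n),
#   P(X_n <= n) = e^{-n} * sum_{k=0}^{n} n^k / k!
# should approach 1/2 as n grows (slowly, at rate ~ 1/sqrt(n)).
from scipy.stats import poisson

for n in [1, 10, 100, 1000, 10000]:
    print(n, poisson.cdf(n, n))  # poisson.cdf(k, mu) = P(Poisson(mu) <= k)
```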

Hope that helps,

  • 2
    Edited some confusion between $X_1$ and $Y_i$, just revert to the previous version if you disagree. // The end of the argument does not apply because $\sigma$ depends on $n$ hence $P(N(1,\sigma)\leqslant1)$ cannot be a limit when $n\to\infty$. The correct approach is to apply the CLT to the event $[X_n\leqslant n]=[(S_n-n)/\sqrt{n}\leqslant0]$ where $S_n=Y_1+\cdots+Y_n$ hence $(S_n-n)/\sqrt{n}$ converges in distribution to $N(0,a)$ for some positive $a$ whose value is irrelevant. (2012-03-17)
  • 1
    Curious to know how many upvoters understand the answer... :-) (2012-03-17)
  • 0
    @DidierPiau: Not me. Nice avatar! (2012-03-17)
  • 0
    @Didier Piau : $\mathbb P( \mathscr N(1,\sigma_n) \le 1 )$ does not depend on $\sigma_n$, so I understand I did things wrong because I didn't apply the CLT in the most natural way, but what I said still stands, doesn't it? (2012-03-17)
  • 0
    @Didier Piau : I've given a little more thought to your comment and edited my answer. I was actually worried about the $X /Y$ thing, but you made it clear. And yes, when we "switch to the normal approximation" we should always subtract the mean and divide through by the standard deviation... I shouldn't have lost that reflex. After reading a little about the CLT again I got back on track and agreed with you. I edited my answer to reflect that. (2012-03-17)
  • 0
    Good job. +1. (2012-03-17)
  • 0
    @PatrickDaSilva Did you switch $\lim$ and $\mathbb P$? If so, why are you allowed to do that? If not, what did you do in the penultimate step? (2015-08-07)
  • 1
    @BCLC : That's precisely the CLT: a sum of i.i.d. variables, after subtracting its mean and dividing by its standard deviation, converges in distribution to the normal distribution, that is, $\lim_{n \to \infty} \mathbb P \left( \frac {Y_1 + \cdots + Y_n - n\mu}{\sigma \sqrt n} \le x \right) = \mathbb P \left(Z \le x \right)$ when the $Y_i$ follow some distribution of mean $\mu$ and variance $\sigma^2$ and $Z \sim \mathcal N(0,1)$. So I did not really switch $\lim$ and $\mathbb P$ properly speaking; I just applied the CLT, which only guarantees convergence in distribution. (2015-08-08)
  • 0
    @PatrickDaSilva You mean [this](https://en.wikipedia.org/wiki/Convergence_of_random_variables#Definition) and then plug in $x = 0$? (2015-08-08)
  • 1
    @BCLC : Yes, because the normal distribution is given by a smooth (and in particular continuous) density, so its CDF is continuous everywhere, including at $x = 0$. (2015-08-08)
  • 0
    Is there any other way to prove that a sum of $n$ $\mathrm{Poi}(1)$ variables has the same distribution as one $\mathrm{Poi}(n)$ variable, without using characteristic functions? Using characteristic functions is easy, but I'm curious whether there is another way to prove that the distributions are the same. (2018-04-22)
  • 0
    @james black : if you recall the intuition behind the Poisson distribution, it is somewhat obvious. The Poisson distribution tells you the probability that $k$ events happen within a given time interval. When you add two independent Poisson variables, the logic remains the same (by independence and time-independence) but the expectations add up, so the sum is again Poisson... ;) (A direct convolution computation is written out below.) (2018-04-22)
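
For completeness, here is the convolution computation alluded to above (my sketch, not in the original thread): if $Y_1 \sim \mathrm{Poi}(\lambda)$ and $Y_2 \sim \mathrm{Poi}(\mu)$ are independent, then for every integer $k \ge 0$, $$ \mathbb P(Y_1 + Y_2 = k) = \sum_{j=0}^{k} \frac{\lambda^j e^{-\lambda}}{j!} \cdot \frac{\mu^{k-j} e^{-\mu}}{(k-j)!} = \frac{e^{-(\lambda+\mu)}}{k!} \sum_{j=0}^{k} \binom{k}{j} \lambda^j \mu^{k-j} = \frac{(\lambda+\mu)^k \, e^{-(\lambda+\mu)}}{k!} $$ by the binomial theorem, so $Y_1 + Y_2 \sim \mathrm{Poi}(\lambda + \mu)$; induction then gives $Y_1 + \dots + Y_n \sim \mathrm{Poi}(n)$ for i.i.d. $\mathrm{Poi}(1)$ summands, with no characteristic functions needed.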