
From Wikipedia:

  1. In some cases, the cdf of the Poisson distribution is the limit of the cdf of the binomial distribution:

    The Poisson distribution can be derived as a limiting case of the binomial distribution $\text{bin}(n,p)$ as the number $n$ of trials goes to infinity and the expected number $np$ of successes remains fixed — see law of rare events below. Therefore it can be used as an approximation of the binomial distribution if $n$ is sufficiently large and $p$ is sufficiently small.

  2. In some cases, the cdf of the Poisson distribution is the limit of the cdf of the normal distribution:

    For sufficiently large values of $\lambda$ (say $\lambda>1000$), the normal distribution with mean $\lambda$ and variance $\lambda$ (standard deviation $\sqrt{\lambda}$) is an excellent approximation to the Poisson distribution. If $\lambda$ is greater than about $10$, then the normal distribution is a good approximation if an appropriate continuity correction is performed, i.e., $P(X \le x)$, where (lower-case) $x$ is a non-negative integer, is replaced by $P(X \le x + 0.5)$. $$ F_\mathrm{Poisson}(x;\lambda) \approx F_\mathrm{normal}(x;\mu=\lambda,\sigma^2=\lambda) $$
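Both approximations quoted above are easy to check numerically. Here is a minimal sketch (assuming `scipy` is available; the parameter values are illustrative choices, not from the article):

```python
# Numerical check of the two approximations quoted above.
# Assumes scipy is installed; the parameter values are illustrative.
from scipy.stats import binom, norm, poisson

# 1. Binomial -> Poisson: n large, p small, with lam = n*p held fixed.
n, p = 10_000, 0.0005
lam = n * p  # expected number of successes: lam = 5
for x in range(8):
    print(x, binom.cdf(x, n, p), poisson.cdf(x, lam))

# 2. Poisson -> normal, with the continuity correction P(X <= x + 0.5).
lam = 1000
for x in (950, 1000, 1050):
    approx = norm.cdf(x + 0.5, loc=lam, scale=lam ** 0.5)
    print(x, poisson.cdf(x, lam), approx)
```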

I wonder in what best senses these functional approximations hold: pointwise, uniformly, in $L_2$, ...? Thanks and regards!

2 Answers


"In some cases, the cdf of the Poisson distribution is the limit of the cdf of the normal distribution:"

That is false. The limit of Poisson distributions is a normal distribution; the limit of normal distributions is not a Poisson distribution.

"Best senses" is a term I've never come across and I don't know what you mean by it.

The sequence of cdf's of suitably standardized Poisson distributions converges pointwise to the cdf of the standard normal distribution. Suppose $X\sim\mathrm{Poisson}(\lambda)$. Then the cdf of $(X-\lambda)/\sqrt{\lambda}$ converges to the cdf of the standard normal distribution as $\lambda\to\infty$.
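A quick numerical illustration of this convergence (a sketch assuming `scipy` is available; the evaluation point $t=1$ is an arbitrary choice):

```python
# Cdf of the standardized variable (X - lam)/sqrt(lam) at a fixed t,
# versus the standard normal cdf Phi(t), for growing lam.  Note that
# P((X - lam)/sqrt(lam) <= t) = P(X <= lam + t*sqrt(lam)).
from scipy.stats import norm, poisson

t = 1.0
for lam in (10, 100, 1000, 10000):
    lhs = poisson.cdf(lam + t * lam ** 0.5, lam)
    print(lam, lhs, norm.cdf(t))  # the gap shrinks as lam grows
```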

Now suppose $X\sim\mathrm{Bin}(n,\lambda/n)$. Then the cdf of $X$ converges pointwise to the cdf of the Poisson distribution with expectation $\lambda$ as $n\to\infty$.
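And a sketch of that pointwise convergence at a fixed $x$ (again assuming `scipy`; $\lambda=5$ and $x=3$ are arbitrary illustrative choices):

```python
# Bin(n, lam/n) cdf at a fixed point x approaches the Poisson(lam) cdf
# as n grows; lam = 5 and x = 3 are arbitrary illustrative choices.
from scipy.stats import binom, poisson

lam, x = 5.0, 3
for n in (10, 100, 1000, 100000):
    print(n, binom.cdf(x, n, lam / n), poisson.cdf(x, lam))
```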

Generally "convergence in distribution" means a sequence of cdf's converges to a cdf pointwise except that it need not converge at points where the limiting cdf is not continuous. The reason for the exception is things like this: Concentrate probability $1$ at $1/n$. Then the value of the cdf at $0$ is $0$. But the limiting cdf has value $1$ at $0$; that's where it concentrates all the probability.

But don't write about normal distributions approaching a Poisson distribution. It's the other way around.

  • Thanks! But that is what I understood from the Wikipedia article, unless it is wrong. (2011-10-09)
  • @Tim, WP is right on this, you might wish to read the paragraph again. (2011-10-09)
  • Wikipedia says the normal distribution approximates the Poisson distribution under certain circumstances. But that doesn't mean the normal is approaching the Poisson. It's the Poisson that's approaching the normal. (2011-10-09)
  • @Didier and Michael: Thanks! According to the Wiki article, (1) in the binomial distribution part, the convergence is when $n \rightarrow \infty$ with $np$ fixed, not $\lambda \rightarrow \infty$. Note $\lambda = np$. (2) in the normal distribution part, the convergence is $\lim_{\lambda \rightarrow \infty} \text{discrepancy}( F_\mathrm{Poisson}(x;\lambda), F_\mathrm{normal}(x;\mu=\lambda,\sigma^2=\lambda)) = 0$, i.e. the two cdfs are approaching each other under some measure of discrepancy, instead of one approaching the other. (2011-10-09)
  • (3) Michael, in your reply, when you mentioned that the Poisson distributions approach the standard normal distribution, is it the same as the Central Limit Theorem, or are there some differences? (2011-10-09)
  • If $\lambda$ takes only integer values while approaching $\infty$, then the result follows from the central limit theorem together with the fact that the sum of $\lambda$ independent random variables with expectation $1$ is a Poisson random variable with expectation $\lambda$. Allowing $\lambda$ to take real values complicates the argument a bit, but that's not hard to overcome. (2011-10-10)
  • Michael, thanks for answering my question (3). I wonder why "the sum of $\lambda$ independent random variables with expectation $1$ is a Poisson random variable with expectation $\lambda$"? Also, do you have some idea as to questions (1) and (2) in my previous comment? (2011-10-10)
  • @Tim, I see no question in your (1) and (2). (2011-10-10)
  • @Didier: Referring to Michael's reply and my previous comment: for my (1), I would like to know if $n \rightarrow \infty$ with $np$ fixed is correct, or $\lambda \rightarrow \infty$ in Michael's reply is correct; for my (2), I would like to know if my understanding of the two kinds of distributions approaching each other is correct, or one distribution approaching another in Michael's reply is correct. If I am correct, in what way do the two approach each other? (Let me know if I am still not clear. Thanks!) (2011-10-10)
  • @Tim, thanks for the explanations, I was too hasty in my first reading. In your (1), $\lambda\to\infty$ is a typo and should be replaced by $n\to\infty$ and $p\to0$ in a way such that $np\to\lambda$ with $\lambda$ positive and finite. In your (2), the mathematical result is that any $X_\lambda$ Poisson$(\lambda)$ are such that $(X_\lambda-\lambda)/\sqrt{\lambda}$ converges in distribution to a standard Gaussian $Z$. Since $Z_\lambda=\sqrt{\lambda}Z+\lambda$ is a Gaussian with mean $\lambda$ and variance $\lambda$, one could be tempted to write that $X_\lambda$ and $Z_\lambda$ have similar distributions. This needs some care since, for example, $X_\lambda$ is an integer with probability one and $Z_\lambda$ is an integer with probability zero. What you have, though, are bounds on the total variation distance between Poisson and normal random variables, often referred to by the generic name of *Stein's method*. (2011-10-10)
  • @Didier: Thanks! Nice to learn from you. (2011-10-10)
  • Michael, I wonder why "the sum of $\lambda$ independent random variables with expectation $1$ is a Poisson random variable with expectation $\lambda$"? (2011-10-10)
  • @Tim: Generally, the sum of two independent Poisson-distributed random variables is also a Poisson-distributed random variable. And once it works for two of them, it works for any finite number of them, as can be shown by mathematical induction. If $X\sim\mathrm{Poisson}(\lambda)$ and $Y\sim\mathrm{Poisson}(\mu)$, then by the usual additivity of expectations, the expected value of $X+Y$ must be $\lambda+\mu$ (whether $X$ and $Y$ are independent or not), so _if_ $X+Y$ is Poisson-distributed, it must be that $X+Y\sim\mathrm{Poisson}(\lambda+\mu)$. So the question is: why must we have $X+Y\sim\mathrm{Poisson}(\lambda+\mu)$, if the assumptions above hold? I'll post a separate answer treating that question. (2011-10-10)
  • Michael, thanks! I thought you were saying $X$ and $Y$ have arbitrary distributions as long as they are independent with expectation $1$. Now I don't think so. (2011-10-10)
  • I would have thought that the "Now suppose ..." paragraph should end "... as $n \to \infty$" rather than "... as $\lambda \to \infty$". (2013-09-20)
  • @Henry: Correct. I've fixed it. Clearly $\lambda$ cannot be what's approaching $\infty$ if $\lambda$ is part of the value of the limit. (2013-09-20)
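Following up on Tim's question (2) and the Stein's-method remark in the comments above: one concrete measure of discrepancy between the two cdfs is the Kolmogorov (sup) distance. A minimal sketch of how it shrinks as $\lambda$ grows (assuming `numpy` and `scipy`; evaluating only at integer points on a truncated grid is a simplification):

```python
# Kolmogorov (sup) distance between the Poisson(lam) cdf and the
# normal(lam, lam) cdf, evaluated on an (arbitrarily) truncated grid
# of integers around the mean.  It shrinks roughly like 1/sqrt(lam).
import numpy as np
from scipy.stats import norm, poisson

for lam in (10, 100, 1000, 10000):
    k = np.arange(max(0, int(lam - 10 * lam ** 0.5)),
                  int(lam + 10 * lam ** 0.5))
    gap = np.abs(poisson.cdf(k, lam) - norm.cdf(k, lam, lam ** 0.5))
    print(lam, gap.max())
```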

In comments above, Tim asked why it must be that if $X\sim\mathrm{Poisson}(\lambda)$ and $Y\sim\mathrm{Poisson}(\mu)$ and $X$ and $Y$ are independent, then we must have $X+Y\sim\mathrm{Poisson}(\lambda+\mu)$.

Here's one way to show that. $$ \begin{align} & \Pr(X+Y= w) \\ & = \Pr\Big( (X=0\ \& \ Y=w)\text{ or }(X=1\ \&\ Y=w-1)\text{ or }(X=2\ \&\ Y=w-2)\text{ or }\ldots \\ & {}\qquad\qquad\qquad\ldots\text{ or }(X=w\ \&\ Y=0)\Big) \\ & = \sum_{u=0}^w \Pr(X=u)\Pr(Y=w-u)\qquad(\text{independence was used here}) \\ & = \sum_{u=0}^w \frac{\lambda^u e^{-\lambda}}{u!} \cdot \frac{\mu^{w-u} e^{-\mu}}{(w-u)!} \\ & = e^{-(\lambda+\mu)} \sum_{u=0}^w \frac{1}{u!(w-u)!} \lambda^u\mu^{w-u} \\ & = \frac{e^{-(\lambda+\mu)}}{w!} \sum_{u=0}^w \frac{w!}{u!(w-u)!} \lambda^u\mu^{w-u} \\ & = \frac{e^{-(\lambda+\mu)}}{w!} (\lambda+\mu)^w \qquad(\text{by the binomial theorem}) \end{align} $$ and that is what was to be shown.
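As a numerical sanity check of this convolution identity (a sketch assuming `scipy`; $\lambda=2$ and $\mu=3$ are arbitrary choices):

```python
# Check numerically that the convolution of Poisson(lam) and Poisson(mu)
# pmfs equals the Poisson(lam + mu) pmf.  lam, mu are arbitrary choices.
from scipy.stats import poisson

lam, mu = 2.0, 3.0
for w in range(10):
    conv = sum(poisson.pmf(u, lam) * poisson.pmf(w - u, mu)
               for u in range(w + 1))
    print(w, conv, poisson.pmf(w, lam + mu))  # the two columns agree
```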