
From Wikipedia:

  1. In some cases, the cdf of the Poisson distribution is the limit of the cdf of the binomial distribution:

    The Poisson distribution can be derived as a limiting case of the binomial distribution $\text{bin}(n,p)$ as the number $n$ of trials goes to infinity while the expected number $np$ of successes remains fixed (the law of rare events). Therefore it can be used as an approximation of the binomial distribution if $n$ is sufficiently large and $p$ is sufficiently small.

  2. In some cases, the cdf of the Poisson distribution is the limit of the cdf of the normal distribution:

    For sufficiently large values of $\lambda$ (say $\lambda>1000$), the normal distribution with mean $\lambda$ and variance $\lambda$ (standard deviation $\sqrt{\lambda}$) is an excellent approximation to the Poisson distribution. If $\lambda$ is greater than about $10$, then the normal distribution is a good approximation if an appropriate continuity correction is performed, i.e., $P(X \le x)$, where (lower-case) $x$ is a non-negative integer, is replaced by $P(X \le x + 0.5)$:
    $$F_\mathrm{Poisson}(x;\lambda) \approx F_\mathrm{normal}(x;\mu=\lambda,\sigma^2=\lambda).$$

I wonder in what best senses these functional approximations hold: pointwise, uniformly, in $L_2$, ...? Thanks and regards!
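
For concreteness, here is a small numerical sketch of the continuity-corrected approximation quoted in item 2 (assuming SciPy is available; $\lambda=15$ and the range of $x$ are arbitrary choices, not part of the quoted statement):

```python
import numpy as np
from scipy.stats import norm, poisson

lam = 15.0                             # arbitrary rate, large enough for the "lambda > 10" rule of thumb
x = np.arange(0, 40)                   # non-negative integers covering essentially all the mass

exact = poisson.cdf(x, lam)            # P(X <= x) for X ~ Poisson(lam)
approx = norm.cdf(x + 0.5, loc=lam, scale=np.sqrt(lam))   # continuity-corrected normal cdf

print(f"max |exact - approx| = {np.max(np.abs(exact - approx)):.4f}")
```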

2 Answers


"In some cases, the cdf of the Poisson distribution is the limit of the cdf of the normal distribution:"

That is false. The limit of Poisson distributions is a normal distribution; the limit of normal distributions is not a Poisson distribution.

"Best senses" is a term I've never come across and I don't know what you mean by it.

The cdfs of (suitably standardized) Poisson distributions converge pointwise to the cdf of the standard normal distribution. Suppose $X\sim\mathrm{Poisson}(\lambda)$. Then the cdf of $(X-\lambda)/\sqrt{\lambda}$ converges pointwise to the cdf of the standard normal distribution as $\lambda\to\infty$. (Since the limiting cdf is continuous, this pointwise convergence is in fact uniform, by Pólya's theorem.)
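
A quick numerical sketch of this convergence (assuming SciPy; the grid of evaluation points and the values of $\lambda$ are arbitrary choices):

```python
import numpy as np
from scipy.stats import norm, poisson

# P((X - lam)/sqrt(lam) <= t) = P(X <= floor(lam + t*sqrt(lam))) for X ~ Poisson(lam)
t = np.linspace(-4, 4, 801)            # arbitrary grid of evaluation points
for lam in [10, 100, 1000, 10000]:
    cdf_std = poisson.cdf(np.floor(lam + t * np.sqrt(lam)), lam)
    gap = np.max(np.abs(cdf_std - norm.cdf(t)))
    print(f"lambda = {lam:6d}:  max gap on grid = {gap:.4f}")
```

The printed gap shrinks as $\lambda$ grows, illustrating the convergence of the standardized Poisson cdf to the standard normal cdf.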

Now suppose $X_n\sim\mathrm{Bin}(n,\lambda/n)$. Then the cdf of $X_n$ converges pointwise to the cdf of the Poisson distribution with expectation $\lambda$ as $n\to\infty$.
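
A small numerical sketch of this limit as well (assuming SciPy; $\lambda$ and the values of $n$ are arbitrary choices):

```python
import numpy as np
from scipy.stats import binom, poisson

lam = 4.0                              # arbitrary fixed expectation n*p = lambda
k = np.arange(0, 40)                   # enough integers to cover essentially all the mass
for n in [10, 100, 1000, 10000]:
    gap = np.max(np.abs(binom.cdf(k, n, lam / n) - poisson.cdf(k, lam)))
    print(f"n = {n:6d}:  max |F_bin - F_Poisson| = {gap:.5f}")
```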

Generally "convergence in distribution" means a sequence of cdf's converges to a cdf pointwise except that it need not converge at points where the limiting cdf is not continuous. The reason for the exception is things like this: Concentrate probability $1$ at $1/n$. Then the value of the cdf at $0$ is $0$. But the limiting cdf has value $1$ at $0$; that's where it concentrates all the probability.

But don't write about normal distributions approaching a Poisson distribution. It's the other way around.

  • @Henry : Correct. I've fixed it. Clearly $\lambda$ cannot be what's approaching $\infty$ if $\lambda$ is part of the value of the limit. (2013-09-20)

In comments above, Tim asked why, if $X\sim\mathrm{Poisson}(\lambda)$ and $Y\sim\mathrm{Poisson}(\mu)$ and $X$ and $Y$ are independent, we must have $X+Y\sim\mathrm{Poisson}(\lambda+\mu)$.

Here's one way to show that.
$$\begin{align} & \Pr(X+Y= w) \\ & = \Pr\Big( (X=0\ \&\ Y=w)\text{ or }(X=1\ \&\ Y=w-1)\text{ or }(X=2\ \&\ Y=w-2)\text{ or }\cdots \\ & {}\qquad\qquad\qquad\cdots\text{ or }(X=w\ \&\ Y=0)\Big) \\ & = \sum_{u=0}^w \Pr(X=u)\Pr(Y=w-u)\qquad(\text{independence was used here}) \\ & = \sum_{u=0}^w \frac{\lambda^u e^{-\lambda}}{u!} \cdot \frac{\mu^{w-u} e^{-\mu}}{(w-u)!} \\ & = e^{-(\lambda+\mu)} \sum_{u=0}^w \frac{1}{u!\,(w-u)!} \lambda^u\mu^{w-u} \\ & = \frac{e^{-(\lambda+\mu)}}{w!} \sum_{u=0}^w \frac{w!}{u!\,(w-u)!} \lambda^u\mu^{w-u} \\ & = \frac{e^{-(\lambda+\mu)}}{w!} (\lambda+\mu)^w \end{align}$$
and that is what was to be shown (the last step is the binomial theorem).
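
Not part of the proof, but as a quick numerical sanity check one can convolve truncated Poisson pmfs and compare against the $\mathrm{Poisson}(\lambda+\mu)$ pmf (assuming SciPy; the rates and the truncation point $N$ are arbitrary choices, with $N$ large enough that the tail mass beyond it is negligible):

```python
import numpy as np
from scipy.stats import poisson

lam, mu = 3.0, 5.0            # example rates (arbitrary choice)
N = 200                       # truncation point; tail mass beyond N is negligible here

k = np.arange(N + 1)
pmf_x = poisson.pmf(k, lam)   # pmf of X ~ Poisson(lam)
pmf_y = poisson.pmf(k, mu)    # pmf of Y ~ Poisson(mu)

# pmf of X + Y on {0, ..., N} via discrete convolution (the sum over u above)
pmf_sum = np.convolve(pmf_x, pmf_y)[: N + 1]

# compare with the Poisson(lam + mu) pmf
err = np.max(np.abs(pmf_sum - poisson.pmf(k, lam + mu)))
print(f"max |convolution - Poisson(lam+mu)| on 0..{N}: {err:.2e}")
```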