
In probability, one classic result is that a binomial distribution converges to a Poisson distribution as $n \to \infty$. That is, suppose $X \sim \text{Bin}(n, p)$ with moment-generating function $$M_{X}(t) = [pe^{t}+(1-p)]^{n}$$ and $Y \sim \text{Poi}(\lambda)$ with moment-generating function $$M_{Y}(t) = e^{\lambda(e^{t}-1)}\text{.}$$ Then, with $\lambda = np$, $\lim_{n \to \infty}M_X(t) = M_{Y}(t)$.

Showing this would involve writing $$M_{X}(t) = \left(1+\dfrac{np(e^{t}-1)}{n} \right)^{n} = \left(1+\dfrac{\lambda(e^{t}-1)}{n} \right)^{n}\text{.}$$ My problem with this is the following: isn't the standard result that, for a constant $x$ (that is, one not depending on $n$), $$\lim_{n \to \infty}\left(1+\dfrac{x}{n}\right)^{n} = e^{x}\text{?}$$ That doesn't seem to apply here, since $\lambda$ depends on $n$. Or are there conditions imposed on $p$ or $\lambda$ that guarantee the limit?
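As a quick numerical sanity check of the limit (a sketch, assuming illustrative values $\lambda = 4$ and $t = 0.5$, with $\lambda$ held fixed and $p = \lambda/n$ shrinking as $n$ grows):

```python
import math

lam = 4.0   # assumed example value of lambda = n*p, held fixed
t = 0.5     # assumed example argument of the MGF

def binom_mgf(n):
    """Binomial MGF [p*e^t + (1-p)]^n with p = lam/n, so n*p = lam for every n."""
    p = lam / n
    return (p * math.exp(t) + (1 - p)) ** n

# Poisson MGF e^{lambda(e^t - 1)}, the claimed limit
poisson_mgf = math.exp(lam * (math.exp(t) - 1))

for n in (10, 100, 10_000, 1_000_000):
    print(n, binom_mgf(n), poisson_mgf)
```

The printed binomial MGF values approach the Poisson MGF as $n$ grows, which is consistent with the limit holding when $\lambda$ is fixed and $p$ varies with $n$.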

  • Not my field, but is $p$ constant? (2017-01-02)
  • @Lubin Yes, usually $p \in (0, 1)$. (2017-01-02)
  • Then the limit of the middle member of your three-member display is clearly $\exp(p(e^t-1))$, unless of course $t$ is varying with $n$. (2017-01-02)
  • @Clarinetist I believe $\lambda$ is fixed so that the product $np$ is constant ($p$ depends on $n$). It's like how the equation $1=\cos^2(\theta)+\sin^2(\theta)$ depends on $\theta$ on the right but not the left. (2017-01-02)
  • Yes, @user375366, certainly all depends on what's constant and what's varying with $n$. (2017-01-02)
  • Thank you @Lubin, @user375366! (2017-01-02)
  • In your context, $x$ does not need to be a constant (see Casella & Berger, p. 67, http://people.unica.it/musio/files/2008/10/Casella-Berger.pdf). For the assumptions, you only need that $np$ is relatively small; then set $\lambda = np$ in the limit used in Lemma 2.3.14 of Casella & Berger. Hope this helps a bit. (2017-01-02)
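The comments' resolution (hold $\lambda$ fixed, let $p = \lambda/n$ depend on $n$) can also be seen directly at the level of the pmfs. A minimal sketch, assuming the illustrative value $\lambda = 4$:

```python
import math

lam = 4.0   # assumed example value; lambda = n*p stays constant as n grows

def binom_pmf(k, n):
    """Bin(n, p) pmf with p = lam/n, so the mean n*p equals lam for every n."""
    p = lam / n
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k):
    """Poi(lam) pmf."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Largest pointwise gap over k = 0..9 shrinks as n grows
for n in (10, 100, 10_000):
    max_gap = max(abs(binom_pmf(k, n) - poisson_pmf(k)) for k in range(10))
    print(n, max_gap)
```

The shrinking gap illustrates that the convergence is driven by $n \to \infty$ with $np$ fixed, not by $p$ being constant.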

0 Answers