
Question Is this supposed to mean that I have to actually find a Poisson-distributed random variable that, for a fixed binomially distributed random variable, approximates it on the values where their images coincide? (Note that a Poisson-distributed random variable necessarily has image $\mathbb{N}$ (including $0$), whereas a binomially distributed random variable has image only $\left\{ 0,\ldots,n\right\} $.)

How is this approximation to be understood on the level of random variables (in the illustrative example below I have written it out formally)? It seems to me that this is just a "pointwise" approximation, where the points are the values in the image of the random variable...

Illustrative Example I have to calculate, using both the binomial and the Poisson distribution, the probability of getting the number $1$ at most $4$ times in a series of $1000$ games, where in each game one number is picked at random out of the set $\left\{ 1,\ldots,50\right\} $.

I know that if $X$ is the random variable that counts how often a $1$ has come up, then the distribution of $X$ is the binomial distribution, since each game is a Bernoulli trial (either a $1$ comes up - with probability $p=\frac{1}{50}$ - or it doesn't), so I only have to calculate $ \sum_{k=0}^{4}P\left(X=k\right), $

which is not a difficult task!

But for the Poisson distribution a problem arises: we proved in our course a theorem concerning the binomial and Poisson distributions, which says that the latter approximates the former when the number of Bernoulli trials is very large. Formal statement: If $p_{n}$ is a sequence in the interval $\left[0,1\right]$ with $np_{n}\rightarrow\lambda$, then $ P\left(X=k\right)=\binom{n}{k}p_{n}^{k}(1-p_{n})^{n-k}\rightarrow e^{-\lambda}\frac{\lambda^{k}}{k!}\ \text{for}\ n\rightarrow\infty. $
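(As a quick numerical sanity check, not part of the course material: the convergence in the theorem can be observed directly. A minimal Python sketch, assuming we fix $\lambda$ and set $p_{n}=\lambda/n$:)

```python
from math import comb, exp, factorial

def binom_pmf(n, p, k):
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(lam, k):
    """P(Y = k) for Y ~ Poisson(lam)."""
    return exp(-lam) * lam**k / factorial(k)

# Fix lambda and let n grow with p_n = lambda / n; the binomial
# probabilities approach the Poisson probability at each fixed k.
lam, k = 2.0, 3
for n in (10, 100, 1000, 10000):
    print(n, binom_pmf(n, lam / n, k), poisson_pmf(lam, k))
```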

Now I could of course just say that $ P\left(X=k\right)\approx e^{-\lambda}\frac{\lambda^{k}}{k!}\ \text{for}\ \lambda=np $

and then just calculate $ \sum_{k=0}^{4}e^{-np}\frac{(np)^{k}}{k!}. $
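(For what it's worth, both sums are easy to evaluate numerically; a short Python sketch with the example's values $n=1000$, $p=1/50$, so $\lambda=np=20$:)

```python
from math import comb, exp, factorial

n, p = 1000, 1 / 50   # 1000 games, success probability 1/50
lam = n * p           # lambda = np = 20

# Exact binomial probability P(X <= 4)
binom_sum = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(5))

# Poisson approximation with lambda = 20
poisson_sum = sum(exp(-lam) * lam**k / factorial(k) for k in range(5))

print(binom_sum, poisson_sum)
```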

But this just doesn't seem right, since there isn't any random variable involved that actually is $"e^{-np}\frac{(np)^{k}}{k!}"$-distributed. And I thought I would have to exhibit such a random variable and then somehow use the above approximation, which is problematic because their images have (as noted at the start) different cardinalities.

  • No, that is **not at all** the way it is done. One looks at the two *cumulative distribution functions* $F_X(x)$ and $F_Y(y)$, or more properly, when there is a sequence of random variables, $F_{X_n}(x)$ and $F_Y(x)$. Then we compare $F_{X_n}(x)$ and $F_Y(x)$, which are just ordinary functions with the same domain. There is an important complication: we compare the two (or many) cdfs only at points of continuity. My post above was meant to give a quick intuition suited to this particular case, without the technical complications. (2012-05-02)

1 Answer


The random variable is $X$. I understand that their cardinalities are different, but the core concept is that the probability mass function of a binomial distribution $B(n,p)$ converges to that of a Poisson distribution $\pi(np)$ (as $n\rightarrow\infty$, $p\rightarrow0$). That being said, if $Y\sim\pi(np)$, the probability that $Y\ge n+1$ is negligible, so ignoring those values will not introduce any serious problem.
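(To quantify "negligible": a standard Chernoff-type bound for the Poisson distribution, $P\left(Y\ge m\right)\le e^{-\lambda}\left(e\lambda/m\right)^{m}$ for $m>\lambda$, shows the tail beyond $n$ is astronomically small for the example's values $\lambda=20$, $m=n+1=1001$. This bound is a standard fact, not taken from the answer; a small Python sketch:)

```python
from math import log

lam, m = 20.0, 1001   # lambda = np = 20, m = n + 1

# log10 of the Chernoff-type bound exp(-lam) * (e * lam / m)**m,
# computed in log space to avoid underflow.
log10_bound = (-lam + m * (1 + log(lam / m))) / log(10)

print(log10_bound)    # around -1275: P(Y >= 1001) is vanishingly small
```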

Please be advised that it is the probability mass function of the binomial distribution that converges to the probability mass function of the Poisson distribution (under the aforementioned conditions, of course). This Poisson limit theorem doesn't say anything about one random variable converging to another. See wiki: http://en.wikipedia.org/wiki/Poisson_limit_theorem.

  • Sorry, I meant $X$ (since we are considering the probability that $X=k$... I admit I made a mistake). Please have a look at my edited answer. (2012-05-02)