
Assume we have $X_1,\dots,X_n$ independent Poisson random variables. What is the cdf or pmf of $\sum_{i=1}^{n} X_i$?

3 Answers


Using the Central Limit Theorem, it would be approximately a normal random variable.
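To make this precise (a sketch, using the exact result derived in the answers below): the sum is exactly
$$\sum_{i=1}^{n} X_i \sim \operatorname{Poisson}(\Lambda), \qquad \Lambda=\sum_{i=1}^{n}\lambda_i,$$
and a $\operatorname{Poisson}(\Lambda)$ variable is approximately $N(\Lambda,\Lambda)$ when $\Lambda$ is large, which is the sense in which the normal approximation applies.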

  • A Poisson random variable arises as a limit of binomial random variables, and a sum of many such variables would tend to a normal random variable (as $n\to\infty$). Here, however, we have a sum of a fixed number of Poisson variables (fixed $n$), which is itself a Poisson variable. (2012-11-16)

Try to prove the following result by induction:

Suppose $X_1, X_2, \dots, X_n$ are independent random variables, with each $X_i$ following a Poisson distribution with parameter $\lambda_i$. Then the random variable $Z=\sum_{i=1}^n X_i$ follows a Poisson distribution with parameter $\sum\limits_{i=1}^n\lambda_i$.

I'll add a proof if you still don't see one.

Hint: Obtain the probability generating function of a Poisson random variable. Knowing that the distributions of non-negative integer-valued random variables are in one-to-one correspondence with probability generating functions, and that the product of probability generating functions is the probability generating function of the sum (given independence), you can cook up a recipe for the proof.

Disclaimer: There are elementary proofs, but they are a little painful. The proof above is simple if you have the necessary background; otherwise, this method is even more painful.
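In case it helps, here is a sketch of the generating-function argument (nothing beyond the hint above is assumed). For $X_i\sim\operatorname{Poisson}(\lambda_i)$,
$$G_{X_i}(s)=\mathbb{E}\big[s^{X_i}\big]=\sum_{k=0}^{\infty}s^k\,\frac{\lambda_i^k}{k!}e^{-\lambda_i}=e^{\lambda_i(s-1)}.$$
By independence,
$$G_Z(s)=\prod_{i=1}^{n}G_{X_i}(s)=\exp\!\Big(\Big(\sum_{i=1}^{n}\lambda_i\Big)(s-1)\Big),$$
which is the probability generating function of a $\operatorname{Poisson}\big(\sum_{i=1}^n\lambda_i\big)$ distribution; since the generating function determines the distribution, $Z\sim\operatorname{Poisson}\big(\sum_{i=1}^n\lambda_i\big)$.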

  • I don't quite understand what you mean there. Sorry. Being a bit more elaborate would help. :) (2012-03-07)

In the case where you have two variables, the following holds:

Suppose $X_1$ has Poisson distribution with parameter $\lambda_1$ and $X_2$ has Poisson distribution with parameter $\lambda_2$. Then if $X_1$ and $X_2$ are independent, the variable $X_1+X_2$ has Poisson distribution with parameter $\lambda_1+\lambda_2$.

This is intuitively clear if we regard the variables as relating to Poisson processes with common unit time. $X_1$ gives the number of events occurring in a unit time period where the average number of events per unit time period is $\lambda_1$. $X_2$ gives the number of events occurring in a unit time period where the average number of events per unit time period is $\lambda_2$. By the independence assumption, the total number of events from both processes occurring in a unit time period would be $X_1+X_2$, and the average number of these events per unit time period would be $\lambda_1+\lambda_2$. So, $X_1+X_2$ has Poisson distribution with parameter $\lambda_1+\lambda_2$.
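As a concrete (hypothetical) illustration: if calls arrive at a help desk at an average rate of $\lambda_1=2$ per hour and, independently, emails arrive at an average rate of $\lambda_2=3$ per hour, then the total number of contacts in one hour is $X_1+X_2\sim\operatorname{Poisson}(5)$, so for example $P[X_1+X_2=4]=\frac{5^4}{4!}e^{-5}\approx 0.175$.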

Rigorously, we can compute the probability mass function, $p_Y$, of $Y=X_1+X_2$ as follows:

For our variables $X_1$ and $X_2$, we have for $i\ge0$: $P[X_1=i]= {\lambda_1^i\over i!} e^{-\lambda_1}\quad\text{and}\quad P[X_2=i]= {\lambda_2^i\over i!} e^{-\lambda_2}.$

Let $k\ge0$. Then:
$$
\eqalign{
p_Y(k) &=\sum_{i=0}^k P[X_1=i,\,X_2=k-i]\cr
&=\sum_{i=0}^k P[X_1=i]\cdot P[X_2=k-i]\cr
&=\sum_{i=0}^k{\lambda_1^i\over i!}e^{-\lambda_1}\cdot{\lambda_2^{k-i}\over(k-i)!}e^{-\lambda_2}\cr
&=\sum_{i=0}^k{\lambda_1^i\lambda_2^{k-i}\over i!\,(k-i)!}e^{-(\lambda_1+\lambda_2)}\cr
&= e^{-(\lambda_1+\lambda_2)}\cdot\sum_{i=0}^k{\lambda_1^i\lambda_2^{k-i}\over i!\,(k-i)!}\cr
&={1\over k!}\cdot e^{-(\lambda_1+\lambda_2)}\cdot\sum_{i=0}^k{k!\over i!\,(k-i)!}\,\lambda_1^i\lambda_2^{k-i}\cr
&={(\lambda_1+\lambda_2)^k\over k!}\cdot e^{-(\lambda_1+\lambda_2)},\cr
}
$$
where the second equality above used the independence of $X_1$ and $X_2$ and the last equality used the Binomial Theorem.

So, $ p_Y(k)= {(\lambda_1+\lambda_2)^k\over k!}\cdot e^{-(\lambda_1+\lambda_2)},\quad k\ge 0 ; $ which we recognize as the Poisson distribution with parameter $\lambda_1+\lambda_2$.

The result for $n\ge 1$ now follows easily by induction.
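Sketch of the inductive step (using the two-variable result above): if $S_{n-1}=\sum_{i=1}^{n-1}X_i$ has Poisson distribution with parameter $\sum_{i=1}^{n-1}\lambda_i$, then $S_{n-1}$ is a function of $X_1,\dots,X_{n-1}$ only, hence independent of $X_n$, and the two-variable result gives
$$S_n=S_{n-1}+X_n\ \sim\ \operatorname{Poisson}\Big(\sum_{i=1}^{n-1}\lambda_i+\lambda_n\Big)=\operatorname{Poisson}\Big(\sum_{i=1}^{n}\lambda_i\Big).$$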