44

$X \sim \mathcal{P}(\lambda)$ and $Y \sim \mathcal{P}(\mu)$, meaning that $X$ and $Y$ are Poisson random variables. What is the probability distribution of $X + Y$? I know it is $X+Y \sim \mathcal{P}(\lambda + \mu)$, but I don't understand how to derive it.

  • 0
    Try using the method of moment generating functions :) (2013-11-25)
  • 0
    All I've learned is the definition of a Poisson random variable; is there a simpler way? (2013-11-25)
  • 4
    If they are `independent`. (2013-11-25)
  • 0
    Doesn’t it suffice that their covariance vanishes? (2018-02-01)

7 Answers

81

This only holds if $X$ and $Y$ are independent, so we suppose this from now on. We have for $k \ge 0$: \begin{align*} P(X+ Y =k) &= \sum_{i = 0}^k P(X+ Y = k, X = i)\\ &= \sum_{i=0}^k P(Y = k-i , X =i)\\ &= \sum_{i=0}^k P(Y = k-i)P(X=i)\\ &= \sum_{i=0}^k e^{-\mu}\frac{\mu^{k-i}}{(k-i)!}e^{-\lambda}\frac{\lambda^i}{i!}\\ &= e^{-(\mu + \lambda)}\frac 1{k!}\sum_{i=0}^k \frac{k!}{i!(k-i)!}\mu^{k-i}\lambda^i\\ &= e^{-(\mu + \lambda)}\frac 1{k!}\sum_{i=0}^k \binom ki\mu^{k-i}\lambda^i\\ &= \frac{(\mu + \lambda)^k}{k!} \cdot e^{-(\mu + \lambda)} \end{align*} Hence, $X+ Y \sim \mathcal P(\mu + \lambda)$.
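As a quick numerical sanity check of this convolution result (not part of the argument above), here is a minimal Python sketch comparing the empirical distribution of $X+Y$ against the $\mathcal P(\lambda+\mu)$ pmf. The parameter values $\lambda=2$, $\mu=3$ and the sample size are arbitrary illustrative choices.

```python
# Sanity check by simulation: empirical law of X + Y vs. Poisson(lam + mu).
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
lam, mu, n = 2.0, 3.0, 100_000    # illustrative values

x = rng.poisson(lam, n)           # X ~ P(lam)
y = rng.poisson(mu, n)            # Y ~ P(mu), drawn independently of x
s = x + y

for k in range(10):
    print(f"k={k}: empirical={np.mean(s == k):.4f}, "
          f"Poisson(lam+mu) pmf={poisson.pmf(k, lam + mu):.4f}")
```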

  • 1
    Thank you! But what happens if they are not independent? (2012-10-25)
  • 8
    In general we can't say anything then. It depends on how they depend on one another. (2012-10-25)
  • 1
    Thank you! It's very simple and I feel like a complete idiot. (2012-10-25)
  • 1
    Nice derivation: specifically the transformation of (a) the $i$/$k$ factorials and (b) the $\mu$/$\lambda$ polynomials into the binomial form of the polynomial power expression. (2014-08-30)
  • 0
    Thanks for that answer, but what I don't get is how you've taken $1/k!$ outside the sum. Is it because $k! \cdot 1/k!$ would be 1? (2017-08-08)
  • 1
    @LiorA Yes. The $k!$ is introduced inside the sum to build the binomial coefficients, and the $1/k!$ outside compensates for it. (2018-01-07)
  • 1
    If the sum is indeed Poisson, can we conversely say that $X$ and $Y$ must be independent? You said at the beginning that this only holds if they are independent. I am struggling to come up with a proof here. (2018-03-16)
18

Another approach is to use characteristic functions. If $X\sim \mathrm{po}(\lambda)$, then the characteristic function of $X$ is (if this is unknown, just calculate it) $$ \varphi_X(t)=E[e^{itX}]=e^{\lambda(e^{it}-1)},\quad t\in\mathbb{R}. $$ Now suppose that $X$ and $Y$ are independent Poisson distributed random variables with parameters $\lambda$ and $\mu$ respectively. Then due to the independence we have that $$ \varphi_{X+Y}(t)=\varphi_X(t)\varphi_Y(t)=e^{\lambda(e^{it}-1)}e^{\mu(e^{it}-1)}=e^{(\mu+\lambda)(e^{it}-1)},\quad t\in\mathbb{R}. $$ As the characteristic function completely determines the distribution, we conclude that $X+Y\sim\mathrm{po}(\lambda+\mu)$.
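One can check this characteristic-function identity numerically as well. The sketch below (illustrative parameter values, my own choice of NumPy) compares the empirical characteristic function of a simulated $X+Y$ with the closed form $e^{(\lambda+\mu)(e^{it}-1)}$ at a few points $t$.

```python
# Compare the empirical characteristic function of X + Y with the closed form.
import numpy as np

rng = np.random.default_rng(1)
lam, mu, n = 2.0, 3.0, 200_000    # illustrative values

s = rng.poisson(lam, n) + rng.poisson(mu, n)   # X + Y, with X and Y independent

for t in (0.3, 1.0, 2.5):
    empirical = np.mean(np.exp(1j * t * s))               # estimate of E[e^{itS}]
    theoretical = np.exp((lam + mu) * (np.exp(1j * t) - 1))
    print(f"t={t}: empirical={empirical:.3f}, theoretical={theoretical:.3f}")
```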

8

You can use the probability generating function (P.G.F.). Since the Poisson distribution is a discrete probability distribution, the P.G.F. is a natural tool here. Let $X$ and $Y$ be independent random variables following $\mathrm{Po}(\lambda)$ and $\mathrm{Po}(\mu)$ respectively. The P.G.F. of $X$ is \begin{equation*} \begin{split} P_X(t) = E[t^X]&= \sum_{x=0}^{\infty}t^xe^{-\lambda}\frac{\lambda^x}{x!}\\ &=\sum_{x=0}^{\infty}e^{-\lambda}\frac{(\lambda t)^x}{x!}\\ &=e^{-\lambda}e^{\lambda t}\\ &=e^{-\lambda (1-t)}\\ \end{split} \end{equation*} The P.G.F. of $Y$ is \begin{equation*} \begin{split} P_Y(t) = E[t^Y]&= \sum_{y=0}^{\infty}t^ye^{-\mu}\frac{\mu^y}{y!}\\ &=\sum_{y=0}^{\infty}e^{-\mu}\frac{(\mu t)^y}{y!}\\ &=e^{-\mu}e^{\mu t}\\ &=e^{-\mu (1-t)}\\ \end{split} \end{equation*}

Now consider the P.G.F. of $U = X+Y$. As $X$ and $Y$ are independent, \begin{equation*} \begin{split} P_U(t)=P_{X+Y}(t)=E[t^{X+Y}]=E[t^X t^Y]&= E[t^X]E[t^Y] = P_X(t)P_Y(t)\\ &= e^{-\lambda (1-t)}e^{-\mu (1-t)}\\ &= e^{-(\lambda+\mu) (1-t)}\\ \end{split} \end{equation*}

This is the P.G.F. of the $\mathrm{Po}(\lambda + \mu)$ distribution, and since the P.G.F. determines the distribution, we can conclude that $U = X+Y$ follows $\mathrm{Po}(\lambda+\mu)$.
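For readers who want to replay this computation symbolically, here is a small SymPy sketch; the symbol names and use of SymPy are my own illustrative choices, not part of the answer.

```python
# Symbolic replay of the P.G.F. computation above.
import sympy as sp

t, lam, mu = sp.symbols('t lambda mu', positive=True)
k = sp.symbols('k', integer=True, nonnegative=True)

# P.G.F. of Po(lambda), summed directly from the definition E[t^X].
pgf_X = sp.exp(-lam) * sp.summation((lam * t)**k / sp.factorial(k), (k, 0, sp.oo))
print(sp.simplify(pgf_X - sp.exp(-lam * (1 - t))))         # 0

# The product of the two P.G.F.s equals the P.G.F. of Po(lambda + mu).
product = sp.exp(-lam * (1 - t)) * sp.exp(-mu * (1 - t))
print(sp.simplify(product - sp.exp(-(lam + mu) * (1 - t))))  # 0
```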

4

In short, you can show this by using the fact that $$Pr(X+Y=k)=\sum_{i=0}^kPr(X+Y=k, X=i).$$

If $X$ and $Y$ are independent, this is equal to $$ Pr(X+Y=k)=\sum_{i=0}^kPr(Y=k-i)Pr(X=i) $$ which is $$ \begin{align} Pr(X+Y=k)&=\sum_{i=0}^k\frac{e^{-\lambda_y}\lambda_y^{k-i}}{(k-i)!}\frac{e^{-\lambda_x}\lambda_x^i}{i!}\\ &=e^{-\lambda_y}e^{-\lambda_x}\sum_{i=0}^k\frac{\lambda_y^{k-i}}{(k-i)!}\frac{\lambda_x^i}{i!}\\ &=\frac{e^{-(\lambda_y+\lambda_x)}}{k!}\sum_{i=0}^k\frac{k!}{i!(k-i)!}\lambda_y^{k-i}\lambda_x^i\\ &=\frac{e^{-(\lambda_y+\lambda_x)}}{k!}\sum_{i=0}^k{k\choose i}\lambda_y^{k-i}\lambda_x^i \end{align} $$ The sum part is just $$ \sum_{i=0}^k{k\choose i}\lambda_y^{k-i}\lambda_x^i=(\lambda_y+\lambda_x)^k $$ by the binomial theorem. So the end result is $$ \begin{align} Pr(X+Y=k)&=\frac{e^{-(\lambda_y+\lambda_x)}}{k!}(\lambda_y+\lambda_x)^k \end{align} $$ which is the pmf of $Po(\lambda_y+\lambda_x)$.
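The finite sum above is exactly a discrete convolution, which can be checked numerically by convolving truncated pmf arrays. The truncation point and parameter values in this sketch are arbitrary illustrations.

```python
# Numerical check: convolving two truncated Poisson pmfs gives the Poisson(lam_x + lam_y) pmf.
import numpy as np
from scipy.stats import poisson

lam_x, lam_y, K = 2.0, 3.0, 40          # truncate at K; tail mass beyond K is negligible
k = np.arange(K + 1)

pmf_x = poisson.pmf(k, lam_x)
pmf_y = poisson.pmf(k, lam_y)

# np.convolve computes sum_i pmf_x[i] * pmf_y[n - i], exactly the sum in the answer.
conv = np.convolve(pmf_x, pmf_y)[:K + 1]
target = poisson.pmf(k, lam_x + lam_y)

print(np.max(np.abs(conv - target)))    # ~1e-16: agreement to machine precision
```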

  • 0
    Moderator notice: This answer was moved here as a consequence of merging two questions. This explains the small differences in notation. The OP's $\lambda$ is $\lambda_x$ here, and the OP's $\mu$ is $\lambda_y$. Otherwise there is no difference. (2015-04-23)
2

Using the moment generating function (MGF).

Let $X \sim \mathcal{P}(\lambda)$, $Y \sim \mathcal{P}(\mu)$, and $S=X+Y$, with $X$ and $Y$ independent.
The MGF of $\mathcal{P}(\lambda)$ is $e^{\lambda(e^t-1)}$ (see the end of this answer for a proof).
The MGF of $S$ is $$\begin{align} M_S(t)&=E[e^{tS}]\\&=E[e^{t(X+Y)}]\\&=E[e^{tX}e^{tY}]\\&=E[e^{tX}]E[e^{tY}]\quad \text{since }X,Y\text{ are independent}\\&=e^{\lambda(e^t-1)}e^{\mu(e^t-1)}\\&=e^{(\lambda+\mu)(e^t-1)} \end{align}$$
Since the MGF determines the distribution, $S$ follows a Poisson distribution with parameter $\lambda+\mu$.


MGF of Poisson Distribution

If $X \sim \mathcal{P}(\lambda)$, then by definition its probability mass function is
$$\begin{align} f_X(k)=\frac{\lambda^k}{k!}e^{-\lambda},\quad k = 0,1,2,\dots \end{align}$$ Its MGF is $$\begin{align} M_X(t)&=E[e^{tX}]\\&=\sum_{k=0}^{\infty}\frac{\lambda^k}{k!}e^{-\lambda}e^{tk}\\&=e^{-\lambda}\sum_{k=0}^{\infty}\frac{\lambda^ke^{tk}}{k!}\\&=e^{-\lambda}\sum_{k=0}^{\infty}\frac{(\lambda e^t)^k}{k!}\\&=e^{-\lambda}e^{\lambda e^t}\\&=e^{\lambda e^t-\lambda}\\&=e^{\lambda(e^t-1)} \end{align}$$
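The MGF series above can also be verified symbolically; this SymPy sketch (my own illustrative choice of tool and symbol names) sums $E[e^{tX}]$ from the pmf and confirms the closed form.

```python
# Symbolic check of the Poisson MGF derivation above.
import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)
k = sp.symbols('k', integer=True, nonnegative=True)

# E[e^{tX}] summed from the pmf, exactly as in the derivation.
mgf = sp.exp(-lam) * sp.summation((lam * sp.exp(t))**k / sp.factorial(k), (k, 0, sp.oo))
print(sp.simplify(mgf - sp.exp(lam * (sp.exp(t) - 1))))   # 0
```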

1

Hint: $P(X+Y=n) = \sum_{k=0}^{n} P(X = k)P(Y = n-k)$

  • 0
    Why this hint, why the sum? This is what I don't understand. (2012-10-25)
  • 0
    Adding two random variables is simply convolution of those random variables. That's why. (2012-10-25)
  • 0
    Gotcha! Thanks! (2012-10-25)
  • 0
    *Adding two random variables is simply convolution of those random variables*... Sorry, but no. (2013-02-13)
  • 0
    @Did I meant in the usual pdf sense, and it assumes independence of course. (2013-02-13)
  • 1
    There is no `usual sense` for convolution of random variables. Either convolution of distributions or addition of random variables. (2013-02-13)
0

Here's a much cleaner solution:

Consider two Poisson processes occurring with rates $\lambda$ and $\mu$, where a Poisson process of rate $r$ is viewed as the limit of $n$ consecutive Bernoulli trials, each with success probability $\frac{r}{n}$, as $n\to\infty$.

Then $X$ counts the number of successes in the trials of rate $\lambda$ and $Y$ counts the number of successes in the trials of rate $\mu$, so the total number of successes is the same as if each trial succeeded with probability $\frac{\lambda + \mu}{n}$, where we take $n$ large enough that the event that the $i$th Bernoulli trial succeeds in both processes has negligible probability. Taking the limit of these $\mathrm{Binomial}\left(n, \frac{\lambda+\mu}{n}\right)$ counts as $n\to\infty$ gives exactly $\mathcal{P}(\lambda+\mu)$, and we are done.
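This heuristic is easy to probe numerically: for large $n$, the $\mathrm{Binomial}(n, \frac{\lambda+\mu}{n})$ pmf should sit close to the $\mathcal{P}(\lambda+\mu)$ pmf. The parameter values and choice of SciPy in this sketch are illustrative assumptions, not part of the argument.

```python
# Numerical probe of the Bernoulli-trials heuristic above.
import numpy as np
from scipy.stats import binom, poisson

lam, mu, n = 2.0, 3.0, 10_000    # n large, so each trial succeeds with small probability

k = np.arange(15)
# Total successes when all n trials succeed with probability (lam + mu) / n.
approx = binom.pmf(k, n, (lam + mu) / n)
exact = poisson.pmf(k, lam + mu)

print(np.max(np.abs(approx - exact)))    # small, and shrinks further as n grows
```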