
The classic example... $X \sim Po\left(\lambda\right)$, $Y \sim Po\left(\mu\right)$, with $X$ and $Y$ independent. Show that the conditional distribution of $X$ given $X+Y=n$ is binomial. In other words, $P(X=k\mid X+Y = n) = P(\tilde{X} = k)$, where $\tilde{X} \sim B\left(n, \frac{\lambda}{\lambda + \mu}\right)$.

I've managed to reach the step below and have been stuck since. I just somehow need to get a $\frac{1}{n!}$ into the denominator, which would then complete the proof... or at least I think so.

$P(X=k\mid X+Y=n) = \frac{P(X=k)\,P(Y=n-k)}{P(X+Y = n)} = \frac{e^{-(\lambda+\mu)}\,\frac{\lambda^{k}\mu^{n-k}}{k!(n-k)!}}{e^{-(\lambda+\mu)}\sum_{i=0}^{n} \frac{\lambda^{i}\mu^{n-i}}{i!(n-i)!}} = \frac{\frac{\lambda^{k}\mu^{n-k}}{k!(n-k)!}}{\sum_{i=0}^{n} \frac{\lambda^{i}\mu^{n-i}}{i!(n-i)!}}$
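(As a quick numerical sanity check of the claim, not part of the proof: simulate many independent Poisson pairs, keep those with $X+Y=n$, and compare the conditional frequencies with the binomial pmf. A minimal sketch, assuming `numpy` and `scipy.stats` are available:)

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
lam, mu, n = 3.0, 5.0, 8  # example parameters and conditioning value

# Draw many independent Poisson pairs and keep the draws with X + Y = n
x = rng.poisson(lam, size=1_000_000)
y = rng.poisson(mu, size=1_000_000)
x_given_sum = x[x + y == n]

# Compare the empirical conditional pmf with Binomial(n, lambda / (lambda + mu))
p = lam / (lam + mu)
for k in range(n + 1):
    print(f"k={k}: simulated {np.mean(x_given_sum == k):.4f}, "
          f"binomial {stats.binom.pmf(k, n, p):.4f}")
```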

Thanks for the help!

  • Sure. Binomial: $n$ trials, probability of success on each trial $\frac{\lambda}{\lambda+\mu}$. (2012-05-30)

2 Answers


HINT

Remember that $\sum_{i=0}^{n} \binom{n}{i} \lambda^i \mu^{n-i} = \left( \lambda + \mu\right)^n$ (the binomial theorem). The above gives us $\sum_{i=0}^{n} \dfrac{n!}{i! (n-i)!} \lambda^i \mu^{n-i} = \left( \lambda + \mu\right)^n$, which in turn gives us $\sum_{i=0}^{n} \dfrac{ \lambda^i \mu^{n-i}}{i! (n-i)!} = \dfrac{\left( \lambda + \mu\right)^n}{n!}$.
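Plugging this identity into the denominator of the expression in the question, the remaining algebra (a sketch) is

$$P(X=k\mid X+Y=n) = \frac{\frac{\lambda^{k}\mu^{n-k}}{k!(n-k)!}}{\frac{\left(\lambda+\mu\right)^{n}}{n!}} = \frac{n!}{k!\,(n-k)!}\cdot\frac{\lambda^{k}\mu^{n-k}}{(\lambda+\mu)^{n}} = \binom{n}{k}\left(\frac{\lambda}{\lambda+\mu}\right)^{k}\left(\frac{\mu}{\lambda+\mu}\right)^{n-k},$$

which is exactly the $B\left(n, \frac{\lambda}{\lambda+\mu}\right)$ pmf.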


Instead of using the summation in the denominator to calculate $P(X+Y=n)$, we might use the fact that the sum of two independent Poisson random variables is again Poisson, with parameter equal to the sum of the two parameters (this can be shown using mgfs; see Theorem 3.2.1 in Introduction to Mathematical Statistics by Hogg et al.). Therefore $P(X+Y=n)=\frac{1}{n!}(\lambda+\mu)^n e^{-(\lambda+\mu)}$, which can be plugged into the denominator of the last expression.
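A sketch of that final substitution, keeping the exponential factors in the numerator so that they cancel:

$$P(X=k\mid X+Y=n) = \frac{e^{-\lambda}\frac{\lambda^{k}}{k!}\; e^{-\mu}\frac{\mu^{n-k}}{(n-k)!}}{\frac{1}{n!}(\lambda+\mu)^{n}e^{-(\lambda+\mu)}} = \binom{n}{k}\frac{\lambda^{k}\mu^{n-k}}{(\lambda+\mu)^{n}} = \binom{n}{k}\left(\frac{\lambda}{\lambda+\mu}\right)^{k}\left(\frac{\mu}{\lambda+\mu}\right)^{n-k}.$$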