
I'm trying to understand what it means for a discrete random variable to have a probability mass function (pmf) that is a function of another random variable. For example, one homework problem of mine starts with "suppose $X$ has Poisson distribution with parameter $Y$, where $Y$ has Poisson distribution with parameter $\mu$." Does this mean that to determine the probability that $X=0$, one would first "run" $Y$ to obtain a value, then plug that value into $X$'s pmf? If so, then to determine the probability that $X=1$, would we have to run $Y$ again?
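The two-stage reading is exactly right for *sampling*: each draw of $X$ comes with its own fresh draw of $Y$. Here is a minimal Python sketch of that hierarchy (the Poisson sampler is Knuth's textbook multiplication-of-uniforms algorithm; the value $\mu=1.5$ is purely illustrative, not from the problem):

```python
import math
import random

def poisson(lam, rng):
    """Draw one Poisson(lam) variate via Knuth's algorithm."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def sample_x(mu, rng):
    """One draw of X: first draw Y ~ Poisson(mu), then X ~ Poisson(Y)."""
    y = poisson(mu, rng)   # a fresh Y for every draw of X
    return poisson(y, rng)

rng = random.Random(42)
mu = 1.5
n = 100_000
samples = [sample_x(mu, rng) for _ in range(n)]
frac0 = samples.count(0) / n
# The answers below derive P(X=0) = exp(-mu*(1 - 1/e)), about 0.387 for mu = 1.5;
# frac0 should be close to that.
```

Note that computing $\mathrm P(X=0)$ itself does not require "running" anything: it is an average over all possible values of $Y$, as the answers below make precise. Simulation only estimates that average.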

3 Answers


This means that the probability that $X=k$ given $Y$ is $p(k,Y)$ for a known function $p$, hence $\mathrm P(X=k)=\mathrm E(p(k,Y))$. In your case $p(k,y)=\mathrm e^{-y}y^k/k!$, hence
$$k!\,\mathrm P(X=k)=\mathrm E(\mathrm e^{-Y}Y^k)=\sum\limits_{n\ge0}\mathrm e^{-\mu}\frac{\mu^n}{n!}\mathrm e^{-n}n^k.$$
One sees that
$$\mathrm P(X=0)=\sum\limits_{n\ge0}\mathrm e^{-\mu}\frac{(\mu/\mathrm e)^n}{n!}=\mathrm e^{-\mu(1-1/\mathrm e)}.$$
Likewise,
$$\mathrm P(X=1)=\sum\limits_{n\ge1}\mathrm e^{-\mu}\frac{(\mu/\mathrm e)^n}{(n-1)!}=\mathrm e^{-\mu(1-1/\mathrm e)}\,\mu/\mathrm e.$$
Expectations are easier to compute than $\mathrm P(X=k)$ for a general $k$. For example, since the expectation of a Poisson random variable equals its parameter, one gets directly
$$\mathrm E(X)=\mathrm E(\mathrm E(X\mid Y))=\mathrm E(Y)=\mu.$$
Likewise, for every positive integer $k$,
$$\mathrm E(X(X-1)\cdots(X-k+1))=\mathrm E(Y^k).$$
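As a sanity check on these closed forms, one can truncate the series numerically. This sketch (with an illustrative $\mu=2$, not from the problem) compares the truncated sum against $\mathrm e^{-\mu(1-1/\mathrm e)}$ and $\mathrm e^{-\mu(1-1/\mathrm e)}\mu/\mathrm e$:

```python
import math

def p_x_eq_k(k, mu, terms=100):
    """Truncation of P(X=k) = (1/k!) * sum_n e^{-mu} mu^n/n! * e^{-n} n^k."""
    s = sum(math.exp(-mu) * mu**n / math.factorial(n) * math.exp(-n) * n**k
            for n in range(terms))
    return s / math.factorial(k)

mu = 2.0
closed0 = math.exp(-mu * (1 - 1 / math.e))   # closed form for P(X=0)
closed1 = closed0 * mu / math.e              # closed form for P(X=1)
```

The series terms decay like $(\mu/\mathrm e)^n/n!$, so 100 terms is far more than enough for moderate $\mu$.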

  • Correction: It would be $e^{-\mu}/(k!)$ times the said $k$th moment. (If it were _just_ the $k$th moment, then we'd have probabilities greater than 1.) – 2011-10-25

This can also be stated in terms of conditional probabilities. You know that, if $Y$ is given/known, $X$ follows a Poisson distribution with that particular value as parameter. You can write this fact down as:

$P(X=x \mid Y=y) = e^{-y} \, \frac{y^x}{x!}$

Then the joint probability is given by $P(X, Y) = P(X \mid Y)\, P(Y)$, where $P(Y)$ is another Poisson pmf, with parameter $\mu$; from this you can compute the "marginal" $P(X)$:

$P(X = x ) = \sum_{y=0}^{\infty} P(X=x \; |\; Y=y) \; P(Y=y) $
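A quick numerical sketch of this marginalization (the value $\mu=1$ and the truncation limits are arbitrary choices for illustration): summing the conditional pmf against the Poisson($\mu$) prior yields a marginal that sums to 1 over $x$, and reproduces the closed form $\mathrm e^{-\mu(1-1/\mathrm e)}$ for $P(X=0)$ derived in the other answer.

```python
import math

def pois_pmf(k, lam):
    """Poisson pmf; note lam**k gives 0**0 == 1 in Python, so lam=0 works."""
    return math.exp(-lam) * lam**k / math.factorial(k)

def marginal(x, mu, y_max=100):
    """P(X=x) = sum_y P(X=x | Y=y) P(Y=y), truncated at y = y_max."""
    return sum(pois_pmf(x, y) * pois_pmf(y, mu) for y in range(y_max))

mu = 1.0
total = sum(marginal(x, mu) for x in range(60))   # should be very close to 1
```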


Strictly, the problem ought to say that the *conditional* distribution of $X$ given $Y$ is a Poisson distribution with expected value $Y$. Without the word "conditional", a statement about the distribution of $X$ could be read as describing its marginal (i.e., "unconditional") distribution, and that would be wrong.