
Suppose that $Y_i = 0$ with probability $p_i$ and $Y_i \sim \text{Poisson}(\lambda_i)$ with probability $1-p_i$.

Then why is this the same as $Y_i = 0$ with probability $p_i+(1-p_i)e^{-\lambda_i}$ and $Y_i=k$ with probability $(1-p_i)e^{-\lambda_i}\lambda_{i}^{k}/k!$?

Does $Y_i = 0$ mean that it can be $0$ independently of the Poisson model, or $0$ within the Poisson model? Is that why we combine the two probabilities? I understand the probability for $Y_i = k$.

2 Answers


Imagine that you play the following game. You flip a coin that has probability $p$ of landing heads, and probability $1-p$ of landing tails.

If the result is heads, you win $0$ dollars. If the result is tails, then you play a Poisson game with parameter $\lambda$, that is, a game in which you win $k$ dollars with probability $\frac{e^{-\lambda}\lambda^k}{k!}$.

Let random variable $Y$ denote your winnings in the game described above. We want to find the probability distribution of $Y$.

Let's first deal with $P(Y=0)$. You can win $0$ dollars in two ways: (i) the coin flip results in heads, or (ii) the coin flip results in tails, but the Poisson subgame results in $0$. The probability of (i) is $p$. To find the probability of (ii), note that we must get tails (probability $1-p$) and the Poisson subgame must give result $0$ (probability $\frac{e^{-\lambda}\lambda^0}{0!}$, or more simply $e^{-\lambda}$). Thus $P(Y=0)=p+(1-p)e^{-\lambda}.$

Next we deal with $P(Y=k)$ where $k\gt 0$. To get $k$ dollars, we must have gotten tails on the coin toss (probability $1-p$) and gotten result $k$ in the Poisson subgame (probability $\frac{e^{-\lambda}\lambda^k}{k!}$). So if $k \gt 0$, then $P(Y=k)=(1-p)\frac{e^{-\lambda}\lambda^k}{k!}.$
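If it helps to see the two-stage game numerically, here is a small Monte Carlo sketch of it: flip the coin, then (on tails) draw from a Poisson subgame, and compare the empirical frequencies of $Y=0$ and $Y=2$ against the derived formulas. The parameters `p = 0.3` and `lam = 2.0` are arbitrary illustrative choices, not from the question.

```python
import math
import random

random.seed(0)

p, lam = 0.3, 2.0   # illustrative parameters (not from the question)
n = 200_000

def poisson_draw(lam):
    """Sample from Poisson(lam) via Knuth's multiplication method."""
    L = math.exp(-lam)
    k, prod = 0, 1.0
    while True:
        prod *= random.random()
        if prod <= L:
            return k
        k += 1

# Heads (probability p) pays 0; tails pays a Poisson(lam) draw.
wins = [0 if random.random() < p else poisson_draw(lam) for _ in range(n)]

emp0 = wins.count(0) / n
pmf0 = p + (1 - p) * math.exp(-lam)                 # derived P(Y=0)
emp2 = wins.count(2) / n
pmf2 = (1 - p) * math.exp(-lam) * lam**2 / math.factorial(2)  # derived P(Y=2)

print(round(emp0, 3), round(pmf0, 3))
print(round(emp2, 3), round(pmf2, 3))
```

With $n$ this large, the simulated frequencies land within about $0.01$ of the formulas, which is one quick sanity check that the mixture pmf above is right.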


I'm not entirely sure what is meant by a random variable following a distribution with a certain probability, but here is my interpretation. Let $X$ be a random variable such that $P(X=0)=p$ and $P(X=1)=1-p$ for some $0<p<1$, and let $Z\sim \text{po}(\lambda)$, $\lambda>0$, be independent of $X$. Then $Y=ZX$ has the property that you ask for.

Indeed, $ \{Y=0\}=\{X=0\}\cup(\{Z=0\}\cap\{X=1\}) $ and hence $ P(Y=0)=P(X=0)+P(X=1,Z=0)=P(X=0)+P(X=1)P(Z=0) $ by independence. Using the definitions of $X$ and $Z$ we obtain $ P(Y=0)=p+(1-p)\frac{\lambda^0}{0!}e^{-\lambda}=p+(1-p)e^{-\lambda}. $
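The $Y = ZX$ construction can also be written down directly as a pmf and checked to sum to $1$. This sketch assumes illustrative parameters `p = 0.3` and `lam = 2.0` (the question leaves them unspecified):

```python
import math

p, lam = 0.3, 2.0  # illustrative parameters (not from the question)

def zip_pmf(k, p, lam):
    """P(Y = k) for Y = Z*X, X ~ Bernoulli(1-p), Z ~ Poisson(lam), independent."""
    pois = math.exp(-lam) * lam**k / math.factorial(k)
    if k == 0:
        # {Y=0} = {X=0} ∪ ({X=1} ∩ {Z=0})
        return p + (1 - p) * pois
    # for k > 0 we need X=1 and Z=k
    return (1 - p) * pois

# The probabilities over k = 0, 1, 2, ... sum to 1, as a pmf must.
total = sum(zip_pmf(k, p, lam) for k in range(100))
print(round(total, 10))
```

The $k=0$ branch combines the two disjoint events exactly as in the set decomposition above, which is where the extra $p$ term in $P(Y=0)$ comes from.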