
Suppose $X$ has the $\mathrm{Poisson}(5)$ distribution considered earlier.

Then $P(X \in A) = \sum_{j\in A} \frac{e^{-5}5^j}{j!}$, which implies that $L(X) = \sum^\infty_{j=0} \left(\frac{e^{-5}5^j}{j!}\right)\delta_j$, a convex combination of point masses. The following proposition shows that $E(f(X)) = \sum_{j=0}^\infty\frac{f(j)e^{-5}5^j}{j!}$ for any function $f : \mathbb{R} \rightarrow \mathbb{R}$.

Prop. Suppose $\mu = \sum_i \beta_i \mu_i$, where $\{\mu_i\}$ are probability distributions and $\{\beta_i\}$ are non-negative constants (summing to $1$, if we want $\mu$ to also be a probability distribution). Then for Borel-measurable functions $f : \mathbb{R} \rightarrow \mathbb{R}$,

$\int fd\mu = \sum_i \beta_i \int f \, d\mu_i,$

provided either side is well-defined.
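As a numerical sanity check on the proposition (a sketch of mine, not part of the original post): for the Poisson$(5)$ mixture of point masses, $\int f\,d\delta_j = f(j)$, so the proposition reduces $\int f\,d\mu$ to a weighted series. The truncation point `J` and the test function `f` below are arbitrary choices:

```python
import math

# Weights beta_j of the point masses delta_j for Poisson(5),
# truncated at J terms (the tail beyond J = 100 is negligible).
lam = 5.0
J = 100
beta = [math.exp(-lam) * lam**j / math.factorial(j) for j in range(J)]

def f(x):
    # Any Borel function would do; x^2 is chosen so the answer is known.
    return x * x

# Proposition: int f dmu = sum_j beta_j * int f d(delta_j) = sum_j beta_j * f(j)
lhs = sum(b * f(j) for j, b in enumerate(beta))
print(lhs)  # should be close to E(X^2) = lam + lam^2 = 30
```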

Using this proposition:

Let $X \sim \mathrm{Poisson}(5)$.

(a) Compute $E(X)$ and $\mathrm{Var}(X)$.

(b) Compute $E(3^X)$.

I know the answers from a previous question: $E[X]=\lambda$, $E[X^2]=\lambda+\lambda^2$ (from these two you can compute the variance), and $E[3^X]= e^{2 \lambda}$.

However, I'm not sure how to arrive at these results, other than that I'm supposed to use a Taylor series.
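For what it's worth, the claimed value of $E(3^X)$ can be checked numerically by summing the series $\sum_j 3^j\, e^{-\lambda}\lambda^j/j!$ directly (a quick sketch; the truncation at $j < 100$ is an arbitrary choice whose tail is negligible):

```python
import math

lam = 5.0
# E(3^X) = sum_j 3^j * P(X = j), truncated at j = 100
s = sum(3**j * math.exp(-lam) * lam**j / math.factorial(j) for j in range(100))
print(s)                 # series value
print(math.exp(2 * lam)) # claimed closed form e^{2*lam}
```

The agreement reflects the identity $\sum_j (3\lambda)^j/j! = e^{3\lambda}$, so $E(3^X) = e^{-\lambda}e^{3\lambda} = e^{2\lambda}$.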

  • possible duplicate of [Poisson distribution and probability distributions](http://math.stackexchange.com/questions/249623/poisson-distribution-and-probability-distributions) (2012-12-04)

1 Answer


The Poisson distribution is the limit of the binomial distribution as the number of trials approaches infinity. The binomial probability mass function is $p(y) = {n \choose y} p^y (1-p)^{n-y}$. Let $\lambda = np$, and take the limit as $n \to \infty$:

$$\lim_{n \to \infty} {n \choose y} \left(\frac{\lambda}{n}\right)^y \left(1 - \frac{\lambda}{n}\right)^{n-y} = \lim_{n \to \infty} \frac{n(n-1) \cdots (n-y+1)}{y!} \left(\frac{\lambda}{n}\right)^y \left(1 - \frac{\lambda}{n}\right)^{n-y}$$

$$= \lim_{n \to \infty} \frac{\lambda^y}{y!} \left(1 - \frac{\lambda}{n}\right)^n \frac{n(n-1) \cdots (n-y+1)}{n^y} \left(1 - \frac{\lambda}{n}\right)^{-y}$$

$$= \frac{\lambda^y}{y!} \lim_{n \to \infty} \left(1 - \frac{\lambda}{n}\right)^n \left(1 - \frac{\lambda}{n}\right)^{-y} \left(1 - \frac{1}{n}\right) \left(1 - \frac{2}{n}\right) \cdots \left(1 - \frac{y-1}{n}\right).$$

Then we can use the identity $\lim_{n \to \infty} \left(1 - \frac{\lambda}{n}\right)^n = e^{-\lambda}$, and since the rest of the factors in the limit converge to $1$, we have $p(y) = \frac{\lambda^y}{y!}e^{-\lambda}.$
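This limit can also be seen numerically: with $\lambda = 5$ fixed, the Binomial$(n, \lambda/n)$ probability of a given $y$ approaches the Poisson value as $n$ grows (a sketch of mine; the choice $y = 3$ and the values of $n$ are arbitrary):

```python
import math

lam, y = 5.0, 3

def binom_pmf(n):
    # Binomial(n, p) pmf at y, with p = lam / n
    p = lam / n
    return math.comb(n, y) * p**y * (1 - p)**(n - y)

# Poisson(lam) pmf at y, the claimed limit
poisson = lam**y / math.factorial(y) * math.exp(-lam)

for n in (10, 100, 10_000):
    print(n, binom_pmf(n), poisson)  # binomial value approaches the Poisson one
```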

Now we can find the expected value:

$$E(Y) = \sum_{y=0}^\infty y\frac{\lambda^y e^{-\lambda}}{y!} = \sum_{y=1}^\infty \frac{\lambda^y e^{-\lambda}}{(y-1)!}.$$

Factor out $\lambda$ and substitute $z = y - 1$ to get

$$E(Y) = \lambda \sum_{z=0}^\infty \frac{\lambda^z e^{-\lambda}}{z!}.$$

Since the sum runs over an entire probability distribution, it equals $1$, and thus the expected value is $\lambda$.

The variance is given by $V(Y) = E(Y^2) - E(Y)^2$. We already know $E(Y)^2 = \lambda^2$, so what remains is
$$E(Y^2) = \sum_{y=0}^\infty y^2 \frac{\lambda^y e^{-\lambda}}{y!}.$$
Writing $y^2 = y(y-1) + y$ splits this into $E[Y(Y-1)] + E(Y)$; the first sum simplifies by the same index shift as before,
$$E[Y(Y-1)] = \sum_{y=2}^\infty \frac{\lambda^y e^{-\lambda}}{(y-2)!} = \lambda^2,$$
so $E(Y^2) = \lambda^2 + \lambda.$

Thus, $V(Y) = \lambda$.
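As a closing sanity check (mine, not part of the original answer), summing the truncated series reproduces $E(Y) = \lambda$ and $V(Y) = \lambda$ numerically for $\lambda = 5$:

```python
import math

lam = 5.0
# Poisson(lam) pmf, truncated at y = 100 (the tail is negligible)
pmf = [math.exp(-lam) * lam**y / math.factorial(y) for y in range(100)]

EY  = sum(y * p for y, p in enumerate(pmf))        # E(Y)
EY2 = sum(y * y * p for y, p in enumerate(pmf))    # E(Y^2)

print(EY)            # should be close to lam = 5
print(EY2 - EY**2)   # variance, also close to lam = 5
```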