
Considering the standard geometric coin toss, let $N$ be the number of tosses until the first heads appears. The probability of getting heads is $p$. I understand how to find $E[N]$ by conditioning on the value of the first toss, but I'm having a hard time generalizing this process to a function of a random variable. I know how to get the answer $\frac{2-p}{p^2}$ using sums but I can't figure out how to do it using conditional expectation.

For $E[N]$, I let $Y=0$ if the first toss is tails and $Y=1$ if it is heads. Then $E[N|Y=1]=1$ and $E[N|Y=0]=1+E[N]$, which gives $E[N]=E[E[N|Y]]=\frac{1}{p}$. What do I do to find $E[N^2]$ this way?

2 Answers


The only thing that changes is $\mathbb{E}(N^2 \mid Y=0) = \mathbb{E}((1+N)^2) = \mathbb{E}(N^2) + 2\,\mathbb{E}(N) + 1 = \mathbb{E}(N^2) + \frac{2}{p} + 1$, using $\mathbb{E}(N)=\frac{1}{p}$.

Thus $ \mathbb{E}(N^2) = \mathbb{E}( \mathbb{E}(N^2 \mid Y)) = p \cdot 1 + (1-p)\left( 1+ \frac{2}{p} + \mathbb{E}(N^2) \right) \implies \mathbb{E}(N^2) = \frac{2-p}{p^2}. $
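As a quick numerical sanity check of this result, here is a Monte Carlo sketch in Python (the helper `second_moment_mc` is hypothetical, not part of the answer):

```python
import random

def second_moment_mc(p, trials=200_000, seed=0):
    """Monte Carlo estimate of E[N^2], where N is the number of
    geometric(p) coin tosses until the first heads."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        n = 1
        while rng.random() >= p:  # tails with probability 1-p
            n += 1
        total += n * n
    return total / trials

p = 0.3
print(second_moment_mc(p))   # estimate, close to (2-p)/p^2 ≈ 18.89
print((2 - p) / p**2)        # exact value from the conditioning argument
```

The estimate should agree with $(2-p)/p^2$ up to Monte Carlo error.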

  • I was playing around with squaring different things and wasn't able to figure out what exactly had to be done. Thanks, that makes a lot of sense. My thoughts were along the lines of $1+E[N]^2$, but that obviously wasn't working. – 2011-10-05

It may help to view the distribution of $N$ as the unique solution of a fixed-point distributional equation. Namely, $N=1$ if $Y=1$, and $N=1+N'$ with $N'$ distributed like $N$ if $Y=0$. Hence $N$ solves the equation $ N\stackrel{(d)}{=}1+\mathbf 1_AN\qquad\qquad (*) $ where $\mathrm P(A)=1-p$ and $A$ and $N$ are independent.

First application: Integrating $(*)$ yields $\mathrm E(N)=1+(1-p)\mathrm E(N)$ hence $\mathrm E(N)=1/p$.

Second application: Squaring $(*)$ and integrating yields $ \mathrm E(N^2)=1+2(1-p)\mathrm E(N)+(1-p)\mathrm E(N^2), $ hence $p\mathrm E(N^2)=1+2(1-p)/p$ and $\mathrm E(N^2)=(2-p)/p^{2}$.
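The first two applications reduce to a pair of linear equations in $\mathrm E(N)$ and $\mathrm E(N^2)$, which can be solved directly. A small Python sketch (the function name `moments_from_fixed_point` is my own):

```python
def moments_from_fixed_point(p):
    """Solve the linear equations obtained by integrating (*) and its square:
      E[N]   = 1 + (1-p)*E[N]                  -> E[N]   = 1/p
      E[N^2] = 1 + 2(1-p)*E[N] + (1-p)*E[N^2]  -> E[N^2] = (1 + 2(1-p)/p)/p
    """
    q = 1 - p
    m1 = 1 / p
    m2 = (1 + 2 * q * m1) / p
    return m1, m2

m1, m2 = moments_from_fixed_point(0.25)
print(m1, m2)  # 4.0 28.0, matching 1/p and (2-p)/p^2
```

For $p=1/4$ this gives $\mathrm E(N)=4$ and $\mathrm E(N^2)=(2-0.25)/0.0625=28$, as expected.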

Third application: For every positive integer $k$, $\mathrm E((N-1)^k)=(1-p)\mathrm E(N^k)$; expanding the left-hand side by the binomial theorem allows one to compute every moment $\mathrm E(N^k)$ recursively.
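Spelling out that recursion: expanding $(N-1)^k$ by the binomial theorem and isolating $\mathrm E(N^k)$ gives $p\,\mathrm E(N^k)=\sum_{j=0}^{k-1}\binom{k}{j}(-1)^{k-j+1}\mathrm E(N^j)$. A Python sketch (the helper `geometric_moments` is hypothetical):

```python
from math import comb

def geometric_moments(p, kmax):
    """Moments E[N^k], k = 0..kmax, from the recursion
    E[(N-1)^k] = (1-p) * E[N^k].
    Binomial expansion and solving for E[N^k] gives
      p * E[N^k] = sum_{j<k} C(k,j) * (-1)^(k-j+1) * E[N^j].
    """
    m = [1.0]  # E[N^0] = 1
    for k in range(1, kmax + 1):
        s = sum(comb(k, j) * (-1) ** (k - j + 1) * m[j] for j in range(k))
        m.append(s / p)
    return m

print(geometric_moments(0.5, 3))  # [1.0, 2.0, 6.0, 26.0]
```

For $p=1/2$ this reproduces $\mathrm E(N)=2$ and $\mathrm E(N^2)=(2-p)/p^2=6$.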

Fourth application: For every $|s|<1$, $\mathrm E(s^N)=s(p+(1-p)\mathrm E(s^N))$ hence $ \mathrm E(s^N)=ps/(1-(1-p)s). $ Differentiating $k$ times $\mathrm E(s^{N-1})$ at $s=1$ yields for every $k\geqslant0$ that $ \mathrm E((N-1)(N-2)\cdots(N-k))=k!(1-p)^k/p^k, $ which also allows to compute recursively every moment $\mathrm E(N^k)$.