8

I know that the expected value of a geometrically distributed random variable is $\frac1p$, but how do we get there? This is what I have so far: $E(X)=\sum_{x=1}^\infty xP(X=x)$, where $X$ is the number of failures until the first success. Since it's geometric we have $$\sum_{x=1}^\infty xp(1-p)^{x-1}=\frac{p}{1-p} \sum_{x=1}^\infty x(1-p)^x=\cdots$$ How do we sum that?
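Before summing the series in closed form, a quick numerical sanity check of the claimed value may help (a minimal sketch; the choice $p=0.3$ and the cutoff of $1000$ terms are arbitrary illustrations, not part of the question):

```python
# Partial sum of E(X) = sum over x of x * p * (1 - p)^(x - 1), truncated at 1000 terms.
p = 0.3
partial = sum(x * p * (1 - p) ** (x - 1) for x in range(1, 1001))
print(partial, 1 / p)  # both are approximately 3.3333
```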

  • 4
    Differentiate a well-known series. (2012-11-12)

2 Answers

11

Set $r=1-p$ and recall the geometric series formula $$\sum\limits_{x=1}^\infty r^x=\frac{r}{1-r}.$$ Then $$\sum\limits_{x=1}^\infty x r^x= r\sum\limits_{x=1}^\infty x r^{x-1}= r\sum\limits_{x=1}^\infty \frac{d}{dr} r^x= r \frac{d}{dr}\sum\limits_{x=1}^\infty r^x= r\frac{d}{dr}\frac{r}{1-r}.$$ Is the rest clear?
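For completeness, here is one way to finish: carrying out the differentiation and substituting $r=1-p$ back into the sum from the question gives $$\sum\limits_{x=1}^\infty x r^x = r\frac{d}{dr}\frac{r}{1-r} = \frac{r}{(1-r)^2} = \frac{1-p}{p^2}, \qquad\text{so}\qquad E(X)=\frac{p}{1-p}\sum\limits_{x=1}^\infty x(1-p)^x=\frac{p}{1-p}\cdot\frac{1-p}{p^2}=\frac{1}{p}.$$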

  • 0
    @Oleg You can regard these equalities as identities in the variable $r$. (2017-10-04)
6

An experiment has probability of success $p\gt 0$, and probability of failure $1-p$. We repeat the experiment until the first success. Let $X$ be the total number of trials. We want $E(X)$.

Do the experiment once. So we have used $1$ trial. If this trial results in success (probability $p$), then the expected number of further trials is $0$. If the first trial results in failure (probability $1-p$), that trial has been wasted, and the expected number of further trials is again $E(X)$. Thus $E(X)=1+(p)(0)+(1-p)E(X).$ If $E(X)$ exists (i.e. is finite), we can solve this equation for $E(X)$ and obtain $E(X)=\dfrac{1}{p}$.
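The restart argument can also be checked by simulation (a minimal sketch using Python's standard `random` module; the function name, $p=0.25$, and the number of runs are illustrative choices, not from the answer):

```python
import random

def mean_trials_until_success(p, n_runs=100_000, seed=0):
    """Estimate E(X): the average number of trials up to and
    including the first success, over n_runs repetitions."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_runs):
        trials = 1
        while rng.random() >= p:  # failure with probability 1 - p: try again
            trials += 1
        total += trials
    return total / n_runs

print(mean_trials_until_success(0.25))  # close to 1/p = 4
```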

  • 0
    You may prefer to express the equation in my post as $E(X)=(p)(1)+(1-p)(1+E(X))$. Then the conditional-expectation nature of the argument is clearer. By the way, my $X$ is not the same as the OP's: my $X$ is the number of trials up to and including the first success. If we want the mean number of **failures** before the first success, the answer is $\frac{q}{p}$, where $q=1-p$. (2013-12-18)
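    Indeed, if $X$ counts trials up to and including the first success, the number of failures before the first success is $X-1$, so $$E(X-1)=E(X)-1=\frac{1}{p}-1=\frac{1-p}{p}=\frac{q}{p}.$$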