2

For a geometric random variable $X$, $P_X(k) = (1-p)^{k-1}p$, so $E[X] = \sum_{k=1}^{\infty}k(1-p)^{k-1}p\;.$ That is, with $q = 1 - p$, $E[X] = p + 2qp + 3q^2p + 4q^3p + \dotsb$

But $P_{X|X>1}(k) = \begin{cases}(1-p)^{k-2}p&\text{if }k > 1\\ 0&\text{if }k = 1\end{cases}$ so $E[X|X>1] = \sum_{k=2}^{\infty}k(1-p)^{k-2}p\;,$ that is, $E[X|X>1] = 2p + 3qp + 4q^2p + \dotsb\;,$ which term by term looks nothing like the series for $E[X]$ plus $1$.

So how could $E[X|X>1] = E[X] + 1$?
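A quick Monte Carlo check (a minimal Python sketch; the choice $p = 0.3$, the seed, and the sample size are all arbitrary) does seem to confirm the relation numerically, which only deepens the puzzle:

```python
import random

random.seed(0)
p = 0.3          # arbitrary success probability
n = 200_000      # number of simulated values of X

def sample_x(p):
    """Draw X = index of the first success in a sequence of Bernoulli(p) trials."""
    k = 1
    while random.random() >= p:
        k += 1
    return k

xs = [sample_x(p) for _ in range(n)]
conditioned = [x for x in xs if x > 1]

print(sum(xs) / len(xs))                    # E[X],       theory 1/p     = 3.333...
print(sum(conditioned) / len(conditioned))  # E[X | X>1], theory 1/p + 1 = 4.333...
```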

Thanks.

  • Like @Henry said. Furthermore, for every function $u$ and every nonnegative $n$, $E(u(X)\mid X>n)=E(u(X+n))$. (2011-09-09)

3 Answers

6

Since $P_X(k)$ is normalized, we have

$\sum_{k=1}^\infty(1-p)^{k-1}p=\sum_{k=2}^\infty(1-p)^{k-2}p=1\;.$

Thus

$ \begin{eqnarray} E[X|X>1] &=& \sum_{k=2}^{\infty}k(1-p)^{k-2}p \\ &=& \sum_{k=2}^{\infty}(k-1)(1-p)^{k-2}p+\sum_{k=2}^{\infty}(1-p)^{k-2}p \\ &=& \sum_{k=1}^{\infty}k(1-p)^{k-1}p+1 \\ &=& E[X] + 1 \end{eqnarray} $

All this is really saying is that since the conditional probability for $k+1$ is the same as the unconditional probability for $k$, the conditional expectation value of $k$ must be the unconditional expectation value of $k+1$.
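The index shift in the computation above is easy to check numerically with truncated partial sums (a sketch; $p = 0.3$ and the truncation point $N$ are arbitrary, and the discarded tails are on the order of $N q^N$, hence negligible):

```python
p = 0.3
q = 1 - p
N = 500   # truncation point; the neglected tail is roughly N * q**N, which is tiny

lhs = sum(k * q**(k - 2) * p for k in range(2, N + 1))      # E[X | X > 1]
rhs = sum(k * q**(k - 1) * p for k in range(1, N + 1)) + 1  # E[X] + 1

print(lhs, rhs)   # both print 4.333..., i.e. 1/p + 1
```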

5

A coin has probability $p$ of landing heads, and $q=1-p$ of landing tails. Assume that $p\ne 0$.

Let $X$ be the total number of tosses until you get a head. Then $X$ has precisely the geometric distribution that you described. One can, as you did, get an expression for $E(X)$ as an infinite series. In fact, it turns out $E(X)=1/p$. But we need neither the series nor its sum to prove the result that is asked for.

Suppose that we are given that $X>1$. This means that our first toss was a tail. Let $Y$ be the additional number of tosses that we must wait for a head. The coin does not remember that the first toss was a tail, so $Y$ has the same distribution, and therefore the same mean, as $X$. In symbols, $E(Y)=E(X)$.

But the total number of tosses, given that $X>1$, is $1+Y$. The $1$ is for the "wasted" first toss. Thus $E(X|X>1)=E(1+Y)=1+E(Y)=1+E(X).$

Comment: If you prove the result using the infinite series, you know that the result is true. If you do it more conceptually, you know why the result is true.
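The restart argument is also easy to see in simulation. Here is a minimal sketch (the coin bias $p = 0.3$, the seed, and the sample size are arbitrary choices) comparing $Y = X - 1$ on the runs with a wasted first toss against a fresh copy of $X$:

```python
import random

random.seed(1)
p = 0.3          # arbitrary coin bias
n = 200_000

def tosses_until_head(p):
    """Toss a p-coin until the first head; return the total number of tosses."""
    t = 1
    while random.random() >= p:
        t += 1
    return t

xs = [tosses_until_head(p) for _ in range(n)]
# On the runs whose first toss was a tail (X > 1), the additional wait
# Y = X - 1 should behave like a fresh copy of X: same distribution, same mean.
ys = [x - 1 for x in xs if x > 1]

print(sum(xs) / len(xs))   # E[X] ~ 1/p
print(sum(ys) / len(ys))   # E[Y] ~ 1/p as well, so E[X | X>1] = 1 + E[Y] ~ 1 + E[X]
```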

  • Great answer, which in addition gives $E[X]$ without needing to sum a series: we have $E[X\mid X = 1] = 1$ and $E[X \mid X > 1] = 1 + E[X]$, and since the events $\{X = 1\}$ and $\{X > 1\}$ have probabilities $p$ and $1-p$, we get $E[X] = p + (1 + E[X])(1-p) = 1 + (1-p)E[X]$, that is, $E[X] = 1/p$. (2011-09-24)
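The fixed-point equation in this comment can be verified in one line (a trivial check; $p = 0.3$ is an arbitrary choice):

```python
p = 0.3
E = 1 / p   # claimed value of E[X]
# first-step decomposition: E[X] = p * E[X | X = 1] + (1 - p) * E[X | X > 1]
print(E, p * 1 + (1 - p) * (1 + E))   # both 3.333..., so E = 1/p solves the equation
```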
0

The "memorylessness" of the geometric distribution implies that the conditional probability distribution of $X$ given that $X\ge\text{any particular integer}$ is the same as the probability distribution of $X+\text{that same integer}$.
