
Let $X$ be a random variable denoting the number of rolls of a fair six-sided die needed to obtain 4 consecutive sixes (including the last roll). I would like help in computing $\mathbb{E}(X)$.

I think I have to use the formula $$ \mathbb{E}(X) = \sum_{n=1}^\infty \mathbb{P}(X\geqslant n) $$ but I'm not sure how to compute $\mathbb{P}(X\geqslant n),$ so I'd like a little help.

  • Do you mean that it is *required* that you use the quoted formula? Or is it simply that you *think* the formula will be useful? (2012-01-10)
  • Well, I think this formula isn't essential here (though it may be useful). The hard part is computing $\mathbb{P}(X \ge n)$ or $\mathbb{P}(X \le n)$ or something similar. I can't think of anything simpler than studying a few cases: the $(n-1)$th sequence of rolls ends with 0, 1, 2, or 3 consecutive sixes, each with a certain probability (and there are no 4 consecutive sixes). What are the corresponding probabilities for the $n$th sequence? (2012-01-10)
  • It may help to look at a similar problem, solved here: http://math.stackexchange.com/questions/27989/time-until-a-consecutive-sequence-of-ones-in-a-random-bit-sequence/27991#27991 (2012-01-10)
  • Another way of writing **exactly** what @Sasha did: let the expectation be $A$, let $p=1/6$, and write H for a six and T for anything else. If the first toss is T (prob. $1-p$), the expected wait is $1+A$, since we have wasted a toss. If the first tosses are HT (prob. $p(1-p)$), the expectation is $2+A$. If we get HHT (prob. $p^2(1-p)$), the expectation is $3+A$. If we get HHHT (prob. $p^3(1-p)$), the expectation is $4+A$. Finally, if we get HHHH (prob. $p^4$), the expectation is $4$. Thus $A=(1-p)(1+A)+p(1-p)(2+A)+\cdots+p^4(4)$. This is a linear equation in $A$. Solve. (2012-01-10)
  • Another interesting way is to use martingales to prove that $E[X]=6^4+6^3+6^2+6=1554$. (2012-01-11)
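The one-equation approach sketched in the comment above can be checked with exact rational arithmetic. This Python sketch (not part of the original thread) solves $A=(1-p)(1+A)+p(1-p)(2+A)+p^2(1-p)(3+A)+p^3(1-p)(4+A)+p^4\cdot 4$ for $A$ by collecting the $A$-terms:

```python
from fractions import Fraction

p = Fraction(1, 6)
q = 1 - p  # probability of "not a six"

# A = q(1+A) + p q(2+A) + p^2 q(3+A) + p^3 q(4+A) + p^4 * 4
# Collect: RHS = const + coef * A, then solve A = const / (1 - coef).
coef = q + p*q + p**2*q + p**3*q
const = q*1 + p*q*2 + p**2*q*3 + p**3*q*4 + p**4*4
A = const / (1 - coef)
print(A)  # 1554
```

Since `Fraction` arithmetic is exact, the result is exactly $1554$, matching the martingale value quoted above.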

1 Answer


The system can be described by a 5-state Markov chain, the state being the number of consecutive sixes accumulated so far. The transition matrix is: $$ P = \begin{bmatrix} 1-p & p & 0 & 0 & 0 \\ 1-p & 0 & p & 0 & 0 \\ 1-p & 0 & 0 & p & 0 \\ 1-p & 0 & 0 & 0 & p \\ 0 & 0 & 0 & 0 & 1 \end{bmatrix} $$ where, for the case at hand, $p=\frac{1}{6}$ is the probability of rolling a six on any given roll.
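Not part of the original answer, but the chain above can be checked numerically. With NumPy (used here purely for illustration), the vector of expected absorption times is $(I-Q)^{-1}\mathbf{1}$, where $Q$ is the transient block of $P$ (states $0$ through $3$):

```python
import numpy as np

p = 1 / 6
# Transition matrix over states 0..4 (number of consecutive sixes; 4 is absorbing)
P = np.array([
    [1 - p, p, 0, 0, 0],
    [1 - p, 0, p, 0, 0],
    [1 - p, 0, 0, p, 0],
    [1 - p, 0, 0, 0, p],
    [0,     0, 0, 0, 1],
])
assert np.allclose(P.sum(axis=1), 1)  # each row is a probability distribution

Q = P[:4, :4]  # transient part (states 0..3)
k = np.linalg.solve(np.eye(4) - Q, np.ones(4))  # expected steps to absorption
print(k[0])  # expected rolls starting from scratch (should be 1554, up to float error)
```

The first entry `k[0]` is the quantity $k_0 = \mathbb{E}(X)$ computed symbolically below.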

The classic way to solve this is to consider the expected number of rolls $k_i$ given an initial state $i$. We are interested in computing $k_0 = \mathbb{E}(X)$.

Conditioning on the first move, the following recurrence holds: $$ k_i = 1 + p\, k_{i+1} + (1-p)\, k_0 \qquad \text{for} \qquad i=0,1,2,3 $$ with boundary condition $k_4 = 0$. Solving this linear system yields: $$ k_0 = \frac{p^3+p^2+p+1}{p^4} \quad k_1 = \frac{p^2+p+1}{p^4} \quad k_2 = \frac{p+1}{p^4} \quad k_3 = \frac{1}{p^4} \quad k_4 = 0 $$
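As a sanity check (a sketch, not part of the original answer), the same system can be solved with exact rational arithmetic: write each $k_i = a_i + b_i k_0$, back-substitute from $k_4 = 0$, and then solve $k_0 = a_0 + b_0 k_0$.

```python
from fractions import Fraction

p = Fraction(1, 6)

# Represent k_i = a + b * k0 and back-substitute starting from k4 = 0.
a, b = Fraction(0), Fraction(0)      # k4 = 0
for _ in range(4):                   # builds k3, k2, k1, then k0
    a, b = 1 + p * a, p * b + (1 - p)  # k_i = 1 + p*k_{i+1} + (1-p)*k0
k0 = a / (1 - b)                     # solve k0 = a + b*k0
assert k0 == (1 + p + p**2 + p**3) / p**4  # matches the closed form above
print(k0)  # 1554
```

The assertion confirms the closed form $k_0 = (1+p+p^2+p^3)/p^4$ term by term at $p = \frac{1}{6}$.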

    In[24]:= Solve[{k0 == 1 + k1 p + (1 - p) k0,
        k1 == 1 + k2 p + (1 - p) k0,
        k2 == 1 + p k3 + (1 - p) k0,
        k3 == 1 + p k4 + (1 - p) k0,
        k4 == 0}, {k1, k2, k3, k0, k4}] // Simplify

    Out[24]= {{k1 -> (1 + p + p^2)/p^4, k2 -> (1 + p)/p^4, k3 -> 1/p^4,
        k0 -> (1 + p + p^2 + p^3)/p^4, k4 -> 0}}

Substituting $p=\frac{1}{6}$ gives $k_0 = \mathbb{E}(X) = 1554$.
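Expanding the closed form at $p=\tfrac{1}{6}$ makes the connection with the martingale value quoted in the comments explicit:

$$ k_0 = \frac{1+p+p^2+p^3}{p^4} = \frac{1}{p^4}+\frac{1}{p^3}+\frac{1}{p^2}+\frac{1}{p} = 6^4+6^3+6^2+6 = 1296+216+36+6 = 1554. $$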


**Added**: A good reference on the subject is the book by J.R. Norris, *Markov Chains* (Amazon). The chapter on discrete Markov chains is available online for free from the author; Section 1.3 discusses finding mean hitting times $k_i$.