4

A die (with six sides) is rolled repeatedly and summed. Show that the expected number of rolls until the sum is a multiple of $n$ is $n$.


In http://www.madandmoonly.com/doctormatt/mathematics/dice1.pdf (problem 12, page 15) a proof is given, but it assumes that for values larger than $6$ the probability that the running sum hits any particular value is a uniform $\frac{2}{7}$, which is a false assumption that should not be used in a rigorous proof.
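
(As an aside, the $\frac{2}{7}$ presumably comes from the renewal theorem: for a fair die with mean $\mu=\frac{7}{2}$, the probability that the running sum ever equals a given value $k$ satisfies
$$\lim_{k\to\infty}P(\text{some partial sum equals }k)=\frac{1}{\mu}=\frac{2}{7},$$
so the claim is only correct asymptotically, not exactly for each value above $6$.)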

  • 0
    Please note that I explicitly say in that pdf that I am not presenting a rigorous proof. (2012-03-28)

1 Answer

10

If $X_m$ is the sum after $m$ rolls, then the process $Y_m=X_m\pmod n$ taking values in $\{0,1,\dots,n-1\}$ is a random walk, considering the state space as points on the circle. By symmetry, its unique invariant probability distribution $\pi$ is uniform over the $n$ states. Therefore the expected time to return to the starting point $0$ is $1/\pi(0)=n$. Since your sum began at $0$, this gives the result.
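
As a quick sanity check, here is a small Monte Carlo sketch (Python; the function names are only illustrative) that estimates the expected number of rolls until the running sum is a multiple of $n$. Each average should come out near $n$.

```python
import random

def rolls_until_multiple(n, sides=6):
    """Roll a fair `sides`-sided die until the running sum is a multiple of n."""
    total, rolls = 0, 0
    while True:
        total += random.randint(1, sides)
        rolls += 1
        if total % n == 0:
            return rolls

def estimate_expected_rolls(n, trials=20_000):
    """Monte Carlo estimate of the expected number of rolls."""
    return sum(rolls_until_multiple(n) for _ in range(trials)) / trials

for n in (2, 5, 6, 13, 100):
    print(n, round(estimate_expected_rolls(n), 2))  # each estimate should be near n
```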

By the way, this argument works for any (fixed) fair or loaded die with any number of sides. It could even be a one-sided "die", and numbers on it could be positive, negative, and/or zero! I only insist that it is possible to get from any state to any other state. That is, the Markov chain is irreducible.
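
The same kind of sketch works with an arbitrary loaded "die", even one whose faces include zero and a negative value (the faces and weights below are chosen purely for illustration); as long as every residue mod $n$ is reachable, the average return time still comes out near $n$.

```python
import random

def return_time(n, faces, weights):
    """Roll a loaded die with the given faces until the running sum is 0 mod n."""
    total, rolls = 0, 0
    while True:
        total += random.choices(faces, weights)[0]
        rolls += 1
        if total % n == 0:  # Python's % maps negative totals into 0..n-1
            return rolls

n = 11
faces, weights = [-1, 0, 2, 7], [0.1, 0.2, 0.4, 0.3]  # arbitrary loaded "die", just for illustration
trials = 20_000
print(sum(return_time(n, faces, weights) for _ in range(trials)) / trials)  # near n = 11
```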

To prove such results fully, you need the theory of Markov chains on a finite state space. In particular, the topic of "return times". Take a look at the book Markov Chains and Mixing Times by Levin, Peres, and Wilmer, freely available here. Proposition 1.14 (ii) on page 12 gives what you want. Section 1.4 of Introduction to Stochastic Processes (2nd edition) by Gregory F. Lawler also has a good explanation.
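
Paraphrasing the statement being invoked: if $(Y_m)$ is an irreducible Markov chain on a finite state space with stationary distribution $\pi$, and $\tau_x^+=\min\{m\ge 1: Y_m=x\}$ is the first return time to $x$, then
$$\mathbb E_x\big[\tau_x^+\big]=\frac{1}{\pi(x)}\qquad\text{for every state }x.$$
With $x=0$ and $\pi$ uniform, this gives $\mathbb E_0[\tau_0^+]=n$.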


Here is an intuitive picture. The Markov chain starts over whenever it hits the state $0$. Writing "$z$" when the chain is in state zero and "$y$" otherwise, a typical outcome looks like

$zyyyyyyyyyyyzyyzzyyyyyyyyyzyyyyzyyyyyyyyyyyyyyyyyyzy\dots$

where we have joined together a sequence of independent finite strings, like $zyyyyyyyyyyy$, of random length. Each finite string starts with one $z$ and has zero or more $y$s following.

The fact that $\pi(z)=1/n$ means that over a very long run of rolls, the proportion of $z$s in the outcome is about $1/n$. For that to hold, the average distance between consecutive $z$s must be $n$.
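
To see this renewal picture numerically, here is a rough sketch (again, names are only illustrative) that records the $z$/$y$ string for a long run and checks both the proportion of $z$s and the average gap between them.

```python
import random

def zy_string(n, steps, sides=6):
    """Run the mod-n walk for `steps` rolls, recording 'z' at state 0 and 'y' otherwise."""
    state, out = 0, []
    for _ in range(steps):
        state = (state + random.randint(1, sides)) % n
        out.append('z' if state == 0 else 'y')
    return ''.join(out)

n, steps = 13, 500_000
s = zy_string(n, steps)
hits = s.count('z')
print("proportion of z:", hits / steps)          # about 1/n
print("average gap between z's:", steps / hits)  # about n
```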

  • 0
    There is still something that is not clear. You are talking about a process $Y_n$ defined by $Y_n=X_n\pmod n$. To me, this $Y_n$ as you defined it is not a process but just a single number with a probability distribution. Or do you mean a process $Y_m=X_m\pmod n$ where $m$ is not necessarily equal to $n$? (2012-01-31)
  • 0
    @wnvl Yes, I was overloading my variables. Corrected. (2012-01-31)
  • 0
    You say "By symmetry, its unique invariant probability distribution π is uniform over the n states." I can imagine that after infinitely many throws the probability distribution of $Y_m=X_m\pmod n$ will be uniform over the n states. But certainly not in the beginning. Assume n>6, then after the first throw the distribution will not be uniform at all. If the process would start from a random location after already infinitely many throws then I can understand yor reasoning about unique invariant probability distribution π over the n states... TBC2012-01-31
  • 1
    ... In this case, however, we are sure that we have to start from $0$, and if $n\gg 6$ we already need a lot of throws before the sum can possibly be a multiple of $n$. Would the answer to the question have been different if we started from a random number modulo $n$? (2012-01-31)
  • 0
    @wnvl The initial state doesn't enter into the definition of invariant distribution. I do not claim that the distribution will be uniform after one throw. I'm sorry: I don't see the difficulty. Anyway, the result is true! (2012-01-31)
  • 0
    @wnvl If you ask about returning to the initial state, then the answer is $n$ no matter where you start, or even if the initial state is random. If you ask how long on average it takes to hit state $y$ starting at state $x$, this is a harder problem. (2012-01-31)
  • 0
    @Byron: Perhaps you could be more explicit about the results you are using. I can find some plausible statements that you might have meant, but I haven't seen them proven anywhere. In particular, our undergraduate probability course could ask such a question, but certainly hasn't covered material that would allow understanding your answer. (2012-01-31)
  • 0
    @JackSchmidt OK, I will add to my answer. (2012-01-31)
  • 0
    Sorry, I assumed invariant distribution meant something else (I am not a specialist, I was just intrigued by this question). I looked it up in more detail and now I understand your reasoning. Do you have a link or a small hint for a proof of the theorem stating that when the invariant probability distribution $\pi$ is uniform over $n$ states, the expected time to return to the starting point is $n$? This is the next step to understanding the answer completely... (2012-01-31)
  • 0
    ... I assume that in general the theorem will require some extra restrictions on the transition probabilities so that the chain cannot get trapped in a subgroup. (2012-01-31)
  • 0
    @wnvl I have added to my answer. If anything is unclear, let me know. (2012-01-31)
  • 0
    @wnvl Yes, we could get stuck in a subgroup. So I assume that it is possible to travel between any two states. In Markov chain lingo, this says the chain is "irreducible". (2012-01-31)
  • 0
    @Byron: Thanks, that is perfect for me. (2012-01-31)