20

Possible Duplicate:
Probability of a random binary string containing a long run of 1s?

EDIT: Cocopuffs below has partially answered the question, but the critical base case $L=2$ of his inductive argument is missing, and it is not obvious how to fill the gap.

The original problem is described in this thread at community.wizards.com.

It originated as a question about a Magic: the Gathering scenario, but I will rephrase it here in general audience terms.

Suppose you have \$0, and your friend has \$L. You start flipping coins. Every time you flip heads, you win \$2 and your friend wins \$1. Every time you flip tails, you lose all of your money. What is the probability you will eventually have (at least) as much money as your friend?

As outlined in the thread above, the probability $p(L)$ of you eventually matching your friend is given by the recurrence relation

$$p(L) = \frac{1}{2^{L-1}} + \sum_{i=1}^{L-1} \frac{p(L+i)}{2^i}.$$

Other than the trivial observation that $p(L) = 1$ is a solution to this recurrence, I haven't been able to make headway. How do I prove this solution is unique (and therefore the right one)?
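Before attacking the recurrence, a quick numerical sanity check is possible. The sketch below (plain Python, illustrative names, not from the thread) uses the fact that a tails resets your money to \$0, so only the maximal runs of heads matter: a run of length $k$ occurs with probability $2^{-k}$, leaves you with $\$2k$ at its peak, and adds $\$k$ to your friend's total.

```python
import random

def estimate_p(L, trials=20000, max_runs=60, seed=1):
    """Monte Carlo estimate of p(L).

    Each maximal run of heads has length X with P[X = k] = 2^{-k}.
    At the peak of run n you hold 2*X_n while your friend holds
    L + X_1 + ... + X_n, so you catch up iff X_n >= L + X_1 + ... + X_{n-1}.
    Runs beyond max_runs are ignored; by then the deficit exceeds
    L + max_runs - 1, so the truncation error is negligible.
    """
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        deficit = L  # dollars you must cover within a single run
        for _ in range(max_runs):
            x = 1
            while rng.random() < 0.5:  # extend the run of heads
                x += 1
            if x >= deficit:
                wins += 1
                break
            deficit += x  # friend banked x more dollars
    return wins / trials
```

For $L=2$ this gives an estimate well below $1$ (around $0.6$), already suggesting that $p(L)=1$ cannot be the right solution.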

  • 0
    The description seems incomplete to me. If $d(n)$ is the number of dollars you got at step $n$ and $d'(n)$ is your friend's then is the probability you are searching for $P(\exists n, \ d(n)>d'(n))$ ?2012-09-11
  • 0
    $P(\exists n, d(n) \geq d'(n) )$. Right, the probability that you have at least as much money as your friend, at the same point in time.2012-09-11
  • 0
    So flipping a tails works as well because then you both have 0 right?2012-09-11
  • 1
    No, on tails you lose all of your money, but your friend does not.2012-09-11
  • 0
    But then, after you lose your money, it starts over and you may start gaining again?2013-01-21
  • 0
    Yes. Although now $L$ is greater, of course.2013-01-21
  • 0
    So when you say “Every time you flip tails, you lose all of your money” you mean “Every time you flip tails, all your money goes to your friend”, right ?2013-01-22
  • 0
    No. Let's say that before flip $i$, my friend has $L_i$ dollars and I have $M_i$ dollars. Then if I flip heads, $L_{i+1} = L_i + 1, M_{i+1} = M_i + 2$. If I flip tails, $L_{i+1} = L_i, M_{i+1}=0.$2013-01-22
  • 0
    @user7530 : then I don't understand why you said in your comment in answer to GEdgar that “now $L$ is greater”. You wrote $L_{i+1}=L_i$, which means that $L$ is the same, not greater.2013-01-22
  • 0
    Ah, sorry, what I meant was that $L_{i+1} \geq L_0$, that is, the amount of money my friend possesses does not "start over."2013-01-22
  • 2
    Note that the accepted answer to the other question gives an exact infinite series solution, and it's of the sort that's unlikely to have a nice closed form, as $p(L)$ is a [Liouville number](http://en.wikipedia.org/wiki/Liouville_number) in all nontrivial cases.2013-01-23

4 Answers

8

The solution is not the right one; in fact, you have a vanishingly small probability of ever catching your friend as $L$ becomes large.

Let $X_{i}$ for $i=1,2,3,\ldots$ be i.i.d. discrete random variables representing the lengths of your non-zero runs of heads. (Note that there are a.s. infinitely many such runs.) The probability distribution for each variable is $P[X=k]=2^{-k}$ for $k\ge 1$. So $P[X \ge k]=2^{1-k}$, and $P[X < k]=1-2^{1-k}$.

After your $n$-th non-zero run of heads, you have $2X_n$ and your friend has $L+\sum_{i=1}^{n}X_{i}$; you have as much money as your friend if $$ X_{n}\ge L + \sum_{i=1}^{n-1}X_{i}, $$ which implies that $X_{n} \ge L + (n-1)$. That is, each time you flip at least one head and fail to catch your friend, your friend has at least one more dollar for you to match next time. Therefore, the probability of success is at most $$ P[X \ge L] + P[X < L]\,P[X \ge L+1] + P[X < L]\,P[X < L+1]\,P[X \ge L+2] + \cdots. $$

For $L=0$ or $L=1$, the probability of success is obviously $1$. The above bound shows that the probability of success for $L=2$ is at most $2/3$, and that it decreases exponentially with $L$.

6

Edit: This is my (incorrect) attempt at a solution. The induction step is not valid at $L=2$.

Clearly $p(0) = 1$.

Use induction on $L$ and assume that $p(0) = \cdots = p(L-1) = 1$.

Case 1: $L$ is odd, $L = 2m-1.$ Then $$1 = p(m) = \frac{1}{2^{m-1}} + \sum_{i=1}^{m-1} \frac{p(m+i)}{2^i} = \frac{p(L)}{2^{m-1}} + \frac{1}{2^{m-1}} + \underbrace{\sum_{i=1}^{m-2} \frac{1}{2^i}}_{1 - 2^{2-m}}$$ and $p(L) = 1$ follows after multiplying through.

Case 2: $L = 2m-2$ is even. Then $$1 = p(m) = \frac{1}{2^{m-1}} + \sum_{i=1}^{m-1} \frac{p(m+i)}{2^i} = \frac{p(L)}{2^{m-2}} + \frac{p(L+1)}{2^{m-1}} + \frac{1}{2^{m-1}} + (1 - 2^{3-m})$$ and so $$2p(L) + p(L+1) = 3.$$ On the other hand, the inequalities $0 \le p(L), p(L+1) \le 1$ imply immediately $p(L) = 1$.

  • 0
    Doesn't the induction fail for L=2?2012-09-11
  • 0
    @user7530 Good point. It fails for $m=1$ as well, but $p(1) = 1$ is clear. Let me change something.2012-09-11
  • 2
    @user7530 This seems to be a problem. The induction should start at $L=2$ then and showing $p(2) = 1$ seems difficult.2012-09-11
5

Define $Q(L) = P(L)/2^{L+1}$, so that your recurrence relation becomes $Q(L) = 4^{-L} + \sum_{i=1}^{L-1} Q(L+i)$.

Then you get $Q(L+1) - Q(L) = (1-4)\cdot 4^{-L-1} + Q(2L) + Q(2L+1) - Q(L+1)$, thus $2Q(L+1) - Q(L) = -3\cdot 4^{-L-1} + Q(2L) + Q(2L+1)$. Together with $Q(2) = 4^{-2} + Q(3)$, we have an equivalent system.

The space of solutions to this system of equations is a real affine space of infinite dimension: you can simply choose the values of every $Q(2L)$ arbitrarily, then deduce the values of the $Q(2L+1)$. However, there might not be many solutions with $Q(L) \in [0, 2^{-L-1}]$, and you are looking for the smallest one:

Supposing you have a positive solution, $Q(2) = 4^{-2} + Q(3) \\ = 4^{-2} + 4^{-3} + Q(4) + Q(5) \\ = 4^{-2} + 4^{-3} + 4^{-4} + 2Q(5) + Q(6) + Q(7) \\ \ldots \\ \ge 4^{-2} + 4^{-3} + 4^{-4} + 2\cdot 4^{-5} + 3\cdot 4^{-6} + 6\cdot 4^{-7} + 11\cdot 4^{-8} + 22\cdot 4^{-9} + 42\cdot 4^{-10} + \ldots \\ = \sum_n a_2(n)4^{-n} = Q_0(2).$

We can do the same for higher $L$ and define a series $Q_0(L) = \sum_n a_L(n)4^{-n}$. Since we have a positive solution $Q_1(L) = 2^{-1-L}$, those series converge and $0 < Q_0(L) \le Q_1(L)$. The sequence $Q_0(L)$ is a solution of the system of equations, and it is in fact strictly smaller than the trivial solution $Q_1(L)$.

After observing that the coefficients don't grow too fast, we can give a better bound for $Q_0(L)$: for every $L$ and all $n$ large enough, $a_L(2n+1) = 2a_L(2n)$ and $a_L(2n) = 2a_L(2n-1) - a_L(n) < 2a_L(2n-1)$. For example, with $L=2$, $Q_0(2) < 4^{-2} + 4^{-3} + 4^{-4} + 2\cdot 4^{-5} + 4\cdot 4^{-6} + 8\cdot 4^{-7} + \ldots = 11\cdot 2^{-7} < 2^{-3} = Q_1(2)$, and thus $P_0(2) < 11/16$.
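The coefficients $a_L(n)$ can be generated mechanically: repeatedly replace the un-expanded term $Q(m)$ with the smallest index $m$ using $Q(m) = 4^{-m} + Q(m+1) + \cdots + Q(2m-1)$. A short sketch (illustrative code, not from the original answer):

```python
def series_coeffs(L, n_max):
    """Coefficients a_L(n) of Q_0(L) = sum_n a_L(n) * 4^{-n}.

    Repeatedly substitute Q(m) = 4^{-m} + Q(m+1) + ... + Q(2m-1)
    for the smallest pending index m.  Every contribution to a_L(m)
    comes from expanding some m' < m, so popping indices in
    increasing order collects each coefficient completely.
    """
    pending = {L: 1}  # coefficients of not-yet-expanded Q(m) terms
    a = {}
    while pending:
        m = min(pending)
        if m > n_max:
            break
        c = pending.pop(m)
        a[m] = c
        for j in range(m + 1, 2 * m):
            pending[j] = pending.get(j, 0) + c
    return [a.get(n, 0) for n in range(L, n_max + 1)]
```

For $L=2$ this reproduces $1, 1, 1, 2, 3, 6, 11, 22, 42, \ldots$ and lets one check the doubling relations directly.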

To see that you're indeed looking for $P_0$, you just need to check that the probability of getting as much money as your friend is the limit, as $n \to \infty$, of the probability of doing so in less than $n$ throws, which gives rise to essentially the same series.

The problem of giving a closed form expression for $P_0(2)$ and the other $P_0(L)$ is still unresolved.

5

To add to mjqxxxx's answer:

The probability of failure $P_f(L)=P(X_1 < L,\ X_2 < L + X_1,\ X_3 < L + X_1 + X_2,\ \ldots)$ can be bounded from the other side by

$$P_f(L) \le \prod_{n=1}^{\infty} P(X_n < L + X_1 + X_2 + \cdots + X_{n-1})$$

(the inequality sign comes from the observation that if we were to add the chained conditioning to get the exact probabilities, each probability would decrease). But

$$P(X_n < L + X_1 + X_2 + \cdots + X_{n-1}) = \sum_{X_1=1}^\infty 2^{-X_{1}} \sum_{X_2=1}^\infty 2^{-X_{2}} \cdots \sum_{X_{n-1}=1}^\infty 2^{-X_{n-1}} \sum_{X_{n}=1}^{L+X_1+\cdots+X_{n-1}-1} 2^{-X_n} = 1 - 2^{-L+1} (1/3)^{n-1}$$

Hence

$$P_f(L) \le \prod_{k=0}^{\infty } \left(1- 2^{-L+1} (1/3)^{k}\right)=(a;q)_{\infty}$$

where $(a;q)_{\infty}$ is the q-Pochhammer symbol, with $a=2^{-L+1}$ and $q=1/3$.

Some values: $P_f(2) \le 0.3826$, $P_f(3) \le 0.659$, $P_f(4) \le 0.82116$. In particular, the probability of success for $L=2$ is greater than $0.61740$ (and, according to mjqxxxx's answer, less than $0.66666$).
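These values are easy to reproduce by truncating the product; a minimal sketch (illustrative code, not from the original answer), where 50 factors are ample because the factors approach $1$ geometrically:

```python
def failure_bound(L, terms=50):
    """Upper bound on P_f(L): the q-Pochhammer product (a; q)_inf
    with a = 2^{1-L} and q = 1/3, truncated to `terms` factors."""
    a, q = 2.0 ** (1 - L), 1.0 / 3.0
    bound = 1.0
    for k in range(terms):
        bound *= 1.0 - a * q ** k
    return bound
```

Evaluating `failure_bound(L)` for $L = 2, 3, 4$ matches the values quoted above.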