
I think the question is a simple one, but I've been unable to answer it or find an answer for it yet:

There's a simple game: if you flip heads you win a dollar (from the house), but if you flip tails you lose a dollar (to the house).

If I start with $n$ dollars (and the house has infinite money), how many flips can I expect to make before I've lost all my money? This is different from the common question of how many flips I can make before I have a run of length $n$. Here you can lose all your money without ever having a run of length more than $2$, for example, simply by repeating win $1$, lose $2$, win $1$, lose $2$, etc.

I can write out a decision tree on this, but I haven't been able to generalize it into a formula yet.

  • @Dan: No - the probability of going bust eventually is $1$ (compared with $\frac{a}{a+b}$ if the bank has a finite amount $a$), so the expected time to have an N% chance of going bust is finite, if N% < 100%. (2012-07-17)

5 Answers


Assuming this is a symmetric simple random walk (with probability of gaining $1$ unit of fortune equal to $0.5$), where you start with a fortune of $n$, it is a simple matter of calculating the expected time of hitting the origin: that is, the expected time for the walk to reach $0$.

Since I have assumed the random walk to be symmetric, the probability that the walk, say $S$, hits the point $0$ for the first time at the $k$-th step when started from $n$ is the same as the probability that the walk, started from $0$, hits $n$ for the first time at the $k$-th step, under the assumption that the gambler is allowed to go into debt as well. (This assumption is made NOT in the original problem, but in the equivalent one I devised, where the gambler starts from fortune $0$ and has to reach fortune $n$.)

This probability is:

$f_{n}(k) = \frac{n}{k}\,\mathbb{P}(S_{k} = n)$

where $\mathbb{P}(S_{k} = n) = {k\choose \frac{1}{2}(k + n)}0.5^{k}$ when $k \ge n$ and $k+n$ is even, and $0$ otherwise.

With the probability mass function $f_{n}(k)$ available, you can readily compute the expectation.

Source: Probability and Random Processes, by Geoffrey Grimmett and David Stirzaker (3rd ed.)

Please correct me if I made any errors in my reasoning.

EDIT: As mentioned by others, the expectation computed by the aforementioned method doesn't converge, i.e. it is $+\infty$. Still, I would like to know whether my reasoning is correct!

Good question!
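To see both points numerically, here is a short Python sketch (my own addition, not from the book): it evaluates $f_n(k)$ for $n=1$, accumulates the total ruin probability, and tracks the partial sums of $k\,f_n(k)$, which keep growing without bound.

```python
from math import comb

def f(n, k):
    """First-passage pmf of the symmetric simple random walk:
    f_n(k) = (n/k) * P(S_k = n), nonzero only when k >= n and k + n is even."""
    if k < n or (k + n) % 2:
        return 0.0
    # keep the arithmetic in exact integers before the final division
    return n * comb(k, (k + n) // 2) / (k * 2**k)

n = 1
mass = partial_mean = 0.0
for k in range(1, 4001):
    p = f(n, k)
    mass += p              # P(ruined within k flips) -> 1 as k grows
    partial_mean += k * p  # partial sums of E[T]; these never level off
    if k in (10, 100, 1000, 4000):
        print(f"k={k:5d}: mass={mass:.4f}, partial expectation={partial_mean:.1f}")
```

The accumulated mass creeps toward $1$ (ruin is certain), while the partial expectation grows like $\sqrt{k}$, consistent with a divergent mean.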


Here is a direct calculation: Let $f(n)$ be the expected length of the game when we start with $n$ dollars. Obviously $f(0)=0$, and for $n\ge 1$ we must have $$f(n) = 1 + \tfrac12 f(n-1) + \tfrac12 f(n+1).$$ Thus, if $f(n+1)$ is infinite, $f(n)$ is also infinite, since half of infinity is still infinity, and adding $1+\frac 12 f(n-1)$ only makes it larger. So if $f(n)$ is infinite for any $n$, it must be infinite all the way down to $f(1)$.

Okay, let us assume that $f(1)$ is finite and see if we can reach a contradiction. The above formula can be rearranged (with $k=n+1$) to give $$f(k) = 2f(k-1) - f(k-2) - 2,$$ so knowing $f(0)$ and $f(1)$ will enable us to calculate $f(n)$ for any $n$. Let's set $f(1)=a$ and see where that gets us:
$$f(0)=0, \quad f(1)=a, \quad f(2)=2a-2, \quad f(3)=3a-6, \quad f(4)=4a-12,$$
at which point we notice a pattern and conjecture $$f(n) = na - n(n-1) = n(a+1-n).$$ This formula satisfies the recurrence (some tedious but straightforward algebra omitted here), so it must be right. On the other hand, it says that $f(n)$ becomes negative when $n>a+1$, which is absurd for an expected game length, no matter what finite value we take $a$ to be. The only way out is to conclude that $f(1)$ was not finite after all.
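As a quick sanity check (a Python sketch of my own, using the arbitrary trial value $a=10$), one can iterate the recurrence, confirm the closed form, and watch $f(n)$ turn negative:

```python
# Iterate f(k) = 2 f(k-1) - f(k-2) - 2 from f(0) = 0 and f(1) = a,
# checking the conjectured closed form f(n) = n * (a + 1 - n) along the way.
a = 10  # arbitrary finite trial value for f(1)
f_prev, f_cur = 0, a
for n in range(2, 16):
    f_prev, f_cur = f_cur, 2 * f_cur - f_prev - 2
    assert f_cur == n * (a + 1 - n), "closed form disagrees with recurrence"
    if f_cur < 0:
        print(f"f({n}) = {f_cur} < 0 -- absurd for an expected game length")
        break
```

With $a=10$ the sign flips at $n=12$, exactly where $n>a+1$ predicts.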


The expected length of the coin-flip game is infinite, as shown in Corollary 1 of this package. If two people start with $m$ and $n$ coins, the expected length of the game is $mn$.

  • Thanks for all the answers from all. Very informative. I find it very surprising that the length is linear with respect to both $m$ and $n$ (but I believe it). With that said, I do plan to write a little program to test this. :-) (2012-07-16)
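In the spirit of the little program the comment proposes, here is a minimal Monte Carlo sketch in Python (a hypothetical illustration, not the commenter's actual code) that estimates the expected game length for finite stakes $m$ and $n$:

```python
import random

def game_length(m, n):
    """Flip a fair coin until one player is broke; return the number of flips.
    Player A starts with m coins, player B with n."""
    a, flips = m, 0
    while 0 < a < m + n:
        a += 1 if random.random() < 0.5 else -1
        flips += 1
    return flips

random.seed(1)
trials = 20000
for m, n in [(1, 5), (3, 4), (5, 5)]:
    avg = sum(game_length(m, n) for _ in range(trials)) / trials
    print(f"m={m}, n={n}: simulated {avg:.2f}  vs  m*n = {m * n}")
```

The simulated averages land close to $mn$ for each pair of stakes.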

Here is a solution where one computes nothing.

Let $t_k$ be the expected number of flips until one is out of money when one's initial fortune is $k$, and let us compute $t_1$. After the first flip, either one is out of money or one's fortune is $2$; thus $t_1=\frac12(1)+\frac12(1+t_2)$. To compute $t_2$, note that to be out of money starting from a fortune of $2$, one first needs to lose $1$, which takes $t_1$ flips on average, and then to lose $1$ again, which again takes $t_1$ flips on average. Thus $t_2=2t_1$, and $t_1=1+t_1$.

The equation $t_1=1+t_1$ has a unique solution, which is $t_1=+\infty$.
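This infinity is visible empirically, too. The following Python sketch (my addition) truncates each game at a cap on the number of flips; the truncated sample mean keeps growing as the cap increases (roughly like $\sqrt{\text{cap}}$), instead of settling toward a finite value:

```python
import random

def ruin_time(cap):
    """Flips until a fortune starting at 1 hits 0, truncated at `cap` flips."""
    fortune, t = 1, 0
    while fortune > 0 and t < cap:
        fortune += 1 if random.random() < 0.5 else -1
        t += 1
    return t

random.seed(1)
trials = 1000
for cap in (10**2, 10**4, 10**6):
    mean = sum(ruin_time(cap) for _ in range(trials)) / trials
    print(f"cap={cap}: truncated mean ruin time ~ {mean:.0f}")
```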

  • @copper.hat Whew. Relieved I am. (2012-07-19)

Your question can be described via a simple random walk on $\mathbb Z$: Choose i.i.d. random variables $X_1, X_2, \ldots$ such that

$P[X_i = +1] = P[X_i = -1] = \frac 12$

$X_i$ describes the outcome of the $i$-th toss of a coin: $+1$ means we win one dollar, $-1$ we lose one dollar. Furthermore, let $S_k = \sum_{i=1}^k X_i$ be the amount of money lost or won until time $k$.

Now, for a positive integer $n$ (which describes the initial amount of money we own), define $T_{-n} = \inf\{k\in \mathbb N\mid S_k = -n\}$ to be the number of coin tosses until we go bankrupt. Then we have

Lemma: $P[T_{-n} > k] = \Theta(1/\sqrt k) \qquad \text{ as }k\to \infty$

which is to say: the probability of not being bankrupt after the $k$-th toss of a coin decreases like $1/\sqrt{k}$ (in particular, it goes to zero, so we will go bankrupt eventually). On the other hand, the calculation $E[T_{-n}]= \sum_{k=1}^\infty P[T_{-n}>k] \approx \sum_{k=1}^\infty \frac{1}{\sqrt{k}} = \infty$ shows

Corollary: $E[T_{-n}] = \infty$

which says that we may well have to wait a very, very long time before we go bankrupt.



In an attempt to be self-contained, here is the

Proof of the Lemma: The proof involves two steps:

Claim 1: We have $P[S_n = 2k-n] = \binom{n}{k}2^{-n}$ for $0 \le k\le n$, and $P[S_n = x] = 0$ for all other $x$.

and

Claim 2: For $k>0$ we have $P[T_{-n} \le k] = P[S_k \notin (-n, n]\,] $

For Claim 1, just note that in order to have $S_n = 2k-n$ we need to win $k$ times and lose $n-k$ times, and there are $\binom nk$ possibilities to choose the $k$ winning tosses out of $n$.

For Claim 2, write $P[T_{-n} \le k] = \sum_{b = -\infty}^\infty P[T_{-n} \le k, S_k = b]$ and notice that for $b > -n$ we have $$P[T_{-n} \le k, S_k = b] = P[S_k = -2n - b].$$ The last assertion is obtained by reflecting each path that visits $-n$ and ends at $b$, from the first time it hits $-n$: in the second part of the path (after first hitting $-n$), every step $X_i = \pm 1$ is swapped to $X_i = \mp 1$. This gives a one-to-one correspondence between the paths ending at $b>-n$ that visit $-n$ and the paths ending at $-2n-b$. (This is sometimes called the reflection principle.) So

\begin{align} P[T_{-n} \le k] &= \sum_{b = -\infty}^\infty P[T_{-n} \le k, S_k = b] \\ &= \sum_{b\le -n} P[S_k = b] + \sum_{b> -n} P[S_k = -2n -b] \\ &= P[S_k = -n] + 2P[S_k < -n] \\[6pt] &= P[S_k \notin (-n,n]\, ] \end{align}

Therefore (even values of $k$ suffice by monotonicity)

\begin{align} P[T_{-n} > 2k] &\ge P[S_{2k} = 0\, ] = \binom{2k}{k} 2^{-2k} \sim \frac{1}{\sqrt{\pi k}} \\ P[T_{-n} > 2k] &\le 2n \cdot P[S_{2k} = 0\, ] = 2n\cdot \binom{2k}{k} 2^{-2k} \sim \frac{2n}{\sqrt{\pi k}} \end{align} q.e.d.
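For what it's worth, the lemma is easy to check numerically. This Python sketch (my addition) computes $P[T_{-n} > k] = P[S_k \notin (-n, n]\,]$'s complement exactly from Claims 1 and 2 and confirms that $\sqrt{k}\;P[T_{-n} > k]$ stabilizes:

```python
from math import comb, sqrt

def p_S(k, x):
    """P(S_k = x) for the symmetric simple random walk (Claim 1)."""
    if (k + x) % 2 or abs(x) > k:
        return 0.0
    return comb(k, (k + x) // 2) / 2**k

def p_survive(n, k):
    """P(T_{-n} > k) = P(S_k in (-n, n]) by Claim 2."""
    return sum(p_S(k, x) for x in range(-n + 1, n + 1))

n = 3
for k in (100, 1000, 10000):
    p = p_survive(n, k)
    print(f"k={k}: P[T > k] = {p:.4f},  sqrt(k) * P = {sqrt(k) * p:.3f}")
```

The product $\sqrt{k}\,P[T_{-n}>k]$ settles near a constant, matching the $\Theta(1/\sqrt{k})$ rate in the lemma.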