
Here's the description of Banach's matchbox problem from Concrete Mathematics, EXERCISE 8.46 (edited):

Stefan Banach used to carry two boxes of matches, one containing $m$ matches and the other one containing $n$ matches. Whenever he needed a light he chose a box at random, each with probability $\frac 1 2$, independently of his previous choices. After taking out a match he'd put the box back in his pocket (even if the box became empty -- all famous mathematicians used to do this). When his chosen box was empty he'd throw it away and reach for the other box.

  • Once he found that the other box was empty too. What's the probability $p_{m,n}$ that this occurs?
  • Generalizing your answer to part (a), find a closed form for the probability $p_{k,m,n}$ that exactly $k$ matches are in the other box when an empty one is first thrown away.

The answers to these two problems show only two generating functions and the closed forms:

(a) \[ P(w,z) = 1 + \frac 1 2 \bigl(wP(w,z) + zP(w,z)\bigr), \quad\text{so}\quad P(w,z) = \Bigl(1 - \frac 1 2 (w+z)\Bigr)^{-1}, \tag 1 \] hence \[ p_{m,n} = 2^{-m-n} \binom {m+n} n. \]

(b) \[ P_k(w,z) = \frac 1 2 (w^k+z^k)P(w,z), \tag 2 \] hence \[ p_{k,m,n} = 2^{k-1-m-n} \left( \binom{m+n-k}m + \binom{m+n-k}n \right). \]

where $P(w,z) = \sum_{m,n \ge 0} p_{m,n}w^mz^n$ and $P_k(w,z) = \sum_{m,n \ge 0} p_{k,m,n}w^mz^n$ are the generating functions for $p_{m,n}$ and $p_{k,m,n}$.

I'm confused that the two generating-function equations come first and the closed forms follow; I don't see how the equations arise directly.

There might be a reasonable explanation for formula (1):

We have $p_{m,n} = \frac 1 2 (p_{m-1,n} + p_{m,n-1}) + [m=n=0]$ for all integers $m,n$, where $p_{m,n} = 0$ whenever $m<0$ or $n<0$, and $[P]$ is the Iverson bracket. Multiplying by $w^mz^n$ and summing over all integers $m,n$, we get equation (1). But equation (2) has no such derivation. Although we can interpret $p_{k,m,n} = \frac 1 2 (p_{m-k,n} + p_{m,n-k})$ combinatorially, this doesn't explain why the generating function arises, because the preceding formula gives the closed form directly, without generating functions.
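The recurrence is easy to check numerically against the closed form $p_{m,n} = 2^{-m-n}\binom{m+n}{n}$. A quick sketch (the helper names `p_recurrence` and `p_closed` are just for illustration):

```python
from math import comb

def p_recurrence(M, N):
    """Tabulate p_{m,n} from the recurrence
    p_{m,n} = (p_{m-1,n} + p_{m,n-1})/2 + [m = n = 0],
    with p_{m,n} = 0 for negative indices."""
    p = [[0.0] * (N + 1) for _ in range(M + 1)]
    for m in range(M + 1):
        for n in range(N + 1):
            left = p[m - 1][n] if m > 0 else 0.0
            down = p[m][n - 1] if n > 0 else 0.0
            p[m][n] = (left + down) / 2 + (1.0 if m == n == 0 else 0.0)
    return p

def p_closed(m, n):
    """Closed form p_{m,n} = 2^{-m-n} * C(m+n, n)."""
    return comb(m + n, n) / 2 ** (m + n)

table = p_recurrence(8, 8)
for m in range(9):
    for n in range(9):
        assert abs(table[m][n] - p_closed(m, n)) < 1e-12
```

The tabulation agrees with the closed form on every entry, so the recurrence and formula (1)'s solution match.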

So I guess that there's a direct way to obtain equations (1) and (2), without the assistance of recurrence relations, just like section 7.1 DOMINO THEORY AND CHANGE, or section 8.4 FLIPPING COINS.

Any help? Thanks!

Postscript: There's a combinatorial proof for (b). Note that (a) is the special case $k = 0$ of (b). $\newcommand{\length}{\mathrm{length}}$

First, we define the probability space for the matchbox problem. We use the symbol $L$ to indicate fetching a match from the left matchbox, and the symbol $R$ from the right one. The elementary events are defined as strings of $L$ and $R$ that end with an attempt to fetch a match from an empty matchbox. For example, when $m=2$ and $n=1$, we have the probability space $\Omega = \{LLL, LLRL, LLRR, LRLL, LRLR, LRR, RLLL, RLLR, RLR, RR\}$. The elementary event $LLRR$ means that the 1st and 2nd matches are from the left matchbox, and the 3rd is from the right one; the 4th attempt tries to fetch a match from the right matchbox, but it's empty. $\length(\omega)$ denotes the length of the elementary event $\omega$, and $\Pr(\omega) = 2^{-\length(\omega)}$.
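The probability space is small enough to generate by brute force, which reproduces the $m=2$, $n=1$ example above. A sketch (the function name `omega` is just illustrative):

```python
def omega(m, n):
    """Enumerate all elementary events for box contents (m, n):
    strings over {L, R} ending with a draw from an empty box."""
    events = []
    def walk(prefix, left, right):
        # Drawing from the left box: if it's empty, the event ends here.
        if left == 0:
            events.append(prefix + "L")
        else:
            walk(prefix + "L", left - 1, right)
        # Drawing from the right box, symmetrically.
        if right == 0:
            events.append(prefix + "R")
        else:
            walk(prefix + "R", left, right - 1)
    walk("", m, n)
    return sorted(events)

space = omega(2, 1)
# The ten strings listed above, and Pr(omega) = 2^{-length} sums to 1.
assert space == ["LLL", "LLRL", "LLRR", "LRLL", "LRLR",
                 "LRR", "RLLL", "RLLR", "RLR", "RR"]
assert abs(sum(2.0 ** -len(w) for w in space) - 1.0) < 1e-12
```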

Now let's enumerate all elementary events for problem (b). There are two cases. Case 1: the right matchbox has $k$ matches remaining. Let $\Omega_1$ be the set of such elementary events. Each $\omega \in \Omega_1$ contains $m+1$ $L$'s and $n-k$ $R$'s, and its last symbol is $L$, so $\Pr(\omega) = 2^{-\length(\omega)} = 2^{-(m+n-k+1)}$, and $|\Omega_1| = \dbinom{m+n-k}m$. Case 2: the left matchbox has $k$ matches remaining; call this set $\Omega_2$. We have $|\Omega_2| = \dbinom{m+n-k}n$, and for each $\omega \in \Omega_2$, again $\Pr(\omega) = 2^{-\length(\omega)} = 2^{-(m+n-k+1)}$.

Thus \begin{align*} p_{k,m,n} &= \sum_{\omega \in \Omega_1 \cup \Omega_2} \Pr(\omega) \\ &= \sum_{\omega \in \Omega_1} \Pr(\omega) + \sum_{\omega \in \Omega_2} \Pr(\omega) \\ &= 2^{-(m+n-k+1)} \left( \binom{m+n-k}{m} + \binom{m+n-k}{n} \right) \end{align*}
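To double-check this closed form, the sum over $\Omega_1 \cup \Omega_2$ can be computed by exhaustively walking all elementary events; `p_k_enumerated` and `p_k_closed` below are hypothetical helper names:

```python
from math import comb

def p_k_enumerated(k, m, n):
    """Sum Pr(omega) over elementary events where the other box
    holds exactly k matches when an empty box is first found."""
    total = 0.0
    stack = [(m, n, 1.0)]  # (matches left, matches right, path probability)
    while stack:
        left, right, prob = stack.pop()
        # Reach for the left box (probability 1/2 at every step).
        if left == 0:
            if right == k:
                total += prob / 2
        else:
            stack.append((left - 1, right, prob / 2))
        # Reach for the right box.
        if right == 0:
            if left == k:
                total += prob / 2
        else:
            stack.append((left, right - 1, prob / 2))
    return total

def p_k_closed(k, m, n):
    """Closed form 2^{-(m+n-k+1)} (C(m+n-k, m) + C(m+n-k, n))."""
    return (comb(m + n - k, m) + comb(m + n - k, n)) / 2 ** (m + n - k + 1)

for k in range(6):
    assert abs(p_k_enumerated(k, 5, 4) - p_k_closed(k, 5, 4)) < 1e-12
```

(Note that Python's `math.comb(a, b)` returns $0$ when $b > a$, which matches the binomial-coefficient convention used in the formula.)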

2 Answers


Just like you did for the first part, by conditioning on the first match removed, we get the recurrence \[ p_{k,m,n} = \frac{1}{2} p_{k,m-1,n} + \frac{1}{2} p_{k,m,n-1} + \frac{1}{2}[m=k, n=0 \;\text{or}\; m=0, n=k], \] where $p_{k,i,j}$ is $0$ if $i<0$ or $j<0$. Why the factor of $\frac{1}{2}$? When you're at exactly $0$ and $k$, you need to pick next from the box with $0$ matches, which happens half the time.

When we translate the recurrence into a power series, we get \[ P_k(w,z) = \frac{w+z}{2}P_k(w,z) + \frac{1}{2}(w^k + z^k), \] so \[ P_k(w,z) = \frac{1}{2} \cdot \frac{w^k+z^k}{1 - \frac{w+z}{2}}, \] where as before \[ P(w,z) = \frac{1}{1 - \frac{w+z}{2}}. \]
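One way to confirm this is to extract coefficients: expanding $P(w,z) = \sum_j 2^{-j}(w+z)^j$ gives $[w^mz^n]\,P = 2^{-m-n}\binom{m+n}{n}$, and multiplying by $\frac12(w^k+z^k)$ just shifts one index by $k$. A sketch of that check (helper names are illustrative):

```python
from math import comb

def coeff_P(m, n):
    """Coefficient of w^m z^n in P(w,z) = 1/(1 - (w+z)/2):
    expanding (w+z)^j termwise gives 2^{-(m+n)} C(m+n, n)."""
    return comb(m + n, n) / 2 ** (m + n)

def coeff_Pk(k, m, n):
    """Coefficient of w^m z^n in P_k = (w^k + z^k)/2 * P(w,z):
    the w^k factor shifts the w-index, the z^k factor the z-index."""
    a = coeff_P(m - k, n) if m >= k else 0.0
    b = coeff_P(m, n - k) if n >= k else 0.0
    return (a + b) / 2

def p_k_closed(k, m, n):
    """Closed form 2^{k-1-m-n} (C(m+n-k, m) + C(m+n-k, n))."""
    return (comb(m + n - k, m) + comb(m + n - k, n)) / 2 ** (m + n - k + 1)

for k in range(4):
    for m in range(k, 7):
        for n in range(k, 7):
            assert abs(coeff_Pk(k, m, n) - p_k_closed(k, m, n)) < 1e-12
```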

  • Alright. I'm looking for anybody else interpreting equation (2) directly; otherwise I'll take your answer. 2012-06-05

Lol not quite anybody else, but here goes: interpret the problem as a random walk on the lattice $\mathbb{Z}^2$, where you start at $(m,n)$, and at each step move either one down or one left.

The problem is then: what are the odds you hit either $(k,0)$ or $(0,k)$, times $\frac{1}{2}$, the half being there for the same reason as in the recurrence above.

By the nature of the walk, these are disjoint events. So our desired probability is sum of the probabilities of hitting $(k,0)$ and hitting $(0,k)$ (times $\frac{1}{2}$ still).

Now I claim the odds of hitting $(k,0)$ are exactly $p_{m-k,n}$. Indeed this is clear: shift the starting point and the desired hitting point left by $k$, and you have the first problem.
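This hitting-probability claim can be verified by dynamic programming over the lattice; a sketch under the same walk model (the name `hit_prob` is just illustrative):

```python
from math import comb

def hit_prob(m, n, target):
    """Probability that the walk from (m, n), stepping to (x-1, y) or
    (x, y-1) with probability 1/2 each, passes through `target`."""
    tx, ty = target
    h = {(tx, ty): 1.0}  # h(x, y) = hitting probability from (x, y)
    def rec(x, y):
        if (x, y) in h:
            return h[(x, y)]
        if x < tx or y < ty:  # walked past the target; can't hit it
            h[(x, y)] = 0.0
        else:
            h[(x, y)] = (rec(x - 1, y) + rec(x, y - 1)) / 2
        return h[(x, y)]
    return rec(m, n)

def p_closed(m, n):
    """Part (a): p_{m,n} = 2^{-m-n} C(m+n, n)."""
    return comb(m + n, n) / 2 ** (m + n)

m, n, k = 6, 4, 2
# Hitting (k, 0) from (m, n) is the same as hitting (0, 0) from
# (m - k, n), whose probability is p_{m-k, n}; symmetrically for (0, k).
assert abs(hit_prob(m, n, (k, 0)) - p_closed(m - k, n)) < 1e-12
assert abs(hit_prob(m, n, (0, k)) - p_closed(m, n - k)) < 1e-12
```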

  • haha yeah, it's the same idea as in your comments, just applied to the second problem. I literally came on to mention you could do that. nice man 2012-06-05