
This is a problem about the meeting time of several independent random walks on the lattice $\mathbb{Z}^1$:

Suppose there is a thief at the origin 0 and $N$ policemen at the point 2.

The thief and the policemen begin their random walks independently at the same time, all following the same rule: at each step, move left or right, each with probability 1/2.

Let $\tau_N$ denote the first time that some policeman meets the thief.

It's not hard to prove $E\tau_1=\infty$.

So what is the smallest $N$ such that $E\tau_N<\infty$?

I was shown this problem in my undergraduate course on Markov chains, but my teacher did not give the solution. Does anyone know a solution, or references for this problem?

  • Crossposted to MO: http://mathoverflow.net/questions/106133/random-walk-police-catching-the-thief (2012-09-02)

2 Answers


First some simplifications. Pretend that the origin moves with the thief; then we can say that the thief always stands still, while each policeman (rather, each police officer) moves 2 spots left with probability $1/4$, moves 2 spots right with probability $1/4$, and stands still with probability $1/2$. Now we might as well scale down by a factor of 2, so that the moves to the left and right are 1 spot each (and the officers start at 1 instead of 2, though that won't affect whether the expectation is infinite). Finally, if you like, ignore the possibility of standing still and just let each officer random walk on $\mathbb Z$ with probabilities $1/2$ to the left and right: the standing still just multiplies the expectation by $2$, the expected time it takes to move a spot.
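For concreteness, the first simplification is easy to sanity-check numerically. Here is a quick simulation (my own sketch, Python) of the officer-minus-thief difference per step:

```python
from collections import Counter
import random

# Empirical distribution of (officer step) - (thief step), where each step
# is ±1 with probability 1/2: the difference should be -2, 0, +2 with
# probabilities 1/4, 1/2, 1/4.
rng = random.Random(1)
trials = 40000
counts = Counter(rng.choice((-1, 1)) - rng.choice((-1, 1)) for _ in range(trials))
print({d: counts[d] / trials for d in (-2, 0, 2)})
```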

So we've reduced to the following problem: $N$ police officers do a standard random walk on $\mathbb Z$, starting at 1. Is the expectation of the time it takes for one of them to hit the origin finite or infinite?
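For anyone who wants to experiment with the reduced problem, here is a small Monte Carlo sketch (function and variable names are mine):

```python
import random

def first_hit_time(n_officers, max_steps, rng):
    """Steps until some officer's walk (each started at 1) first reaches 0,
    or None if that hasn't happened within max_steps."""
    pos = [1] * n_officers
    for t in range(1, max_steps + 1):
        for i in range(n_officers):
            pos[i] += rng.choice((-1, 1))
            if pos[i] == 0:
                return t
    return None

# With a single officer, the walk hits 0 on the very first step with
# probability 1/2; the simulated fraction should be close to that.
rng = random.Random(0)
trials = 2000
frac = sum(first_hit_time(1, 1, rng) == 1 for _ in range(trials)) / trials
print(frac)
```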

A general identity will be useful here. Let $X_k$ be a sequence of independent coin flips (not necessarily fair or identical coins) with possible outcomes S (Success) or F (Failure). Let $FS(n)$ be the probability that the $n$th flip is the First Success (that is, $X_1=\cdots=X_{n-1}={}$F and $X_n={}$S). Let $NSY(n)$ be the probability that there have been No Successes Yet by the $n$th step (that is, $X_1=\cdots=X_{n}={}$F), with $NSY(0)=1$. Then the expected time of the first success is $ \sum_{n=1}^\infty n\,FS(n) = \sum_{n=1}^\infty n \big( NSY(n-1)-NSY(n) \big) = \sum_{m=0}^\infty NSY(m) $ after telescoping the sum.
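This is the standard "expectation equals the sum of the tail probabilities" identity, and it can be checked numerically on a simple case: a biased coin with success probability $p$, where both sides should come out to $1/p$. A quick sketch:

```python
# Check E[first success] = sum_m NSY(m) for a coin with success probability p.
# Left side:  sum over n of n * FS(n), where FS(n) = (1-p)^(n-1) * p.
# Right side: sum over m of NSY(m), where NSY(m) = (1-p)^m.
p = 0.3
cutoff = 200  # truncation point; the neglected tails are astronomically small
lhs = sum(n * (1 - p) ** (n - 1) * p for n in range(1, cutoff))
rhs = sum((1 - p) ** m for m in range(cutoff))
print(lhs, rhs)  # both should be very close to 1/p
```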

Now I believe (but someone should verify) that in a standard random walk on $\mathbb Z$ starting at 1, the probability that 0 has not been hit yet by the $n$th step is asymptotically proportional to $1/\sqrt n$. (This probably has to do with Catalan numbers.) This is one way of seeing that $E\tau_1$ is infinite: the expectation is the sum of the series $\sum_m NSY(m)$, whose terms are essentially a constant times $1/\sqrt m$, so the series diverges.
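This belief can be tested exactly by evolving the walk's distribution with $0$ made absorbing. In the sketch below, $\sqrt{n}\cdot NSY(n)$ levels off numerically at a constant that looks like $\sqrt{2/\pi}$ (that value is my own numerical observation, not part of the original claim):

```python
from math import sqrt, pi

# Exact NSY(n) = P(walk started at 1 has not hit 0 in the first n steps),
# computed by propagating the distribution with 0 as an absorbing state.
def survival_probs(n_steps):
    dist = {1: 1.0}
    probs = []
    for _ in range(n_steps):
        new = {}
        for x, p in dist.items():
            for y in (x - 1, x + 1):
                if y > 0:          # any mass that reaches 0 is killed
                    new[y] = new.get(y, 0.0) + p / 2
        dist = new
        probs.append(sum(dist.values()))
    return probs

nsy = survival_probs(4000)
# If NSY(n) ~ c / sqrt(n), then sqrt(n) * NSY(n) should level off;
# numerically the constant is close to sqrt(2/pi) ≈ 0.798.
for n in (500, 1000, 2000, 4000):
    print(n, sqrt(n) * nsy[n - 1])
```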

In the $N$-officer case, the No Success Yet probability is just the $N$th power of the No Success Yet probability when $N=1$. Therefore for $N=2$, the series will still (barely) diverge, but it will converge for $N=3$; in other words, $E\tau_2 = \infty$ but $E\tau_3<\infty$.
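One way to watch this happen numerically: using the closed form $NSY(2j) = \binom{2j}{j}/4^j$ (together with $NSY(2j-1) = NSY(2j)$, since the hitting time is odd; take this formula on the same trust as the asymptotics above), the partial sums of $\sum_m NSY(m)^N$ keep growing for $N = 1, 2$ but level off for $N = 3$:

```python
# Partial sums of sum_m NSY(m)^N, where NSY(2j) = C(2j,j)/4^j is built up
# iteratively via C(2j,j)/4^j = product over i <= j of (2i-1)/(2i).
def partial_sum(N, J):
    total, nsy = 1.0, 1.0      # m = 0 term is NSY(0)^N = 1
    for j in range(1, J + 1):
        nsy *= (2 * j - 1) / (2 * j)
        total += 2 * nsy ** N  # covers the equal terms m = 2j-1 and m = 2j
    return total

for N in (1, 2, 3):
    print(N, [round(partial_sum(N, J), 3) for J in (100, 1000, 10000)])
```

The $N=1$ column grows like $\sqrt J$, the $N=2$ column like $\log J$ (divergent, but barely), and the $N=3$ column is essentially constant by $J=1000$.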

  • Aha. For example, it's impossible (in the post-first-simplification description) for one officer to move 2 spots left while another moves 2 spots right. Not independent - good point. (2012-08-30)

(I do not have enough reputation to leave comments, so I am writing this as a new post.) Using Greg Martin's argument together with an additional coupling argument (using the fact that at any time $n$, the random walk corresponding to the thief is at a nonnegative point with probability $>1/2$), you can show that $\mathbb{E}\tau_2 = +\infty$ as well.

His comment about the Catalan numbers is also correct: the only way to drop below $0$ for the first time on step $2j+1$ is to have a Dyck path of length $2j$ followed by a downward step, so the corresponding probability is about $Cj^{-3/2}$; summing this over $j \ge n$ shows that the probability of not having dropped below $0$ by step $2n$ decays like $n^{-1/2}$.
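For what it's worth, the Dyck-path formula is easy to check numerically. In the sketch below, the constant works out to about $1/(2\sqrt{\pi})$; treat that value as my own observation, not part of the original argument:

```python
from math import comb, sqrt, pi

# P(first passage below 0 happens at step 2j+1) = Catalan(j) / 2^(2j+1),
# where Catalan(j) = C(2j, j) / (j + 1) counts Dyck paths of length 2j.
def first_passage(j):
    catalan = comb(2 * j, j) // (j + 1)
    return catalan / 2 ** (2 * j + 1)

# These probabilities should sum to 1 (the walk eventually drops below 0)...
total = sum(first_passage(j) for j in range(2000))
print(total)
# ...and decay like C * j^(-3/2): j^(3/2) * first_passage(j) levels off.
for j in (100, 200, 400):
    print(j, j ** 1.5 * first_passage(j))
```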

I don't yet know how to show that $\mathbb{E}\tau_3<+\infty$, though.