
Let $S_0 = 0$ and define $S_n = \sum^n_{i = 1} X_i$ such that \begin{align*} \mathbb P(X_i = 1) &= p \\ \mathbb P(X_i = -1) &= 1 - p = q \end{align*}

for $p < \frac{1}{2}$. Find the distribution of $Y = \max \{S_0, S_1, S_2, ...\}$.

My attempt at a solution: One known result (and a nice application of path counting/the reflection principle) is that if $Y_n = \max \{S_0, S_1, ..., S_n\}$ then \begin{equation*} \mathbb P(Y_n \geq r, S_n = b) = \begin{cases} \mathbb P(S_n = b) & b \geq r \\ \left(\frac{q}{p}\right)^{r - b} \mathbb P(S_n = 2r - b) & b < r \end{cases} \end{equation*}

and so, for $r \geq 1$, we find \begin{align*} \mathbb P(Y_n \geq r) &= \mathbb P(S_n \geq r) + \sum^{r - 1}_{b = -\infty} \left(\frac{q}{p}\right)^{r-b} \mathbb P(S_n = 2r - b) \\ &= \mathbb P(S_n = r) + \sum^\infty_{c = r + 1} \left[1 + \left(\frac{q}{p}\right)^{c - r}\right] \mathbb P(S_n = c) \end{align*}
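As a sanity check, the identity above can be verified by brute force for small $n$: enumerate all $2^n$ paths, compute the exact law of $(Y_n, S_n)$, and compare with the right-hand side. A short Python sketch (the function name and parameter choices are my own, purely for illustration):

```python
from itertools import product

def check_max_formula(n=8, r=3, p=0.3):
    """Compare the exact P(Y_n >= r), found by enumerating all 2^n
    paths of the biased +/-1 walk, against the reflection-principle
    formula P(S_n = r) + sum_{c>r} [1 + (q/p)^(c-r)] P(S_n = c)."""
    q = 1 - p
    prob_max_ge_r = 0.0
    dist = {}  # exact distribution of S_n
    for steps in product((1, -1), repeat=n):
        weight = p ** steps.count(1) * q ** steps.count(-1)
        s, running_max = 0, 0
        for x in steps:
            s += x
            running_max = max(running_max, s)
        if running_max >= r:
            prob_max_ge_r += weight
        dist[s] = dist.get(s, 0.0) + weight
    formula = dist.get(r, 0.0) + sum(
        (1 + (q / p) ** (c - r)) * dist.get(c, 0.0)
        for c in range(r + 1, n + 1)
    )
    return prob_max_ge_r, formula
```

(The sum over $c$ stops at $n$ because $P(S_n = c) = 0$ for $c > n$.) Both returned values agree to machine precision.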

However, this was for the maximum over a random walk of finite length $n$. In the present case we're interested in the maximum over all $n \in \mathbb N$, and it's not immediately obvious to me how to pass from the finite-horizon result to this case.

Thank you for any input!

  • Hint: Define a function $f(n):= \big( \frac{1-p}{p} \big)^n$. Then $f$ is a harmonic function for this random walk. In other words, $f(n) = pf(n+1)+(1-p)f(n-1)$. Therefore $f(S_n)$ is a martingale. (2017-02-17)
  • Working through a bit of computation, I don't quite see how having $f(S_n)$ be a martingale helps in our goal of finding the distribution of $Y = \max \{S_0, S_1, ...\}$. If I can show that $f(S_n)$ is uniformly integrable, then martingale convergence yields $\mathbb E[f(S_\infty)] = \mathbb E[f(S_0)] = f(0) = 1$, but I don't see how this path helps in finding $Y$. (2017-02-17)
  • Suppose that we stop $S_n$ at the first hitting time of some fixed positive integer $N \geq 0$. Then the stopped martingale will be UI, and it can only approach two possible values as $n \to \infty$. What values are those? (2017-02-17)
  • Okay, I posted an answer which had incorrect notation at first; I edited it, so it should be good now. (2017-02-17)

2 Answers


Define a function $f(n):= \big( \frac{1-p}{p} \big)^n$. Then $f$ is a harmonic function for this random walk. In other words, $f(n) = pf(n+1)+(1-p)f(n-1)$. Therefore $Y_n:=f(S_n)$ is a martingale.

For the rest of the problem, fix an integer $N \geq 0$. We will use the martingale property to compute the probability $P(\sup_n S_n \geq N)$.

Define $T:= \inf\{n \geq 0: S_n = N\}$. Then the stopped martingale $Y^T_n:=Y_{T\wedge n}$ is a bounded martingale (thus uniformly integrable).

Now there are two possibilities: if $S_n < N$ for all $n$ (in which case $S_n \to -\infty$), then $Y_{T\wedge n} \to 0$ as $n \to \infty$; on the other hand, $Y_{T \wedge n} \to \big(\frac{1-p}{p} \big)^N$ if $S_n = N$ for some $n$. Therefore, by the optional stopping theorem, $$1 = E[Y^T_{\infty}] = E\bigg[ \bigg(\frac{1-p}{p} \bigg)^N \cdot 1_{\{T<\infty\}} \bigg] + E[0 \cdot 1_{\{T=\infty\}}] = \bigg(\frac{1-p}{p}\bigg)^N P(T<\infty).$$ Thus $P(\sup_n S_n \geq N) = P(T<\infty) = \big( \frac{1-p}{p} \big)^{-N} = \big( \frac{p}{1-p} \big)^{N}$.
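This closed form is easy to check by simulation. Below is a quick Monte Carlo sketch (the function name, trial count, and the `floor` cutoff are illustrative choices, not part of the argument): since $p < \frac12$ the walk drifts to $-\infty$, so a run that falls far enough below $0$ can safely be counted as never reaching $N$.

```python
import random

def prob_hit_level(N=3, p=0.3, trials=100_000, floor=-40, seed=1):
    """Monte Carlo estimate of P(T < infinity) = P(sup_n S_n >= N).
    Since p < 1/2 the walk drifts to -infinity, so once it falls to
    `floor` the chance of ever climbing back up to N is of order
    (p/(1-p))^(N - floor), which is negligible, and the run is
    counted as a miss."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = 0
        while floor < s < N:
            s += 1 if rng.random() < p else -1
        if s == N:
            hits += 1
    return hits / trials
```

For $p = 0.3$, $N = 3$ the estimate lands close to $(p/(1-p))^N = (3/7)^3 \approx 0.0787$.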

  • Fantastic solution. Clear and concise. Thank you. (2017-02-17)
  • Maybe it's just late and so I can't see why this should be so. Is it obvious why $\mathbb P(\sup_n S_n \geq N) = \mathbb P(T < \infty) \equiv \mathbb P(\inf\{n ~:~ S_n = N\} < \infty)$? (2017-02-17)
  • After some thought I do indeed see that they must be identical. (2017-02-17)
  • Yet another question: in applying the optional stopping theorem we require that our stopping time be almost surely finite, so that $\mathbb P(T < \infty) = 1$. I don't quite see how we get this up to the point where we wish to apply the theorem. In particular, in the case where $S_n \to -\infty$ with $S_n < N$ for all $n$, such an infinite stopping time may haunt us. (2017-02-17)
  • @JonMann For uniformly integrable martingales we can relax the conditions under which the optional stopping theorem holds. In particular, for UI martingales the optional stopping theorem works even if the stopping time is infinite (which is what I have done here). For a proof of this result, visit your favorite book (for instance Revuz and Yor, Rogers and Williams, maybe Karatzas, etc.). (2017-02-17)
  • You can also see the Wikipedia article; I have used version (c) stated there: https://en.wikipedia.org/wiki/Optional_stopping_theorem (2017-02-17)

It is possible to find the distribution of $M := \max_{n\ge 0} S_n$ without using any involved techniques.

Upon reaching a level $x\ge 0$ for the first time, the random walk has two choices

  • reach the level $x+1$ sometime;
  • drift to $-\infty$.

(Thanks to the LLN, it will eventually make the second choice.) By the strong Markov property and homogeneity, these choices are independent for $x=0,1,\dots$, and the probability of the first choice is the same at every level, equal to $$ p' = P(\text{reaching $1$ before drifting to $-\infty$, starting from $0$}).$$ Therefore, $M$ is one less than a geometrically distributed random variable with parameter $1-p'$, i.e. $$ P(M\ge n) = (p')^n,\quad n\ge0, $$ so it remains to identify $p'$. The simplest way is a recursion, an immediate consequence of the total probability formula after conditioning on the first step (from $-1$ the walk must climb two levels, each succeeding with probability $p'$): $$ p' = p + (1-p)(p')^2. $$ One of the solutions of this quadratic equation is $1$ (which, as already mentioned, is ruled out by the LLN), so by Vieta's formulas it is easy to find that $$ p' = \frac{p}{1-p}, $$ which confirms @Shalop's answer.
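Both pieces of this argument, the first-step recursion for $p'$ and its agreement with the martingale answer, can be checked exactly in rational arithmetic. A short Python sketch (the function name is illustrative):

```python
from fractions import Fraction

def check_hitting_prob(p):
    """Check exactly (in rational arithmetic) that p' = p/(1-p)
    solves the first-step recursion p' = p + (1-p) p'^2, and that
    the geometric tail (p')^N agrees with the martingale answer
    ((1-p)/p)^(-N)."""
    pp = p / (1 - p)
    assert pp == p + (1 - p) * pp ** 2          # the recursion holds
    assert all(pp ** N == ((1 - p) / p) ** -N   # tails agree
               for N in range(10))
    return pp
```

For instance, `check_hitting_prob(Fraction(3, 10))` returns $p' = 3/7$, matching $(p/(1-p))$ at $p = 3/10$.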

  • (+1) I like this solution very much. To summarize it briefly: the Markov property tells us that $P(M\geq m+n \mid M\geq m) = P(M \geq n)$, which implies that $M$ must be geometric with some parameter $\alpha$. Then $\alpha = P(\text{the walk hits $1$ before $-\infty$})$. So by conditioning on the first step, we see that $\alpha = p + (1-p)\alpha^2$, which we can solve. Very concise indeed! (2017-02-17)