
Consider a random walk $X_j$ on $\mathbf{Z}$ that starts at $X_0 = k \in \{1, 2, \dots, N-1\}$. Let $T$ be the stopping time $T = \min \{j \mid X_j \in \{0,N\}\}$. If $p = \mathrm{prob}(X_{j+1}>X_j)$ is constant, then a closed form for the expected hitting time $\mathcal{E} T$ is well known (gambler's ruin).
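For concreteness, here is a minimal sketch of that classical closed form together with a Monte Carlo sanity check (the function names are mine, not standard; the formula assumes $p\ne\frac12$, with the symmetric case $k(N-k)$ handled separately):

```python
import random

def ruin_expected_duration(k, N, p):
    """Expected time for a walk started at k, with constant up-probability p,
    to hit {0, N}: the classical gambler's-ruin duration formula."""
    if p == 0.5:
        return k * (N - k)
    r = (1 - p) / p
    return k / (1 - 2 * p) - (N / (1 - 2 * p)) * (1 - r ** k) / (1 - r ** N)

def simulate(k, N, p, trials=100_000):
    """Monte Carlo estimate of the same expectation, as a sanity check."""
    total = 0
    for _ in range(trials):
        x, steps = k, 0
        while 0 < x < N:
            x += 1 if random.random() < p else -1
            steps += 1
        total += steps
    return total / trials

print(ruin_expected_duration(3, 10, 0.4))  # exact value
print(simulate(3, 10, 0.4))                # should be close
```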

What can one say about $T$ when $p_j = \mathrm{prob}(X_{j+1} > X_j)$ is not constant? For starters, I am interested in the case where $p_j = p$ for $j < K$ and $p_j = q$ for $j \ge K$, with $K$ fixed and not too large.

Edit for clarification: Note that $p_j$ is not assumed to depend on the position $X_j$ of the particle, only on the time $j$ since the particle began its walk.

2 Answers

Answer 1

Let $S=\{1,2,\ldots,N-1\}$. For every $k$ in $S$ and every $j\geqslant0$, let $t^j_k$ denote the mean number of steps before hitting $\{0,N\}$ for the random walk starting from $k$ at time $j$. Let $t_j=(t_k^j)_{k\in S}$.

Since the random walk, when at site $k$ at time $j$, can only move to $k+1$ or $k-1$, with probabilities $p_j$ and $1-p_j$ respectively, $t_j$ and $t_{j+1}$ solve the linear system $$t_k^j=1+p_jt^{j+1}_{k+1}+(1-p_j)t^{j+1}_{k-1},\qquad k\in S,$$ with the convention that $t_0^j=t_N^j=0$ for every $j\geqslant0$.

If $p_j=q$ for every $j\geqslant K$, then $t_j=t_{j+1}$ for every $j\geqslant K$, hence $t_K$ solves the linear system $$t_k^K=1+qt^{K}_{k+1}+(1-q)t^{K}_{k-1},\qquad k\in S.$$ If $p_j=p$ for every $j\lt K$, then, once $t_K$ is known, for each $j\lt K$, $t_j$ is determined by $t_{j+1}$ through the linear system $$t_k^j=1+pt^{j+1}_{k+1}+(1-p)t^{j+1}_{k-1},\qquad k\in S.$$ Iterating this recursion $K$ times starting from $t_K$ finally yields $t_0$, hence every $t_k^0$.
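A minimal numerical sketch of this backward recursion, for the two-phase case $p_j=p$ ($j<K$) and $p_j=q$ ($j\geqslant K$); all the names here are mine:

```python
import numpy as np

def expected_hitting_times(N, K, p, q):
    """Return t_0 = (t_k^0) for k = 1..N-1: mean time to hit {0, N}
    when the up-probability is p before time K and q from time K on."""
    S = N - 1  # number of interior states 1..N-1
    # Solve the time-homogeneous system for t_K:
    #   t_k = 1 + q t_{k+1} + (1-q) t_{k-1},  with t_0 = t_N = 0.
    A = np.eye(S)
    for i in range(S):          # row i corresponds to state k = i + 1
        if i + 1 < S:
            A[i, i + 1] -= q
        if i - 1 >= 0:
            A[i, i - 1] -= 1 - q
    t = np.linalg.solve(A, np.ones(S))
    # Iterate the one-step recursion backwards from t_K down to t_0:
    #   t_k^j = 1 + p t_{k+1}^{j+1} + (1-p) t_{k-1}^{j+1}.
    for _ in range(K):
        padded = np.concatenate(([0.0], t, [0.0]))  # boundary values t_0 = t_N = 0
        t = 1 + p * padded[2:] + (1 - p) * padded[:-2]
    return t

print(expected_hitting_times(N=10, K=5, p=0.3, q=0.6))
```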

Edit: Using the shorthand $r(q)=(1-q)/q$, when $q\ne\frac12$, one can write the solution $t_K$ in closed form as $$t_k^K=\frac{k}{1-2q}-\frac{N}{1-2q}\cdot\frac{1-r(q)^k}{1-r(q)^N}.$$
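For what it's worth, here is that formula in code, together with its $q=\frac12$ limit $t_k^K=k(N-k)$; taking $K=0$ in the previous sketch puts the walk in the $q$-regime from the start, so the two computations should agree:

```python
def t_K_closed_form(k, N, q):
    """Closed form for t_k^K; the q = 1/2 limit is k * (N - k)."""
    if q == 0.5:
        return k * (N - k)
    r = (1 - q) / q
    return k / (1 - 2 * q) - (N / (1 - 2 * q)) * (1 - r ** k) / (1 - r ** N)

# With K = 0 this should reproduce expected_hitting_times(N=10, K=0, p=0.3, q=0.6)
# from the sketch above:
print([round(t_K_closed_form(k, 10, 0.6), 3) for k in range(1, 10)])
```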

Answer 2

Well, you can compute the distribution of $X_1$: it puts mass $p$ on $i=k+1$ and $1-p$ on $i=k-1$; equivalently, it is the result of multiplying the row vector $e_k^T$ by the transition matrix $P$. Continue this process, and whenever the walk reaches $0$ or $N$ with probability $r$ at time $t$, add $rt$ to the running expectation and then set the first and last coordinates of your vector $v$ to $0$ (the entries will no longer sum to $1$, but that does not matter). After $K$ steps you obtain a vector $v$ with $v_0=v_N=0$. From then on it is just gambler's ruin started from the various points $0 \lt i \lt N$. So the result is (a) the contribution accumulated up to time $K$, plus (b) the gambler's-ruin expected times, each increased by $K$ and weighted by the corresponding probability $v_i$.
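A sketch of this propagation scheme in the same two-phase setting (the structure and names are mine, not the answerer's); it should agree with the backward recursion from the first answer:

```python
import numpy as np

def ruin_time(i, N, q):
    """Gambler's-ruin expected duration from i with constant up-probability q."""
    if q == 0.5:
        return i * (N - i)
    r = (1 - q) / q
    return i / (1 - 2 * q) - (N / (1 - 2 * q)) * (1 - r ** i) / (1 - r ** N)

def expected_time_by_propagation(N, K, p, q, k):
    """Expected hitting time of {0, N} from k: push the distribution vector
    through K steps under p, collecting absorbed mass, then finish each
    surviving state with the gambler's-ruin expectation under q."""
    v = np.zeros(N + 1)
    v[k] = 1.0
    total = 0.0
    for t in range(1, K + 1):
        w = np.zeros(N + 1)
        w[2:] += p * v[1:N]            # up-moves from interior states 1..N-1
        w[:N - 1] += (1 - p) * v[1:N]  # down-moves
        total += t * (w[0] + w[N])     # mass absorbed exactly at time t
        w[0] = w[N] = 0.0              # discard absorbed mass (sum of w < 1 now)
        v = w
    for i in range(1, N):              # survivors restart as gambler's ruin
        total += v[i] * (K + ruin_time(i, N, q))
    return total

print(expected_time_by_propagation(N=10, K=5, p=0.3, q=0.6, k=5))
```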