
This is exercise 1.7.4 in Norris' Markov Chains textbook. I'm having difficulty calculating a simple-looking expectation.

Let $(X_n)_{n\geq0}$ be a simple random walk on $\mathbb{Z}$ with transition probabilities $p_{i,i-1}=q<p=p_{i,i+1}$, where $p+q=1$ and $q>0$. Let $T_0=\inf\{n\geq1:X_n=0\}$ be the first return time to $0$, and let $\gamma^0_i=\mathbb{E}_0(\sum_{n=0}^{T_0-1}1_{\{X_n=i\}})$, that is, the expected time spent at $i$ between visits to $0$. Find $\gamma^0_i$.

I've tried conditioning on $T_0$, but it led to a sum of probabilities that I found tough to evaluate. I've also tried to analyse it as a random walk on $\mathbb{Z}_{\geq0}$, hoping to make use of the hitting probabilities, but got nowhere. Any hints?

(There is a second part to this question, using textbook results, which suggests that $\gamma^0_i=(p/q)^i$ for $i \leq 0$ and $\gamma^0_i=1$ for $i \geq 0$, if I calculated it correctly.)
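That conjectured formula can at least be sanity-checked by simulation before solving anything exactly. A minimal Monte Carlo sketch (the function name, sample sizes, and the `max_steps` truncation are my own choices, not from the exercise):

```python
import random

def estimate_gamma(p, i, n_walks=20_000, max_steps=400, seed=1):
    """Monte Carlo estimate of gamma^0_i for the walk with P(step = +1) = p.

    Counts visits to i over n = 0, ..., T_0 - 1, where T_0 is the first
    return time to 0.  If the walk has not returned within max_steps steps,
    the truncated path is used; for p > 1/2 and fixed small |i| this bias is
    negligible, since the walk drifts to +infinity and stops visiting i.
    """
    rng = random.Random(seed)
    total = 0
    for _ in range(n_walks):
        x = 0
        visits = 1 if i == 0 else 0   # the n = 0 term contributes when i = 0
        for _ in range(max_steps):
            x += 1 if rng.random() < p else -1
            if x == 0:                # n = T_0: stop before counting it
                break
            if x == i:
                visits += 1
        total += visits
    return total / n_walks
```

For $p=0.7$ the estimates come out close to $1$ for $i\geq0$ and close to $(p/q)^i=(3/7)^{|i|}$ for $i<0$, consistent with the conjecture.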

  • Does $\mathbb E_0$ denote the expected value conditioned on return to $0$? (2012-12-19)
  • It denotes the expected value conditional on $X_0=0$. (2012-12-19)
  • So $T_0$ can be infinite? It would be slightly unusual to subtract $1$ from infinity :-) (2012-12-19)
  • Yes, it can be infinite. The recurrent case is covered by a theorem in the book; maybe I'll try to adapt the proof to get a better idea of what's going on. (2012-12-19)
  • OK, a better formulation would be $\gamma^0_i=\mathbb{E}_0(\sum_{n=0}^{\infty}1_{\{X_n=i\text{ and }n+1\leq T_0\}})$. (2012-12-19)
  • OK, looking at the proof, it turns out the relation $\gamma^0_j=\sum_k\gamma^0_k p_{k,j}$ can still hold for $j \neq 0$, if I read it correctly. This in turn gives a recurrence relation $\gamma^0_i=q\gamma^0_{i+1}+p\gamma^0_{i-1}$ that must hold for all $i \neq 0$. So the recurrence decouples on the two half-lines, which explains why $\gamma^0_i$ plateaus for $i \geq 0$. (2012-12-20)
