
I have a question on irreducible Markov chains that has been bugging me for a few hours. Consider the Markov chain defined by:
$P(i, i-1) = 1 - P(i, i+1) = \frac{1}{2(i+1)}$ for $i \ge 1$, and $P(0,1) = 1$.

Now this chain is irreducible, and I'm asked to prove that, starting from state $i$, we almost surely hit state $a$ (with $a > i$) in a finite amount of time. I think it can be proven by noting that from every state between $0$ and $a$ we take a $+1$ step with probability $p > \frac{a+2}{2(a+1)} > \frac{1}{2}$, so I can compare the Markov chain to a random walk with constant up-step probability $\frac{a+2}{2(a+1)}$, which tends to $+\infty$ almost surely.
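In fact, with the transition probabilities as defined above, the lower bound on the up-step probability can be made uniform in $i$: for every $i \ge 1$,

$$P(i, i+1) = 1 - \frac{1}{2(i+1)} = \frac{2i+1}{2(i+1)} \ge \frac{3}{4} > \frac{1}{2},$$

and $P(0,1) = 1$, so (if I'm not mistaken) the chain below $a$ can also be compared to a random walk with constant up-step probability $\frac{3}{4}$, which has positive drift.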

Is my reasoning correct? I feel that either there is a much more direct method, or that what I'm trying to prove is true in general as long as the chain is irreducible (regardless of the transition probabilities).

Thanks!

  • What do you mean by *in a finite amount of time*? Almost surely? Then indeed the specifics of the transition probabilities are irrelevant. (2012-12-16)
  • @did: I mean that almost surely we hit state $a$ starting from state $i$. (2012-12-16)
  • Then the chain starting from `i`, before it hits `a > i`, is irreducible on a finite state space, hence it visits each state almost surely, including state `a`. This proves that one hits `a` almost surely, irrespective of the details of the transition probabilities. (2012-12-16)
  • @did: Thanks, but I'm not entirely sure. Indeed, it is not possible that we almost surely never hit state $a$. But does that prove that we almost surely hit state $a$? What if the probability of hitting state $a$ starting from $i$ were something $< 1$? (A small simulation addressing this is sketched below the comments.) (2012-12-16)
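
To check the last comment empirically, here is a minimal simulation sketch of the chain as defined above (the function name `hits_a`, the cap `max_steps`, and the run counts are illustrative choices, not from the thread); every run should reach $a$ well within the cap, consistent with the claim that $a$ is hit almost surely:

```python
import random

def hits_a(i, a, max_steps=10**6):
    """Simulate the chain from state i until it reaches state a.
    Transitions: from 0, go to 1 with probability 1; from j >= 1,
    go to j-1 with probability 1/(2(j+1)) and to j+1 otherwise.
    Returns the number of steps taken, or None if a was not reached
    within max_steps (the cap is only a practical safeguard)."""
    state, steps = i, 0
    while state != a and steps < max_steps:
        if state == 0:
            state = 1
        elif random.random() < 1.0 / (2 * (state + 1)):
            state -= 1
        else:
            state += 1
        steps += 1
    return steps if state == a else None

# Example: 10,000 runs starting from i = 2 with target a = 10.
times = [hits_a(2, 10) for _ in range(10_000)]
reached = [t for t in times if t is not None]
print("runs that failed to reach a:", len(times) - len(reached))
print("average hitting time:", sum(reached) / len(reached))
```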

0 Answers