
I am wondering if anyone could prove the equivalence of the following two definitions of a recurrent (persistent) state for Markov chains:

1) $P(X_n = i \text{ for some } n \ge 1 \mid X_0 = i) = 1$

2) Let $T_{ij} = \min\{n \ge 1 : X_n = j\}$ given $X_0 = i$. A state $i$ is recurrent if for every $j$ such that $P(T_{ij} < \infty) > 0$, one has $P(T_{ij} < \infty) = 1$.
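As a numerical sanity check (not a proof), the two definitions can be illustrated by simulation. The sketch below uses a hypothetical $3$-state transition matrix `P` of my own choosing; since the chain is finite and irreducible, every state is recurrent, so the estimated return probability $P(T_{ii} < \infty \mid X_0 = i)$ should come out (close to) 1.

```python
import random

# Hypothetical example chain (not from the question): finite, irreducible,
# hence every state is recurrent and the return probability is 1.
P = [
    [0.5, 0.5, 0.0],
    [0.2, 0.3, 0.5],
    [0.4, 0.0, 0.6],
]

def step(state, rng):
    """Draw the next state from the distribution in row P[state]."""
    u = rng.random()
    acc = 0.0
    for j, p in enumerate(P[state]):
        acc += p
        if u < acc:
            return j
    return len(P) - 1  # guard against floating-point rounding

def returns(i, rng, horizon=1000):
    """True if the chain started at i revisits i within `horizon` steps,
    i.e. a Monte Carlo proxy for the event {T_ii < infinity}."""
    s = step(i, rng)
    for _ in range(horizon):
        if s == i:
            return True
        s = step(s, rng)
    return False

rng = random.Random(0)
trials = 2000
est = sum(returns(0, rng) for _ in range(trials)) / trials
print(est)  # expected to be close to 1.0 for a recurrent state
```

The `horizon` cutoff is an assumption of the sketch: it approximates the event $\{T_{ii} < \infty\}$ by $\{T_{ii} \le 1000\}$, which is harmless here because the expected return time for this small chain is only a few steps.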

  • Is this homework? (2011-01-19)
  • Do you want the equivalence of 1 and 2 proven, or do you want them proven from some other definition? (2011-01-19)
  • This is not homework! If I ask homework, I will specifically say so, guys! (2011-01-19)
  • @Calle: either way is fine. Do you have some thoughts? Please... (2011-01-19)
  • 1) is obvious; I mean it stems from the definition (the probability that we visit every state repeatedly is 1). 2) is also obvious for the same reason(s). I guess what I mean to say is: what exactly do you mean by a proof? Are you looking for rigor? (2011-01-19)
  • @picakhu: Yes, I am looking for rigor. Could you please formalize your argument? Thanks a lot! (2011-01-19)
  • Did you get something out of one of the answers below? (2011-04-07)
  • Bis repetita: Did you get something out of one of the answers below? (2011-07-31)

3 Answers