I am wondering if anyone could prove that the following two definitions of a recurrent/persistent state of a Markov chain are equivalent:
1) $P(X_n=i \text{ for some } n\ge 1 \mid X_0=i)=1$
2) Let $T_{ij}=\min\{n\ge 1: X_n=j\}$ given $X_0=i$ (the first passage time from $i$ to $j$). A state $i$ is recurrent if for every $j$ such that $P(T_{ij}<\infty)>0$, one has $P(T_{ij}<\infty)=1$.
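
To make the two definitions concrete, here is a small Monte Carlo sketch. The 3-state transition matrix `P`, the helper `estimate_hit_prob`, and the finite horizon are my own illustrative assumptions, not part of the question; truncating at a finite horizon only approximates the event $\{T_{ij}<\infty\}$. In this example state 2 is absorbing, so state 0 is transient: its return probability (definition 1) comes out below 1, and definition 2 fails because state 0 itself is accessible from 0 yet is hit again with probability less than 1.

```python
import numpy as np

# Hypothetical 3-state chain used only as an illustration (not from the question):
# states 0 and 1 communicate, state 2 is absorbing, so 0 and 1 are transient.
P = np.array([
    [0.5, 0.4, 0.1],
    [0.3, 0.6, 0.1],
    [0.0, 0.0, 1.0],
])

rng = np.random.default_rng(0)

def estimate_hit_prob(start, target, n_paths=5_000, horizon=500):
    """Monte Carlo estimate of P(T_{start,target} <= horizon), where
    T_{start,target} = min{n >= 1 : X_n = target} given X_0 = start.
    The finite horizon is only an approximation of {T < infinity}."""
    hits = 0
    for _ in range(n_paths):
        state = start
        for _ in range(horizon):
            state = rng.choice(3, p=P[state])  # one step of the chain
            if state == target:
                hits += 1
                break
    return hits / n_paths

# Definition 1 for state 0: probability of ever returning to 0 after time 0.
print("P(return to 0) ~", estimate_hit_prob(0, 0))   # noticeably below 1, so 0 is transient

# Definition 2 for state 0: check every state j accessible from 0.
for j in range(3):
    print(f"P(T_0{j} < inf) ~", estimate_hit_prob(0, j))
```

With this chain the estimates show $P(T_{02}<\infty)\approx 1$ but $P(T_{00}<\infty)<1$ and $P(T_{01}<\infty)<1$, so both definitions classify state 0 as non-recurrent, which is the kind of agreement the equivalence is asserting.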