
I am wondering if anyone could prove the following equivalent definition of recurrent/persistent state for Markov chains:

1) $P(X_n=i \text{ for some } n\ge1 \mid X_0=i)=1$

2) Let $T_{ij}=\min\{n\ge1: X_n=j \mid X_0=i\}$. A state $i$ is recurrent if for every $j$ such that $P(T_{ij}<\infty)>0$, one has $P(T_{ij}<\infty)=1$.


3 Answers


See this page.

Edit: Since incorrect statements continue to pop up on this page, let me explain once again the situation.

First, two notations (standard in the field): for every state $i$, $P_i$ denotes the distribution of the Markov chain $(X_n)_{n\ge0}$ conditionally on the event $[X_0=i]$ and $T_i=\inf\{n\ge1;X_n=i\}$ denotes the first hitting time of $i$. Next, a tautology: the events $[\exists n\ge1,X_n=i]$ and $[T_i<+\infty]$ are equal.

Second, a statement of the question the OP is interested in. One considers two properties which may or may not be satisfied by a given Markov chain $(X_n)_{n\ge0}$ and a given state $i$:

(1) $P_i(\exists n\ge1,X_n=i)=1$.

(2) For every state $j$, either $P_i(T_j<+\infty)=0$ or $P_i(T_j<+\infty)=1$.

Property (1) is often taken as a definition of the fact that $i$ is a recurrent state of the Markov chain $(X_n)_{n\ge0}$. Property (2) is more unusual.

Now, an answer to the question.

Property (1) implies property (2). Proof: see Byron's post. Note that if (1) holds, both $P_i(T_j<+\infty)=0$ and $P_i(T_j<+\infty)=1$ can occur for different states $i$ and $j$ of the same Markov chain. Example: state space $\{0,1,2\}$, transitions $0\to1$, $0\to2$, $1\to1$, $1\to2$, $2\to1$ and $2\to2$ all with positive probabilities, the other transitions with probability zero. Then $P_i(T_0<+\infty)=0$ and $P_i(T_j<+\infty)=1$ for every $i$ and every $j\ne0$.
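The behavior of this three-state example can be checked numerically. In the sketch below the answer only says the listed transitions have "positive probabilities", so the 0.5 weights are an arbitrary illustrative choice; any positive values give the same qualitative conclusion.

```python
import random

# Illustrative weights (an assumption: the answer only requires them positive)
# for the example chain on {0, 1, 2}.
P = {
    0: {1: 0.5, 2: 0.5},
    1: {1: 0.5, 2: 0.5},
    2: {1: 0.5, 2: 0.5},
}

def step(state):
    targets = list(P[state])
    weights = [P[state][t] for t in targets]
    return random.choices(targets, weights=weights)[0]

def hit_prob(i, j, trials=2000, horizon=200):
    """Monte Carlo estimate of P_i(T_j <= horizon), T_j = inf{n >= 1 : X_n = j}."""
    hits = 0
    for _ in range(trials):
        x = i
        for _ in range(horizon):
            x = step(x)
            if x == j:
                hits += 1
                break
    return hits / trials

print(hit_prob(1, 0))  # exactly 0.0: no transition ever enters state 0
print(hit_prob(2, 1))  # close to 1.0
```

Since no state has a transition into $0$, the estimate of $P_1(T_0<\infty)$ is exactly zero, while the chance of avoiding state $1$ for 200 steps from state $2$ is $0.5^{200}$, i.e. negligible.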

Property (2) does not imply property (1). Proof by an example: state space $\mathbb{Z}$, transitions $i\to i+1$ with probability $1$. Then $P_i(T_j<+\infty)=0$ if $j\le i$, $P_i(T_j<+\infty)=1$ if $j\ge i+1$, but $P_i(\exists n\ge1,X_n=i)=0$ for every $i$.
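The deterministic walk on $\mathbb{Z}$ is easy to simulate directly; the sketch below (with a hypothetical `horizon` cutoff, since the chain lives forever) shows the trichotomy claimed above.

```python
def first_passage(i, j, horizon=1000):
    """Deterministic chain on the integers: i -> i+1 with probability 1.
    Returns T_ij if j is reached within `horizon` steps, else None
    (and if j <= i it is never reached at all)."""
    x = i
    for n in range(1, horizon + 1):
        x += 1
        if x == j:
            return n
    return None

print(first_passage(0, 5))    # 5: states j >= i+1 are hit, deterministically
print(first_passage(0, -3))   # None: states j <= i are never hit
print(first_passage(0, 0))    # None: in particular the chain never returns to i
```

So every hitting probability is $0$ or $1$ (property (2)), yet no state is recurrent (property (1) fails everywhere).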

Finally (and once again), a congenial reference for this is the first chapter of the book Markov chains by James Norris, which is freely available here.


A recurrent state will satisfy property (2), but not the other way around. Take a two state Markov chain that jumps from $i$ to $j$ with probability one, then stays there forever. Then $i$ is transient, but $P(T_{ij}<\infty)=1$.
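A minimal sketch of this two-state counterexample, with the states labeled `'i'` and `'j'` as in the text:

```python
def simulate(start, steps=20):
    """Two-state chain from the example: 'i' jumps to 'j' with probability 1,
    and 'j' is absorbing."""
    x, path = start, []
    for _ in range(steps):
        x = 'j'  # every transition lands in j, whatever the current state
        path.append(x)
    return path

path = simulate('i')
print(path[0] == 'j')   # True: T_ij = 1, so P(T_ij < infinity) = 1
print('i' in path)      # False: the chain never returns to i, so i is transient
```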

Edit: Here's a sketch of (1) implies (2) in terms of "taboo probabilities". We assume that $i$ is recurrent, and all of the probabilities below are conditioned on $X_0=i$.

Define $T_i=\inf(n\geq 1 : X_n=i)$ and $T_j=\inf(n\geq 1 : X_n=j)$. We want to show that $P(T_j<\infty)>0$ implies $P(T_j<\infty)=1$. It suffices to assume $i\not= j$, since $P(T_i<\infty)=1$.

Since $i$ is recurrent, we have $P(T_i < \infty)=1$ so that $P(T_j < T_i)+P(T_i < T_j)=1,$ since $T_i=T_j$ implies $T_i=\infty$.

Also, if $P(T_j < \infty)>0$ we must have $P(T_j < T_i)>0$, by considering the shortest possible path from $i$ to $j$ with non-zero probability.

Finally, letting $n$ represent the number of times we return to $i$ without hitting $j$, the strong Markov property gives

$P(T_j < \infty)=\sum_{n=0}^\infty P(T_i < T_j)^n P(T_j < T_i) =\sum_{n=0}^\infty P(T_i < T_j)^n [1-P(T_i < T_j)]=1.$
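The final step is just the geometric series summing to $1$. A quick numeric sanity check, with `p` standing in for $P(T_i < T_j)$ (any value in $[0,1)$):

```python
def geometric_total(p, terms=10_000):
    """Partial sum of sum_{n>=0} p^n (1 - p); should be 1 for 0 <= p < 1."""
    return sum(p**n * (1 - p) for n in range(terms))

for p in (0.0, 0.3, 0.9):
    print(round(geometric_total(p), 6))  # 1.0 in each case
```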

  • for your first disproof, I think here $T_{ij}$ can be 0. Thus, in your example, $P(T_{ii} < \infty)>0$ but $P(T_{ii} < \infty) \ne 1$; therefore, $i$ is not recurrent. Can you think of other examples to disprove it, or can you actually prove that 2) also implies 1)? Thanks a lot for your answer. (2011-01-25)

I think the first property is wrong. That would be a strictly periodic state. Perhaps you missed something?

  • No, that's not the same. Suppose the property is true for a particular $n$ (say, for $n=5$); then it would be true that $P(X_5=i \mid X_0=i)=1$. That would imply we are certain the state $i$ will be revisited in exactly 5 steps. Hence, it would be periodic. (2011-01-25)