
In a discrete time Markov chain, consider an irreducible/communicating class,

  1. Are the probabilities of ever transitioning between any two states within the class the same?

    If the class is recurrent, the probability of each state ever returning to itself is 1, so these probabilities are all the same. Here I wonder about the general case: transitions between any two distinct states, and/or a class that may not be recurrent.

  2. Are the expected transition times between any two states within the class the same?

    If the class is transient or null recurrent, the expected return time of each state to itself is always $\infty$, so these expected times are all the same. Here I wonder about the general case: transitions between any two distinct states, and/or a class that may not be transient or null recurrent.

If the answers are no, are there other cases where the probabilities/expected times are the same for all transitions between any two states in an irreducible class? (For concreteness, a small numerical sketch of both quantities on a finite chain follows below.)
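Here is a minimal sketch computing both quantities for a small finite chain by first-step analysis (the 3-state transition matrix is just an arbitrary illustration):

```python
# First-step analysis for a small irreducible chain.
# The transition matrix P is an arbitrary 3-state illustration.
import numpy as np

P = np.array([[0.0, 0.7, 0.3],
              [0.4, 0.0, 0.6],
              [0.5, 0.5, 0.0]])
n = P.shape[0]

for j in range(n):
    others = [i for i in range(n) if i != j]
    Q = P[np.ix_(others, others)]   # transitions that avoid j
    b = P[others, j]                # one-step jumps into j
    # h_i = P_i(ever reach j) solves (I - Q) h = b
    h = np.linalg.solve(np.eye(n - 1) - Q, b)
    # m_i = E_i[time to reach j] solves (I - Q) m = 1
    m = np.linalg.solve(np.eye(n - 1) - Q, np.ones(n - 1))
    for k, i in enumerate(others):
        print(f"{i} -> {j}: P(ever reach) = {h[k]:.3f}, E[time] = {m[k]:.3f}")
```

For a finite irreducible chain the hitting probabilities all come out as $1$, while the expected hitting times generally differ from pair to pair.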

Thanks and regards!

1 Answer


You know, examples and counterexamples can be great... So, you might want to check this conjecture, and future ones, on your favorite collection of basic cases. Here, two of the simplest examples are enough to demolish (or at least, to severely restrict the scope of) your questions.

In (1), you think transient, so consider a nearest neighbour random walk on the integer line with probabilities $p$ for $+1$ jumps and $1-p$ for $-1$ jumps, with $p\ne1/2$ (and if you think null recurrent, take $p=1/2$).
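For instance, a rough Monte Carlo check (my own sketch; the step cap truncates each run, so the estimates slightly underestimate the true hitting probabilities) with $p=0.7$ recovers the gambler's-ruin values: from $0$ the walk reaches $+1$ with probability $1$ but reaches $-1$ only with probability $(1-p)/p\approx0.43$.

```python
# Rough Monte Carlo sketch for the biased walk on the integers (p = 0.7).
# max_steps truncates each run, so the printed values slightly
# underestimate the true hitting probabilities.
import random

p, n_runs, max_steps = 0.7, 10_000, 1_000

def prob_ever_hits(target):
    hits = 0
    for _ in range(n_runs):
        x = 0
        for _ in range(max_steps):
            x += 1 if random.random() < p else -1
            if x == target:
                hits += 1
                break
    return hits / n_runs

# Gambler's-ruin values: P(ever reach +1 from 0) = 1,
# P(ever reach -1 from 0) = (1 - p) / p ≈ 0.4286.
print("estimated P(ever reach +1 from 0):", prob_ever_hits(+1))
print("estimated P(ever reach -1 from 0):", prob_ever_hits(-1))
```

So the probability of ever reaching $j$ from $i$ depends on the pair, here on whether $j$ lies in the direction of the drift.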

In (2), you think positive recurrent, so consider a nearest neighbour random walk on a discrete circle with probabilities $p$ for clockwise jumps and $1-p$ for anticlockwise jumps.
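A small sketch of this case (the circle size $N=6$ and $p=0.7$ are arbitrary choices of mine), solving the first-step equations for the expected hitting times between the two neighbours $0$ and $1$:

```python
# Expected hitting times on a circle of N states, clockwise probability p,
# obtained by solving the first-step-analysis linear system.
import numpy as np

N, p = 6, 0.7
P = np.zeros((N, N))
for i in range(N):
    P[i, (i + 1) % N] = p        # clockwise jump
    P[i, (i - 1) % N] = 1 - p    # anticlockwise jump

def expected_hitting_time(P, i, j):
    others = [k for k in range(len(P)) if k != j]
    Q = P[np.ix_(others, others)]
    m = np.linalg.solve(np.eye(len(others)) - Q, np.ones(len(others)))
    return m[others.index(i)]

print("E_0[time to reach 1] =", expected_hitting_time(P, 0, 1))
print("E_1[time to reach 0] =", expected_hitting_time(P, 1, 0))
```

With $p\ne1/2$ the two numbers differ (reaching a neighbour against the drift takes longer on average), even though the chain is positive recurrent.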

If periodicity is at stake, replace everywhere $1-p$ by $1-p-q$ and add jumps from any vertex to itself with probability $q$.

Finally, if you think positive recurrent but on an infinite state space, nearest neighbour random walks on a discrete half-line with a drift towards the origin are your friends.
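A short sketch for this last case (the convention of a hold at $0$ with probability $1-p$ and the value $p=0.3$ are my assumptions): detailed balance gives the geometric stationary distribution $\pi(j)=(1-r)r^j$ with $r=p/(1-p)$, so by Kac's formula the expected return times $E_j[T_j^+]=1/\pi(j)$ already differ from state to state.

```python
# Half-line walk on {0, 1, 2, ...}: up with probability p, down with 1 - p,
# and (by assumption) a hold at 0 with probability 1 - p.  Detailed balance
# gives pi(j) = (1 - r) * r**j with r = p / (1 - p), and Kac's formula says
# E_j[return time to j] = 1 / pi(j).
p = 0.3
r = p / (1 - p)
for j in range(4):
    pi_j = (1 - r) * r ** j
    print(f"E_{j}[return time to {j}] = {1 / pi_j:.2f}")
```

The expected hitting times between distinct states behave similarly: going against the drift is much slower than going with it, which can be checked with the same first-step computation as above on a truncated version of the chain.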