
In a Markov chain $(X_t)$, is it possible to find an initial distribution for $X_0$ such that the distribution of $X_t$ does not converge in some sense (e.g., in total variation) as $t \to \infty$?

The Markov chain can be continuous- or discrete-time, finite- or infinite-state, homogeneous or inhomogeneous, .... The simpler the example, the better, e.g., a discrete-time, finite-state, homogeneous chain.

Thanks and regards!

1 Answer


Suppose the finitely many states of a Markov chain are partitioned into "even" states and "odd" states, and whenever you're in an "even" state you always go from there to an "odd" state and vice versa. If the initial distribution puts total probability $1$ on the even states, then at even times all the mass sits on the even states and at odd times all the mass sits on the odd states, so the total-variation distance between the distributions at consecutive times stays at $1$ and no limiting distribution exists.
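A minimal numerical sketch of this answer: a two-state chain where the transition matrix simply swaps the two states (state $0$ plays the role of the "even" states and state $1$ the "odd" states). Starting from mass $1$ on state $0$, the distribution of $X_t$ oscillates forever. The names here (`P`, `mu`) are just illustrative.

```python
import numpy as np

# Deterministic period-2 chain: from state 0 always go to 1, and vice versa.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Initial distribution concentrated on the "even" state 0.
mu = np.array([1.0, 0.0])

# The distribution of X_t is mu @ P^t; record the first few.
dists = []
for t in range(6):
    dists.append(mu.copy())
    mu = mu @ P

for t, d in enumerate(dists):
    print(t, d)
```

The printed distributions alternate between $(1,0)$ and $(0,1)$, so the total-variation distance between consecutive distributions is always $1$. By contrast, replacing `P` with any aperiodic irreducible matrix (e.g. `[[0.1, 0.9], [0.9, 0.1]]`) makes the same loop converge to the stationary distribution.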

I was wondering what condition guarantees that every initial distribution has a limiting distribution. Note that the limiting distributions for different initial distributions may differ. (2012-12-10)