In a Markov chain $(X_t)$, is it possible to find an initial distribution for $X_0$ such that the distribution of $X_t$ does not converge in some sense (e.g., in total variation) as $t \to \infty$?
The MC can be continuous- or discrete-time, finite- or infinite-state, homogeneous or inhomogeneous, etc. But the simpler the example, the better: ideally discrete-time, finite-state, and homogeneous.
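For concreteness, here is the kind of behaviour I have in mind (a sketch, using the deterministic two-state chain with period 2 as a candidate — I'm not sure this is the intended counterexample):

```python
import numpy as np

# Candidate: deterministic two-state chain that swaps states each step.
# Transition matrix P[i, j] = probability of moving from state i to state j.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Initial distribution: X_0 = state 0 with probability 1.
dist = np.array([1.0, 0.0])
history = [dist.copy()]
for _ in range(6):
    dist = dist @ P  # one step of the chain: row vector times P
    history.append(dist.copy())

# The distribution of X_t alternates between [1, 0] and [0, 1] forever,
# so the sequence has no limit in total variation.
for t, d in enumerate(history):
    print(t, d)
```

The total variation distance between the distributions at consecutive times stays equal to $1$, so no initial point mass on a single state converges here; only the stationary $(1/2, 1/2)$ start is time-invariant.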
Thanks and regards!