
For a Markov chain (can the following discussion apply to both discrete time and continuous time, or only to discrete time?),

  1. If, for one particular initial distribution, i.e. the distribution of $X_0$, the distribution of $X_t$ has a limit as $t \to \infty$, does the distribution of $X_t$ have a limit as $t \to \infty$ regardless of the distribution of $X_0$?
  2. When talking about the limiting distribution of a Markov chain, is it in the sense that a sequence of distributions converges to some distribution? How is this convergence defined?

Thanks!

  • Can you word question number one a bit more clearly? (2015-11-10)

1 Answer

  1. No. Let $X$ be a Markov process in which every state is absorbing, i.e. if you start from $x$ then you always stay there. For any initial distribution $\delta_x$ there is a limiting distribution, which is again $\delta_x$ - but this limit is different for different initial conditions (see the numerical sketch after this list).

  2. The convergence of distributions of Markov chains is usually discussed in terms of $ \lim_{t\to\infty}\|\nu P_t - \pi\| = 0, $ where $\nu$ is the initial distribution, $\pi$ is the limiting one, and $\|\cdot\|$ is the total variation norm. AFAIK there is at least a strong theory for the discrete-time case; see e.g. the book by S. Meyn and R. Tweedie, "Markov Chains and Stochastic Stability" - the first edition can easily be found online. In fact, there are also extensions of this theory by the same authors to the continuous-time case - just check out their work to start with.
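To make the counterexample in item 1 concrete, here is a minimal numerical sketch (my own toy example, assuming NumPy; it is not taken from the question or the references above). For a chain in which every state is absorbing the transition matrix is the identity, so the distribution of $X_t$ never moves: a limit exists for every initial distribution, but it coincides with, and hence depends on, that initial distribution.

```python
import numpy as np

# Two-state chain in which both states are absorbing: P is the identity,
# so nu P^t = nu for every t and every initial distribution nu.
P = np.eye(2)

for nu in (np.array([1.0, 0.0]),   # delta at state 0
           np.array([0.0, 1.0]),   # delta at state 1
           np.array([0.3, 0.7])):  # a mixed initial distribution
    dist = nu.copy()
    for _ in range(1000):          # run the chain for many steps
        dist = dist @ P
    print(nu, "->", dist)          # the "limit" equals the starting distribution
```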

  • For a Markov chain $(X_t)$ with given transition distributions, a limiting distribution $\pi$ is defined to be a probability measure on the state space such that, for any initial distribution $\nu$ of $X_0$, the distribution of $X_t$ converges to $\pi$ in the total variation norm as $t \to \infty$. Is that the one you have in mind? (2012-12-02)
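To illustrate the total-variation convergence described in item 2 of the answer and in the comment above, here is a small numerical sketch (the two-state chain and its numbers are my own illustrative choice, not from the cited book). For an ergodic chain with stationary distribution $\pi$, the distance $\|\nu P^t - \pi\|$ shrinks toward zero for every initial distribution $\nu$.

```python
import numpy as np

# An ergodic two-state chain; its stationary distribution solves pi P = pi,
# which for this P gives pi = (0.8, 0.2).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
pi = np.array([0.8, 0.2])

def tv(p, q):
    """Total variation distance (one common convention: half the L1 norm)."""
    return 0.5 * np.abs(p - q).sum()

for nu in (np.array([1.0, 0.0]),
           np.array([0.0, 1.0]),
           np.array([0.5, 0.5])):
    for t in (1, 5, 20, 50):
        dist_t = nu @ np.linalg.matrix_power(P, t)   # distribution of X_t
        print(f"nu={nu}, t={t:>2}, TV distance to pi = {tv(dist_t, pi):.6f}")
```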