
For a Markov chain (does the following discussion apply to both discrete and continuous time, or only to discrete time?),

  1. Suppose that for some initial distribution, i.e. some distribution of $X_0$, the distribution of $X_t$ has a limit as $t \to \infty$. Does the distribution of $X_t$ then have a limit as $t \to \infty$ regardless of the distribution of $X_0$?
  2. When talking about the limiting distribution of a Markov chain, is it in the sense that a sequence of distributions converges to some distribution? How is that convergence defined?

Thanks!

  • can you word question number one a bit more clearly? (2015-11-10)

1 Answer

  1. No. Let $X$ be a Markov process in which every state is absorbing, i.e. if you start from $x$ then you stay at $x$ forever. For any initial distribution $\delta_x$ there is a limiting distribution, which is again $\delta_x$ - but this distribution is different for each initial condition. (A minimal numerical sketch of this example follows the answer.)

  2. The convergence of distributions of Markov chains is usually discussed in terms of $$ \lim_{t\to\infty}\|\nu P_t - \pi\| = 0 $$ where $\nu$ is the initial distribution, $\pi$ is the limiting one, and $\|\cdot\|$ is the total variation norm. AFAIK there is at least a strong theory for the discrete-time case; see e.g. the book by S. Meyn and R. Tweedie, "Markov Chains and Stochastic Stability" - the first edition can easily be found online. In fact, there are also extensions of this theory by the same authors to the continuous-time case - just check out their work to start with. (A small numerical illustration of this convergence is also sketched below.)
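
To make point 1 concrete, here is a minimal numerical sketch (mine, not part of the original answer; the three-state chain is made up for illustration). Every state is absorbing, so the transition matrix is the identity and each point mass is its own limit:

```python
import numpy as np

# Hypothetical 3-state chain in which every state is absorbing,
# so the transition matrix P is the identity.
P = np.eye(3)  # P[x, y] = probability of moving from state x to state y

# Two different initial distributions: point masses at states 0 and 2.
nu1 = np.array([1.0, 0.0, 0.0])
nu2 = np.array([0.0, 0.0, 1.0])

# nu P^t = nu for every t, so each initial distribution converges
# (trivially) to itself: the limit exists but depends on the start.
for t in (1, 10, 100):
    Pt = np.linalg.matrix_power(P, t)
    print(t, nu1 @ Pt, nu2 @ Pt)
```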
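
Point 2 can likewise be illustrated numerically. Below is a hedged sketch that tracks $\|\nu P_t - \pi\|$ in total variation for a small ergodic chain; the two-state matrix, its stationary distribution, and the initial distribution are all invented for the example:

```python
import numpy as np

# Hypothetical irreducible, aperiodic two-state chain.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
pi = np.array([0.8, 0.2])   # stationary distribution: pi @ P == pi

nu = np.array([1.0, 0.0])   # an arbitrary initial distribution
dist = nu.copy()
for t in range(1, 21):
    dist = dist @ P                       # distribution of X_t
    tv = 0.5 * np.abs(dist - pi).sum()    # total variation distance
    if t % 5 == 0:
        print(f"t={t:2d}  ||nu P^t - pi||_TV = {tv:.2e}")
```

The printed distances shrink geometrically, which is the kind of convergence the Meyn-Tweedie theory makes precise.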

  • Thanks! I was wondering: when a limiting distribution that is independent of the initial distribution exists, is it unique? (2012-12-01)
  • @Tim: could you please define uniqueness? As far as I can guess, you mean exactly its independence from the initial distribution. (2012-12-01)
  • By "uniqueness" of the limiting distribution, I mean: can two different probability measures on the state space both be the limiting distribution of the same Markov chain, where the limiting distribution is defined to be the same for all initial distributions? (2012-12-01)
  • @Tim: Any given initial distribution admits a unique limiting distribution (if it admits one at all). Thus, if the limiting distribution is independent of the initial one, it is unique. In other words, suppose there were two limiting distributions $\pi_1$ and $\pi_2$; then $\|\nu_1 P^n - \pi_1\| \to 0$ and $\|\nu_2 P^n - \pi_2\| \to 0$ for some $\nu_1, \nu_2$, which contradicts the fact that the limit of $\nu_1 P^n$ is the same as that of $\nu_2 P^n$. (2012-12-02)
  • Thanks! (1) Do you mean that for any given initial distribution, its limiting distribution (not necessarily the same as for other initial distributions) is unique? Why is that? (2) I also find the first comment on another post of mine confusing: http://math.stackexchange.com/questions/248609/relation-between-reversible-distribution-and-limiting-distribution. If you can let me know what you think, that would be appreciated! (2012-12-02)
  • @Tim: it's time for you to define (formally) the limiting distribution. (2012-12-02)
  • For a Markov chain $(X_t)$ with given transition distributions, a limiting distribution $\pi$ is defined to be a probability measure on the state space s.t. for any initial distribution $\nu$ of $X_0$, the distribution of $X_t$ converges to $\pi$ in the total variation norm as $t \to \infty$. Is that the one you have in mind? (2012-12-02)