
In a Markov chain (you may add extra conditions here, such as discrete-time, homogeneous, finite-state, ...; but the fewer additional conditions, the better), what sufficient and/or necessary conditions guarantee that every initial distribution has a limit distribution?

Note that the limit distributions for different initial distributions may differ. Added: when posting the question, I meant to include the case where there is no single limiting distribution common to all initial distributions, yet every initial distribution still has a limit distribution.

Thanks and regards!

My question comes from my comment to Michael Hardy's reply.

  • There must not be eigenvalues $\lambda$ with $|\lambda|\ge 1$ and $\lambda\ne1$. – 2012-12-10
  • @HagenvonEitzen: Thanks! Do you mean that all eigenvalues of the transition matrix have absolute value strictly less than 1? Why is that? – 2012-12-10
  • @HagenvonEitzen I think the eigenvalues of a stochastic matrix cannot exceed one in absolute value, by the Perron-Frobenius theorem: http://en.wikipedia.org/wiki/Stochastic_matrix – 2012-12-10
  • @Learner OK, so we can simply say: there must not be eigenvalues $\lambda$ with $|\lambda|=1$ and $\lambda\ne 1$. – 2012-12-10
  • @Tim Yes, e.g. Michael Hardy's example is a case where an eigenvalue $-1$ occurs. – 2012-12-10
  • @HagenvonEitzen: Can you explain why? – 2012-12-10
  • @Tim Consider the Jordan normal form. Blocks belonging to $\lambda=1$ correspond to limit distributions, parts belonging to blocks with $|\lambda|<1$ simply "fade away" in the long run, but vectors corresponding to other blocks with $|\lambda|=1$ keep oscillating. – 2012-12-10
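The behavior described in these comments can be checked numerically (a minimal sketch; the two transition matrices below are my own illustrative examples, not ones from the thread): a stochastic matrix with an eigenvalue $-1$ makes the distribution oscillate forever, while a matrix whose only eigenvalue on the unit circle is $1$ drives every initial distribution to a limit.

```python
import numpy as np

# Eigenvalues 1 and -1: the chain flips between the two states,
# so no initial distribution other than the stationary one has a limit.
P_osc = np.array([[0.0, 1.0],
                  [1.0, 0.0]])

# Eigenvalues 1 and 0.7: the only eigenvalue on the unit circle is 1,
# so every initial distribution converges (here to [2/3, 1/3]).
P_conv = np.array([[0.9, 0.1],
                   [0.2, 0.8]])

pi0 = np.array([1.0, 0.0])  # start in state 0

def dist_after(P, pi, n):
    """Distribution after n steps, row-vector convention: pi P^n."""
    return pi @ np.linalg.matrix_power(P, n)

print(dist_after(P_osc, pi0, 10))    # [1, 0] -- even steps return to start
print(dist_after(P_osc, pi0, 11))    # [0, 1] -- odd steps flip: no limit
print(dist_after(P_conv, pi0, 100))  # close to [2/3, 1/3]
```

Note that the oscillating chain still has a stationary distribution ($[1/2, 1/2]$); what fails is convergence from other starting points, exactly the situation the eigenvalue condition $|\lambda|=1 \Rightarrow \lambda=1$ rules out.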

1 Answer