Suppose $p=\begin{bmatrix} 0 & \frac{1}{3} & 0 & \frac{2}{3} \\ 0.3 & 0 & 0.7 & 0 \\ 0 & \frac{2}{3} & 0 & \frac{1}{3} \\ 0.8 & 0 & 0.2 & 0 \end{bmatrix}$ is the transition probability matrix of a Markov chain with state space $\{1, 2, 3, 4\}$. How do I find the limiting probabilities of this chain? Several different alternative approaches would be even better. I know a little about limiting probabilities, but I am not sure how to apply them to this question. Clear steps illustrating how it works, or an explanation, would be appreciated; I would also like to learn how others interpret the concept of limiting probability.
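As a numerical sanity check on whatever answer is derived by hand, one can solve the stationary equations $\pi p = \pi$ together with $\sum_i \pi_i = 1$ as a linear system. This is a sketch, assuming numpy is available; note the solution is the stationary distribution, which for a periodic chain gives long-run fractions of time rather than a limit of $p^n$:

```python
import numpy as np

# Transition matrix from the question
P = np.array([
    [0.0, 1/3, 0.0, 2/3],
    [0.3, 0.0, 0.7, 0.0],
    [0.0, 2/3, 0.0, 1/3],
    [0.8, 0.0, 0.2, 0.0],
])

# Stationary distribution solves pi @ P = pi, i.e. (P.T - I) pi = 0.
# That system is rank-deficient, so replace one equation with the
# normalization constraint pi_1 + pi_2 + pi_3 + pi_4 = 1.
A = P.T - np.eye(4)
A[-1, :] = 1.0
b = np.array([0.0, 0.0, 0.0, 1.0])

pi = np.linalg.solve(A, b)
print(pi)  # approximately [0.28, 0.24, 0.22, 0.26]
```

The same answer can be obtained by hand: write out the four balance equations $\pi_j = \sum_i \pi_i p_{ij}$, drop one of them, and add the normalization condition.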
A question about how to get the limiting probability.
probability
probability-theory
markov-chains
Note that each state has period two: if you start in state $1$ or $3$, then after one step you will be at $2$ or $4$, and after another step back at $1$ or $3$. – 2012-11-20