
Say I have an irreducible Markov chain with state space $\{1, 2, 3, \dots, m\}$, where $m > 2$, and stationary distribution $s = (s_1, s_2, \dots, s_m)$. The initial state is given by the stationary distribution, so $P(X_0 = i) = s_i$.

Why is it that all of $X_0, X_1, \dots, X_n$ have the stationary distribution? Apparently it follows from the fact that $X_0$ does, but how does that work?

  • 1
    This is what "stationary" *means*. (2012-12-11)

1 Answer


This is because, by definition, the stationary distribution $\pi$ (a row vector) satisfies $\pi P = \pi$, where $P$ is the matrix of transition probabilities.

Remember that at each time $t$, $\pi_{t} P = \pi_{t+1}$, where $\pi_\tau$ denotes the distribution of $X_\tau$. So if $\pi_t = \pi$, then $\pi_{t+1} = \pi P = \pi$, and by induction every later $X_{t+k}$ has distribution $\pi$ as well.
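Written out componentwise, this is just matrix multiplication: with $p_{ij}$ denoting the $(i,j)$ entry of $P$, i.e. $p_{ij} = P(X_{t+1} = j \mid X_t = i)$,
$$P(X_1 = j) = \sum_{i=1}^{m} P(X_0 = i)\, p_{ij} = \sum_{i=1}^{m} \pi_i\, p_{ij} = \pi_j,$$
where the last equality is exactly the $j$-th component of $\pi P = \pi$. Repeating the same computation with $X_1$ in place of $X_0$ gives the result for $X_2$, and so on inductively for all of $X_0, X_1, \dots, X_n$.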

  • 1
    @Mathematics That is not the question here. The question is: assuming you start with $\pi$ as the distribution of $X_0$, why is the distribution of $X_1$ at the next step still $\pi$? Iterating, $X_2, \dots, X_n$ will all have (marginally) distribution $\pi$. (2012-12-11)
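As a quick numerical sanity check, here is a minimal Python sketch (the 3-state transition matrix below is made up purely for illustration): it computes the stationary distribution of a small irreducible chain and verifies that pushing it through $P$ once, or several times, leaves it unchanged.

```python
import numpy as np

# A made-up 3x3 transition matrix for an irreducible chain (rows sum to 1).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.4, 0.4],
])

# The stationary distribution is the left eigenvector of P with eigenvalue 1,
# i.e. a right eigenvector of P.T, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
s = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
s = s / s.sum()

print("stationary s :", s)
print("s P          :", s @ P)                              # one step: unchanged
print("s P^5        :", s @ np.linalg.matrix_power(P, 5))   # several steps: still unchanged
```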