
Let $P$ be a stochastic matrix, and let $v_0$ be a probability vector for the initial distribution. So $v_k=P^kv_0$ is the probability vector for the distribution at time $k$, and $v_{ki}$ is the probability of being in the $i$th state at time $k$.

Is it true that for any fixed $i$, the sequence $$\left(\frac{\sum_{j=1}^kv_{ji}}{k}\right)_{k=1}^\infty$$ converges? Is there a result that guarantees this? I've looked at the Perron-Frobenius theorem, but it doesn't seem to imply anything about this sequence.

Also, if we let $T_k(i,j)$ be the random variable giving the number of transitions from $i$ to $j$ between times $1$ and $k$, then does $T_k(i,j)/k$ converge as well?
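For intuition, here is a quick numerical sketch (the two-state "flip" chain is my own hypothetical example, not from the question): $v_k$ itself need not converge, but the Cesàro averages $\frac1k\sum_{j=1}^k v_{ji}$ can still settle down.

```python
import numpy as np

# Hypothetical 2-state "flip" chain, column-stochastic so that v_k = P^k v_0.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])
v = np.array([1.0, 0.0])  # v_0: start in state 1 with probability 1

n = 10_000
running_sum = np.zeros(2)
for k in range(1, n + 1):
    v = P @ v            # v_k = P v_{k-1}; alternates between (0,1) and (1,0)
    running_sum += v

cesaro = running_sum / n
print(cesaro)  # -> [0.5 0.5], even though v_k itself oscillates forever
```

The chain is periodic, so $v_k$ oscillates and never converges, yet the time averages converge to the stationary distribution $(1/2, 1/2)$.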

1 Answer

This is a consequence / special case of the ergodic theorem for Markov chains. If $N_k(i)$ is the random variable giving the number of visits to state $i$ between times $1$ and $k$, the ergodic theorem asserts that $N_k(i)/k$ converges almost surely. Since this ratio is bounded by $1$, the dominated convergence theorem implies that its expectation converges as well, and that expectation is precisely the quantity you ask about: let $\chi_j(i) = 1_{\{X_j = i\}}$ be the indicator of the event that the chain is in state $i$ at time $j$, and note that $N_k(i) = \sum_{j=1}^k \chi_j(i)$; then by linearity of expectation, $$\mathbb{E}\left[\frac{N_k(i)}{k}\right] = \frac{1}{k}\sum_{j=1}^k \mathbb{P}(X_j = i) = \frac{\sum_{j=1}^k v_{ji}}{k}.$$

You can see Durrett's Essentials of Stochastic Processes, Theorem 1.9, for an elementary proof, or his Probability: Theory and Examples, Theorem 6.6.1, for a more measure-theoretic argument. These are under the assumption that the chain is irreducible, but you can reduce to that case by writing $v_0$ as a convex combination of vectors supported on irreducible sets, with a little extra work if there are transient states.
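The almost-sure statement is easy to check by simulation. A minimal sketch, with a hypothetical irreducible 3-state chain of my own choosing (column-stochastic, matching the question's convention $v_k = P^k v_0$):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical irreducible 3-state chain; column-stochastic, so
# P[i, j] = probability of jumping from state j to state i.
P = np.array([[0.9, 0.5, 0.1],
              [0.05, 0.3, 0.3],
              [0.05, 0.2, 0.6]])

# Stationary distribution pi: the eigenvector of P with eigenvalue 1.
w, V = np.linalg.eig(P)
pi = np.real(V[:, np.argmin(np.abs(w - 1))])
pi /= pi.sum()

# Simulate one path and count the visits N_k(i) to each state.
k = 100_000
state = 0
visits = np.zeros(3)
for _ in range(k):
    state = rng.choice(3, p=P[:, state])  # next state drawn from column `state`
    visits[state] += 1

# By the ergodic theorem, N_k(i)/k is close to pi_i for large k.
print(visits / k)
print(pi)
```

For an irreducible chain the almost-sure limit of $N_k(i)/k$ is the stationary probability $\pi_i$, independent of $v_0$.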

  • Thanks! And if we let $T_k(i,j)$ be the random variable giving the number of transitions from $i$ to $j$ between times $1$ and $k$, does $T_k(i,j)/k$ converge as well? I imagine it should be true too, but it doesn't seem to follow directly from the ergodic theorem. (2017-02-04)
  • @pi66: Yes, that is also true. You can easily adapt the proof of the statement I gave: the key is that, by the strong Markov property, the times between successive transitions from $i$ to $j$ are i.i.d., so the law of large numbers applies. Alternatively, you can look at a modified chain where, between each pair of vertices, you insert a new vertex, and then count the number of visits to this intermediate vertex. (2017-02-04)
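The transition-count claim can be checked the same way. A sketch under a hypothetical column-stochastic chain (my example, not from the thread): the limit of $T_k(a,b)/k$ should be $\pi_a$ times the one-step probability of jumping from $a$ to $b$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical column-stochastic chain: P[i, j] = probability of j -> i.
P = np.array([[0.9, 0.5, 0.1],
              [0.05, 0.3, 0.3],
              [0.05, 0.2, 0.6]])

# Stationary distribution pi (eigenvector for eigenvalue 1).
w, V = np.linalg.eig(P)
pi = np.real(V[:, np.argmin(np.abs(w - 1))])
pi /= pi.sum()

# Simulate one path and count transitions: T[a, b] = number of a -> b steps.
k = 100_000
state = 0
T = np.zeros((3, 3))
for _ in range(k):
    nxt = rng.choice(3, p=P[:, state])
    T[state, nxt] += 1
    state = nxt

# T[a, b] / k converges a.s. to pi_a * P[b, a]: the long-run fraction of time
# spent at a, times the probability of then jumping to b.
print(T / k)
print(pi[:, None] * P.T)
```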