
From http://masi.cscs.lsa.umich.edu/~crshalizi/notabene/ergodic-theory.html

> irreducible Markov chains with finite state spaces are ergodic processes, since they have a unique invariant distribution over the states. (In the Markov chain case, each of the ergodic components corresponds to an irreducible sub-space.)

By "ergodic process" I understand the same thing as an "ergodic measure-preserving dynamical system", if I am correct.

As far as I know, an ergodic measure-preserving dynamical system is a mapping $\Phi: T \times S \to S$ satisfying a couple of properties, where $S$ is the state space and $T$ is the time set. Sometimes there is a single measure-preserving map on $S$ that generates the system by iteration.

So I wonder how a Markov chain can be written as a mapping $\Phi: T \times S \to S$, and what the measure-preserving map that generates the Markov chain is.

Thanks!

  • I found [Robert Gray's book](http://www-ee.stanford.edu/~gray/arp.pdf) really helpful. I thought he does a good job of explaining how to convert back and forth between Markov processes and dynamical systems. The upshot seems to be that you put a probability measure on the output space consisting of sequences generated from the process. (2014-04-20)

2 Answers

5

The article talks about a (stationary) Markov chain ${(X_n)}_{n \in \mathbb{Z}}$ in discrete time, with each $X_n$ taking its values in a finite set $E$. The canonical space of the Markov chain is the product set $E^{\mathbb{Z}}$. The trajectory $X=(\ldots, X_{-1}, X_{0}, X_1, \ldots)$ of the Markov chain is a random variable taking its values in $E^{\mathbb{Z}}$. Denote by $\mu$ its distribution (which could be termed the law of the Markov process); then $\mu$ is invariant under the classical shift operator $T \colon E^{\mathbb{Z}} \to E^{\mathbb{Z}}$, and the Markov chain can be considered as the dynamical system $(T,\mu)$. So far we have only used the fact that ${(X_n)}_{n \in \mathbb{Z}}$ is a stationary process; in the Markov case we can say in addition that the ergodicity of $T$ is equivalent to the irreducibility of ${(X_n)}_{n \in \mathbb{Z}}$.
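As a concrete illustration of this construction, the sketch below (not from the answer itself; the 3-state transition matrix is hypothetical, chosen only for the example) samples stationary trajectories, applies the shift operator, and checks empirically that the law of the first coordinate of the shifted path is again the invariant distribution — i.e. that $\mu$ is preserved by $T$ on this cylinder set.

```python
import numpy as np

# Hypothetical irreducible 3-state chain; matrix chosen only for illustration.
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.3, 0.3, 0.4]])

# Invariant distribution pi solving pi P = pi (left eigenvector for eigenvalue 1).
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi /= pi.sum()

rng = np.random.default_rng(0)

def sample_path(n):
    """Sample (X_0, ..., X_{n-1}) with X_0 ~ pi, so the path is stationary."""
    x = rng.choice(3, p=pi)
    path = [x]
    for _ in range(n - 1):
        x = rng.choice(3, p=P[x])
        path.append(x)
    return path

def shift(path):
    """The shift operator T: (x_0, x_1, ...) -> (x_1, x_2, ...)."""
    return path[1:]

# Invariance of mu under T: the first coordinate of the shifted path
# should again be distributed according to pi.  Check empirically.
first_after_shift = [shift(sample_path(2))[0] for _ in range(20000)]
freq = np.bincount(first_after_shift, minlength=3) / 20000
print(freq, pi)
```

The check only probes one marginal, of course; full shift-invariance of $\mu$ is the statement that every finite-dimensional cylinder probability is unchanged by $T$.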

  • Thanks! (1) Is the classical shift operator $T$ defined as shifting a function from $\mathbb{Z}$ to $E$ by one unit? (2) Your reply says that a stationary Markov chain induces a measure-preserving dynamical system on the space of sample paths. However, when I asked my question, I was wondering whether a Markov chain is itself a (measure-preserving or random) dynamical system. Please see my related question here: http://math.stackexchange.com/questions/144062/is-a-markov-process-a-random-dynamic-system (2012-11-18)
1

The invariant probability measure is the one that assigns to state $i$ the probability $p_i$ defined as the limit, as time goes to infinity, of the fraction of time steps that the chain has been in state $i$.

Strictly speaking $p_i$ is the "almost sure" limit. With probability 1 (in the probability measure on infinitely long walks of the Markov chain) the fraction will converge to $p_i$, and statistically unusual walks where something else happens have probability 0.
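This almost-sure limit can be observed numerically. A minimal sketch, assuming a hypothetical 3-state irreducible transition matrix: simulate one long walk and compare the occupation fractions with the invariant distribution computed as a left eigenvector of the transition matrix.

```python
import numpy as np

# Hypothetical transition matrix, chosen only for illustration.
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.3, 0.3, 0.4]])

# Invariant distribution pi solving pi P = pi.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi /= pi.sum()

rng = np.random.default_rng(1)
n = 200_000
x = 0                      # the starting state does not matter for the limit
counts = np.zeros(3)
for _ in range(n):
    counts[x] += 1
    x = rng.choice(3, p=P[x])

p_hat = counts / n         # fraction of time steps spent in each state
print(p_hat, pi)
```

For this single simulated walk the fractions land close to $\pi$; the "probability 0" walks mentioned above are exactly the sample paths for which `p_hat` would fail to converge.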

  • Specifying one state is a special case of giving a probability distribution over what state the chain is in. Then perform the random walk one step. The effect of this is a map from the set of probability measures on the state space to itself. (2012-10-20)
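The map the comment describes can be written down directly: for a row-stochastic matrix $P$, one step of the chain pushes a distribution row vector $\mu$ forward to $\mu P$, and a point mass on state $i$ is the special case $\mu = e_i$. A minimal sketch with a hypothetical matrix:

```python
import numpy as np

# Hypothetical transition matrix, chosen only for illustration.
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.3, 0.3, 0.4]])

e1 = np.array([0.0, 1.0, 0.0])   # point mass on state 1
print(e1 @ P)                     # one-step law from state 1 (row 1 of P)

# Iterating the map mu -> mu P converges to the invariant distribution,
# which is precisely the fixed point of this map on probability measures.
mu = np.array([1.0, 0.0, 0.0])
for _ in range(100):
    mu = mu @ P
print(mu)
```

The fixed point of $\mu \mapsto \mu P$ is the invariant distribution discussed in the answer above.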