
Could anyone give me an example of a problem where one is asked to prove, rather than assume, that a stochastic process forms a Markov chain? I can think of something like a random-walk-type process: if $X_{n+1} = X_{n} + \xi_{n}$, one has to prove that $\mathbf{E}[\xi_{n}\mid\xi_{n-1}] = \mathbf{E}[\xi_{n}]$.

I would be especially grateful for an example where a function of a Markov chain, obtained by lumping some of its states, is proven to be a Markov chain.

  • yes thanks Didier – 2012-06-29

1 Answer


A famous example is the Ehrenfest process, introduced in 1911 by the physicists Paul Ehrenfest and Tatyana Ehrenfest-Afanaseva as a toy model to explain the macroscopic irreversibility of some microscopically reversible dynamics. The process is defined as follows.

Start with a hypercube $K=\{0,1\}^I$ of dimension $n$, so $I$ can be any set of size $n$. The simple random walk $(X_t)_{t\geqslant0}$ on the graph with vertex set $K$ and the usual edge set is a Markov chain such that, for every $t\geqslant0$, all the coordinates of $X_{t+1}$ coincide with those of $X_t$ except one, the exceptional coordinate being chosen uniformly at random in $I$.
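For concreteness, here is a minimal Python sketch of this walk (the function name `hypercube_walk` and its parameters are purely illustrative): the state is a tuple of $n$ bits and one uniformly chosen coordinate is flipped at every step.

```python
import random

def hypercube_walk(n, steps, seed=0):
    """Simple random walk on {0,1}^n: flip one uniformly chosen coordinate per step."""
    rng = random.Random(seed)
    x = tuple(rng.randint(0, 1) for _ in range(n))  # arbitrary starting vertex
    path = [x]
    for _ in range(steps):
        i = rng.randrange(n)                  # coordinate chosen uniformly in I
        x = x[:i] + (1 - x[i],) + x[i + 1:]   # flip exactly that coordinate
        path.append(x)
    return path

path = hypercube_walk(n=5, steps=10)
```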

Now, let $S_t$ count the number of ones amongst the coordinates of $X_t$. The surprising fact is that $(S_t)_{t\geqslant0}$ is also a Markov chain (more precisely, a birth-and-death chain). This new chain is defined on the set $L=\{0,1,\ldots,n\}$ and has transition probabilities, for every $i$ in $L$, $ Q(i,i-1)=i/n=1-Q(i,i+1). $ The funny thing is that $S_t=u(X_t)$ for the obvious non-injective function $u:K\to L$, and that, as said above, $(S_t)_{t\geqslant0}$ is still a Markov chain, although in general such a non-injective lumping of a Markov chain does not produce a Markov chain but rather what is called a hidden Markov chain.
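A rough empirical check of this (my own sketch, with arbitrary parameters): simulate the hypercube walk, record $S_t$, and compare the observed frequency of a down-step from state $i$ with the predicted $Q(i,i-1)=i/n$.

```python
import random
from collections import Counter

rng = random.Random(1)
n, steps = 10, 200_000
x = [rng.randint(0, 1) for _ in range(n)]   # starting vertex of {0,1}^n
s = sum(x)                                  # S_0 = number of ones

down, visits = Counter(), Counter()
for _ in range(steps):
    visits[s] += 1
    i = rng.randrange(n)                    # coordinate flipped at this step
    x[i] = 1 - x[i]
    s_new = s + (1 if x[i] == 1 else -1)
    if s_new == s - 1:
        down[s] += 1
    s = s_new

for i in range(1, n):
    if visits[i]:
        print(i, round(down[i] / visits[i], 3), "vs", round(i / n, 3))
```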

A sufficient condition for a lumping to produce a Markov chain, which the Markov process $(X_t)_{t\geqslant0}$ on $K$ and the Ehrenfest function $u$ from $K$ to $L$ happen to satisfy, is well known. Call $q$ the transition matrix of $(X_t)_{t\geqslant0}$. The sufficient condition is that the sum $ \sigma(x,s)=\sum\limits_{y:u(y)=s}q(x,y) $ depends on $u(x)$ and $s$ only, not on $x$ itself. In other words, one asks that $\sigma(x,s)=\sigma(x',s)$ as soon as $u(x)=u(x')$. Then $(S_t)_{t\geqslant0}$ is a Markov chain whose transition matrix $Q$ is defined by $Q(r,s)=\sigma(x,s)$ for every $r$ and $s$ and any $x$ such that $u(x)=r$.
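One can verify this condition directly on the Ehrenfest example for a small $n$. The sketch below (the names `q`, `u`, `sigma` just mirror the notation above) checks that $\sigma(x,s)$ depends on $x$ only through $u(x)$, and recovers the lumped transition matrix $Q$.

```python
from itertools import product

n = 4
K = list(product((0, 1), repeat=n))

def q(x, y):
    # simple random walk on the hypercube: probability 1/n to each neighbour
    return 1 / n if sum(a != b for a, b in zip(x, y)) == 1 else 0.0

def u(x):
    return sum(x)  # number of ones, the lumping function

def sigma(x, s):
    return sum(q(x, y) for y in K if u(y) == s)

# the condition: sigma(x, s) == sigma(x', s) whenever u(x) == u(x')
assert all(abs(sigma(x, s) - sigma(xp, s)) < 1e-12
           for s in range(n + 1)
           for x in K for xp in K if u(x) == u(xp))

# the common value defines Q(u(x), s); e.g. Q(2, 1) = Q(2, 3) = 1/2 for n = 4
Q = {(u(x), s): sigma(x, s) for x in K for s in range(n + 1)}
print(Q[(2, 1)], Q[(2, 3)])
```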

  • Yes: start from $x$ in $K$ and count how many of its neighbors in $K$ have exactly one $1$-coordinate more and how many have exactly one $1$-coordinate less. – 2012-03-06