
Suppose $X_t$ is a first-order Markov chain with state space $\{-1, 0, 1\}$ and transition matrix $P$.

Is $|X_t|$ (absolute value) necessarily a Markov chain?

Thanks!

1 Answer


Not in general.

For a simple counterexample, assume that $P(1,0)\lt1$, $P(1,-1)=0$, and $P(-1,0)=1$. Then
$$A=[|X_{t-2}|=|X_{t-1}|=1]=[X_{t-2}=X_{t-1}=1],$$
since $P(-1,0)=1$ forces $X_{t-1}=0$ whenever $X_{t-2}=-1$, and $P(1,-1)=0$ rules out $X_{t-1}=-1$ when $X_{t-2}=1$. Hence $\mathbb P(|X_t|=0\mid A)=P(1,0)$. On the other hand, $\mathbb P(|X_t|=0\mid |X_{t-1}|=1)$ is a barycenter (weighted average) of $P(1,0)$ and $P(-1,0)=1$, hence strictly larger than $P(1,0)$ as soon as $\mathbb P(X_{t-1}=-1)\ne0$, which is the case if, for example, every unspecified transition probability is positive. So conditioning additionally on $|X_{t-2}|$ changes the conditional distribution of $|X_t|$, and $|X_t|$ is not a Markov chain.
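Here is a quick numerical sanity check of this counterexample, a sketch only: the concrete entries ($P(1,0)=0.5$, $P(0,\cdot)$ uniform) and the uniform initial distribution are my own choices, subject to the constraints above.

```python
import numpy as np
from itertools import product

# States ordered as [-1, 0, 1]; index 0 -> -1, 1 -> 0, 2 -> +1.
states = [-1, 0, 1]

# Hypothetical transition matrix satisfying the assumptions:
#   P(-1,0) = 1,  P(1,-1) = 0,  P(1,0) = 0.5 < 1,
#   every unspecified transition probability positive.
P = np.array([
    [0.0, 1.0, 0.0],   # from -1
    [1/3, 1/3, 1/3],   # from  0
    [0.0, 0.5, 0.5],   # from +1
])

pi0 = np.array([1/3, 1/3, 1/3])  # assumed initial distribution (uniform)

def path_prob(i, j, k):
    """Joint probability of the path (X_0, X_1, X_2) = (states[i], states[j], states[k])."""
    return pi0[i] * P[i, j] * P[j, k]

# Accumulate P(|X_2|=0 | |X_1|=1, |X_0|=1) and P(|X_2|=0 | |X_1|=1).
num_a = den_a = 0.0
num_b = den_b = 0.0
for i, j, k in product(range(3), repeat=3):
    p = path_prob(i, j, k)
    if abs(states[j]) == 1:
        den_b += p
        if states[k] == 0:
            num_b += p
        if abs(states[i]) == 1:
            den_a += p
            if states[k] == 0:
                num_a += p

print("P(|X_2|=0 | |X_1|=1, |X_0|=1) =", num_a / den_a)  # 0.5, i.e. P(1,0)
print("P(|X_2|=0 | |X_1|=1)          =", num_b / den_b)  # 9/14 > 0.5
```

The two conditional probabilities disagree (0.5 versus 9/14 with these choices), which is exactly the failure of the Markov property for $|X_t|$.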

The general condition under which the process obtained by lumping together some states of a Markov chain is again a Markov chain is known as lumpability. In your case, with the partition $\{\{-1,1\},\{0\}\}$, it reads $P(1,0)=P(-1,0)$.
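A small sketch of how one might test (strong) lumpability for this partition numerically; the helper `is_lumpable` and the example matrices below are hypothetical, not part of any library.

```python
import numpy as np

def is_lumpable(P, blocks, tol=1e-12):
    """Strong lumpability: within each block, every state must have the same
    total transition probability into each block of the partition."""
    for block in blocks:
        sums = np.array([[P[i, target].sum() for target in blocks] for i in block])
        if not np.allclose(sums, sums[0], atol=tol):
            return False
    return True

# Indices: 0 -> -1, 1 -> 0, 2 -> +1; blocks induced by |x|: {-1, +1} and {0}.
blocks = [[0, 2], [1]]

P_bad = np.array([[0.0, 1.0, 0.0],
                  [1/3, 1/3, 1/3],
                  [0.0, 0.5, 0.5]])    # P(1,0) != P(-1,0): not lumpable

P_good = np.array([[0.1, 0.6, 0.3],
                   [1/3, 1/3, 1/3],
                   [0.2, 0.6, 0.2]])   # P(1,0) = P(-1,0) = 0.6: lumpable

print(is_lumpable(P_bad, blocks))   # False
print(is_lumpable(P_good, blocks))  # True
```

Since each row of $P$ sums to $1$ and the partition has only two blocks, the single equality $P(1,0)=P(-1,0)$ is equivalent to the full lumpability condition here.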

  • Thanks! This is very helpful! (2012-11-01)