
Consider the Markov chain $(X_n, n\geq 0)$ with states $0,1,2$ whose transition matrix $P$ is

$P=\begin{pmatrix}0&\frac12&\frac12\\ \frac12&\frac12&0 \\ 1&0&0\end{pmatrix}$

Let $f(0)=0$ and $f(1)=f(2)=1$.

If $Y_n=f(X_n)$, is $(Y_n, n\geq 0)$ a Markov chain?

Answer: Here I don't know how to compute the transition matrix of $(Y_n, n\geq 0)$.

Could anyone answer this question?

1 Answer

Note that what $f$ does is simply "merge" together states $1$ and $2$. So $$Y_n=\begin{cases}0 & X_n=0\\ 1 & X_n=1 \mbox{ or }X_n=2\end{cases}\ .$$ Assume now that $Y_n=1$. That means that $X_n$ is either $2$ (and then $X_{n+1}=Y_{n+1}=0$ with probability $1$) or $X_n=1$ (and then $X_{n+1}$ is either $2$ or $0$, and thus $Y_{n+1}$ is either $1$ or $0$). This suggests that $Y_{n+1}$ carries correlations with the past beyond $Y_n$, which a Markov chain cannot have.

To see this explicitly, note that if $Y_{n-1}=Y_{n}=1$ this must mean that $X_{n-1}=1$ and $X_{n}=2$ (because if $X_{n-1}=2$ then $X_n=0$ with probability $1$). Therefore, $$\Pr(Y_{n+1}=0\ |\ Y_{n}=1\ \mbox{ and } \ Y_{n-1}=1)=1\ .$$ In contrast, a similar calculation shows that $$\Pr(Y_{n+1}=0\ |\ Y_{n}=1\ \mbox{ and } \ Y_{n-1}=0)=\frac{3}{4}\ .$$ This shows that $(Y_n)$ is not a Markov chain.
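The two conditional probabilities can be checked by exact enumeration of three-step paths. The sketch below uses the transition matrix with second row $[\frac12, 0, \frac12]$, which is what this argument implicitly assumes (see the comment below); the uniform initial distribution is an arbitrary choice for illustration.

```python
from fractions import Fraction as F

# Transition matrix as this answer assumes it: second row [1/2, 0, 1/2]
# (this differs from the P stated in the question; see the comment below).
P = [[F(0),    F(1, 2), F(1, 2)],
     [F(1, 2), F(0),    F(1, 2)],
     [F(1),    F(0),    F(0)]]

f = [0, 1, 1]  # the lumping map: f(0) = 0, f(1) = f(2) = 1

def cond_prob(y2, y1, y0, init=(F(1, 3), F(1, 3), F(1, 3))):
    """Pr(Y_{n+1} = y2 | Y_n = y1, Y_{n-1} = y0), here with n = 1
    and a uniform initial distribution over the three states."""
    num = den = F(0)
    for x0 in range(3):
        for x1 in range(3):
            for x2 in range(3):
                p = init[x0] * P[x0][x1] * P[x1][x2]
                if f[x0] == y0 and f[x1] == y1:
                    den += p            # paths consistent with the conditioning
                    if f[x2] == y2:
                        num += p        # ... that also hit the target Y-value
    return num / den

print(cond_prob(0, 1, 1))  # 1
print(cond_prob(0, 1, 0))  # 3/4
```

Since the two conditional probabilities differ, $Y_{n+1}$ depends on $Y_{n-1}$ given $Y_n$, confirming the argument above.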

  • This argument assumes that the second row of $P$ is $[\frac{1}{2}, 0, \frac{1}{2}]$. The conclusion that $(Y_n, n \geq 0)$ is not a Markov chain remains correct even for the original $P$. – 2018-01-30