If $X_0,X_1,X_2,...$ is a Markov chain then why is $Z_n=X_0X_1...X_n$ also a Markov chain?
Could you explain briefly?
Many thanks!
This isn't true.
For instance: let $(X_n)$ be the Markov chain defined as follows. Let $P(X_0=1)=P(X_0=2)=\frac{1}{2}$, and determine $X_{n+1}$ from $X_n$ using the transition probabilities $$ p_{11}=\frac{1}{4},\quad p_{12}=\frac{3}{4},\quad p_{21}=\frac{1}{2},\quad p_{22}=\frac{1}{2}. $$ You can show that for $Z_i:=\prod_{j\leq i}X_j$, $$ P(Z_2=4\mid Z_0=1,Z_1=2)=\frac{1}{2} $$ (here $X_1=2$, so the relevant transition is $p_{22}$), while $$ P(Z_2=4\mid Z_0=2, Z_1=2)=\frac{3}{4} $$ (here $X_1=1$, so the relevant transition is $p_{12}$). If $(Z_n)$ were a Markov chain, then we would need to have $$ P(Z_2=4\mid Z_0=1,Z_1=2)=P(Z_2=4\mid Z_1=2)=P(Z_2=4\mid Z_0=2, Z_1=2). $$
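If you want to double-check the two conditional probabilities, here is a short exact computation by path enumeration (just a sketch; the names `P`, `init`, `path_prob`, and `cond_prob` are my own):

```python
from itertools import product

# Transition probabilities p_{ij} of the chain (X_n) from the example above.
P = {1: {1: 0.25, 2: 0.75}, 2: {1: 0.5, 2: 0.5}}
# Initial distribution: P(X_0 = 1) = P(X_0 = 2) = 1/2.
init = {1: 0.5, 2: 0.5}

def path_prob(x0, x1, x2):
    """Probability of the trajectory X0=x0, X1=x1, X2=x2."""
    return init[x0] * P[x0][x1] * P[x1][x2]

def cond_prob(z0, z1, z2):
    """P(Z2=z2 | Z0=z0, Z1=z1), where Z_n = X0 * X1 * ... * Xn."""
    num = den = 0.0
    for x0, x1, x2 in product([1, 2], repeat=3):
        if x0 == z0 and x0 * x1 == z1:  # trajectory consistent with Z0, Z1
            p = path_prob(x0, x1, x2)
            den += p
            if x0 * x1 * x2 == z2:      # ... and with Z2
                num += p
    return num / den

print(cond_prob(1, 2, 4))  # 0.5
print(cond_prob(2, 2, 4))  # 0.75
```

Since the two conditioning events force different values of $X_1$ (namely $2$ and $1$), they pick out different rows of the transition matrix, which is exactly why the two answers disagree.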
Fundamentally, the problem with the assertion is this: knowledge of $Z_n$ alone is not enough to recover $X_n$ (it only tells you the product $X_0\cdots X_n$), but knowledge of $Z_0,\ldots, Z_n$ IS enough to recover $X_0,\ldots,X_n$, since $X_k = Z_k/Z_{k-1}$. So the full history of $(Z_n)$ carries strictly more information than its current value, and that extra information affects the next step.