
A fair coin is tossed repeatedly with results $Y_0, Y_1, Y_2, \dots$ that are either $0$ or $1$ with probability $1/2$. For $n \ge 1$, let $X_n = Y_n + Y_{n-1}$ be the number of 1's in the last two tosses. Is $X_n$ a Markov Chain?


My Work

By the definition of a Markov chain, $X_n$ is a Markov chain if $\mathbf P(X_{n+1} = y \mid X_0 = x_0, \dots, X_n = x) = \mathbf P(X_{n+1} = y \mid X_n = x)$ for all $n$ and all states.

I am not sure where to go from here. Do I use the rules of conditional probability to show that these two are equal?

It seems to me that this is not a Markov chain: to predict the value of $X_{n+1}$ we need the value of $Y_n$, which we can only recover by comparing $X_{n-1}$ and $X_n$.

  • So for $X_n = 1$, should I construct a counterexample where we incorrectly predict the value of $X_{n+1}$ based only on $X_n$, showing that we need the previous two states, not just the previous one? (2012-09-05)

1 Answer


I don't think it is a Markov chain. For the one-step conditional probability, $P(X_{n+1} = x_{n+1} \mid X_n = g) = P\left( Y_{n+1} + Y_n = x_{n+1} \mid Y_{n} + Y_{n-1} = g\right)$, with $g \in \{0,1,2\}$.

Knowing only this previous state, if $Y_{n-1} = 0$ then $Y_n = g$, while if $Y_{n-1} = 1$ then $Y_n = g-1$. So conditioning on $X_n = g$ alone tells us only the sum $Y_n + Y_{n-1}$, not the individual values; in particular, when $g = 1$ the value of $Y_n$ is not determined.

If we also knew the earlier states, however, the possible outcomes for $X_{n+1}$ could narrow. For example, $P(X_{n+1} = 0 \mid X_n = 1) = 1/4$, but $X_{n-1} = 0$ forces $Y_{n-1} = 0$, so $X_n = 1$ then forces $Y_n = 1$, and hence $X_{n+1} = Y_{n+1} + 1 \ge 1$. Thus $P(X_{n+1} = 0 \mid X_{n-1} = 0, X_n = 1) = 0 \ne 1/4$, and the Markov property fails.
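This can be verified by exhaustively enumerating the $2^4$ equally likely outcomes of four tosses and computing the exact conditional probabilities (a minimal sketch; the helper name `conditional` is my own):

```python
from itertools import product
from fractions import Fraction

def conditional(target, condition):
    """Exact P(X3 = target | condition(X1, X2)) over all sequences of
    four fair tosses Y0..Y3, where X_n = Y_n + Y_{n-1}."""
    hit = total = 0
    for ys in product((0, 1), repeat=4):
        x1, x2, x3 = (ys[i] + ys[i - 1] for i in (1, 2, 3))
        if condition(x1, x2):
            total += 1
            hit += (x3 == target)
    return Fraction(hit, total)

# Conditioning on the last state only: P(X3 = 0 | X2 = 1) = 1/4.
p_one_step = conditional(0, lambda x1, x2: x2 == 1)

# Conditioning on the last two states: X1 = 0 forces Y1 = 0, so
# X2 = 1 forces Y2 = 1 and X3 = Y3 + 1 >= 1, giving probability 0.
p_two_step = conditional(0, lambda x1, x2: x1 == 0 and x2 == 1)

print(p_one_step, p_two_step)  # 1/4 0
```

Since the two conditional probabilities disagree, the extra past information changes the prediction, confirming that $X_n$ is not a Markov chain.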