A fair coin is tossed repeatedly, with results $Y_0, Y_1, Y_2, \dots$ that are each $0$ or $1$ with probability $1/2$. For $n \ge 1$, let $X_n = Y_n + Y_{n-1}$ be the number of $1$'s in the last two tosses. Is $X_n$ a Markov chain?
My Work
By the definition of a Markov chain, $X_n$ is a Markov chain if $\mathbf P(X_{n+1} = y \mid X_0 = x_0, \dots, X_{n-1} = x_{n-1}, X_n = x) = \mathbf P(X_{n+1} = y \mid X_n = x)$ for every $n$, every state $y$, and every history $x_0, \dots, x_{n-1}, x$ that occurs with positive probability.
I am not sure where to go from here. Do I use the rules of conditional probability to show that these two are equal?
It seems to me that this is not a Markov chain: to predict the value of $X_{n+1}$ we really need the value of $Y_n$, and $X_n$ alone does not determine $Y_n$, while earlier values such as $X_{n-1}$ can carry extra information about it.
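Trying to make that intuition concrete, I think a counterexample might look like this (please correct me if I am wrong). If $X_{n-1} = 0$ and $X_n = 1$, then $Y_{n-1} = 0$, which forces $Y_n = 1$, so $X_{n+1} = Y_{n+1} + 1$ and
$$\mathbf P(X_{n+1} = 2 \mid X_{n-1} = 0, X_n = 1) = \mathbf P(Y_{n+1} = 1) = \tfrac12.$$
But if $X_{n-1} = 2$ and $X_n = 1$, then $Y_{n-1} = 1$, which forces $Y_n = 0$, so $X_{n+1} = Y_{n+1}$ and
$$\mathbf P(X_{n+1} = 2 \mid X_{n-1} = 2, X_n = 1) = \mathbf P(Y_{n+1} = 2) = 0.$$
If these two probabilities really do differ, then the distribution of $X_{n+1}$ depends on more than just $X_n$, so $X_n$ would not be a Markov chain.

To double-check numerically, here is a quick simulation sketch (the function name `estimate`, the choice $n = 5$, and the number of trials are my own choices, not part of the problem):

```python
import random

def estimate(trials=200_000, n=5):
    """Estimate P(X_{n+1} = 2 | X_{n-1} = v, X_n = 1) for v in {0, 2}."""
    counts = {0: [0, 0], 2: [0, 0]}  # v -> [times X_{n+1} = 2, times the conditioning event occurred]
    for _ in range(trials):
        Y = [random.randint(0, 1) for _ in range(n + 2)]    # Y_0, ..., Y_{n+1}
        X = [Y[k] + Y[k - 1] for k in range(1, n + 2)]      # X[k-1] = X_k = Y_k + Y_{k-1}
        x_prev, x_now, x_next = X[n - 2], X[n - 1], X[n]    # X_{n-1}, X_n, X_{n+1}
        if x_now == 1 and x_prev in counts:
            counts[x_prev][1] += 1
            if x_next == 2:
                counts[x_prev][0] += 1
    for v, (num, den) in counts.items():
        est = num / den if den else float("nan")
        print(f"P(X_(n+1)=2 | X_(n-1)={v}, X_n=1) ~ {est:.3f}")

estimate()  # should print values near 0.5 and 0.0 if the calculation above is right
```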