
Suppose $\{X_n\}$ is the hidden Markov chain, and $\{Y_n\}$ is the series of observations, where $\mathbb{P}\{Y_n = j| X_n = i\}$ is the same for all $n$ (please correct me if I have not stated the definition of hidden Markov models correctly). How does one show rigorously that $\{Y_n\}$ does not have the Markov property?

Thank you in advance.

2 Answers


You cannot show this for every $X_n$ and $Y_n$: for some choices, the process $Y_n$ is itself a Markov process. For instance, if $Y_n = X_n$ and $X_n$ is an arbitrary Markov process, then $Y_n$ is trivially a Markov process.

On the other hand, if $X_n$ is a random walk on a path $u_1 \to u_2 \to u_3 \to u_4$ of length 4, and $ Y_n = \begin{cases} 0, & \text{if } X_n = u_1\\ 1, & \text{if } X_n \in\{ u_2, u_3\} \\ 2, &\text{if } X_n = u_4\end{cases} $ then $Y_n$ is not a Markov chain (hint: prove that $\Pr[Y_n = 0 \wedge Y_{n+1} = 1 \wedge Y_{n+2} = 2] = 0$, even though the transitions $0 \to 1$ and $1 \to 2$ each occur with positive probability, which would be impossible if $Y_n$ were Markov).
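The hint can be checked numerically. The sketch below is my own; the answer only says "random walk on a path," so the reflecting behavior at the endpoints and the uniform initial distribution are assumptions, but the impossibility of the pattern $0, 1, 2$ does not depend on the initial distribution:

```python
import numpy as np

# Simple random walk on the path u1 - u2 - u3 - u4, with reflecting
# endpoints (assumed).  States are indexed 0..3.
P = np.array([
    [0.0, 1.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 1.0, 0.0],
])
Y = [0, 1, 1, 2]  # observation emitted in each hidden state

pi = np.full(4, 0.25)  # assumed uniform initial distribution

def joint(y0, y1, y2):
    """Probability of observing the triple (y0, y1, y2), summing over
    all hidden paths i -> j -> k that emit it."""
    total = 0.0
    for i in range(4):
        for j in range(4):
            for k in range(4):
                if (Y[i], Y[j], Y[k]) == (y0, y1, y2):
                    total += pi[i] * P[i, j] * P[j, k]
    return total

print(joint(0, 1, 2))                            # 0.0: the pattern is impossible
print(sum(joint(0, 1, y) for y in (0, 1, 2)))    # > 0: transition 0 -> 1 occurs
print(sum(joint(y, 1, 2) for y in (0, 1, 2)))    # > 0: transition 1 -> 2 occurs
```

The pattern is impossible because $Y_n = 0$ pins the walk at $u_1$, from which two steps can only reach $u_1$ or $u_3$, never $u_4$; yet a Markov chain on $\{0,1,2\}$ whose transitions $0 \to 1$ and $1 \to 2$ both have positive probability would assign the pattern positive probability.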

  • A necessary and sufficient condition for $(Y_n)$ to be a Markov chain for every starting point $X_0=x$ is well known and easy to write down. (2012-12-21)

Here is an example proving that observations do not have the Markov property in general.

Define a hidden Markov chain with two states, $A$ and $B$, starting at time $n=1$ with each state having probability $1/2$ and with transition probability $0$ of changing state at each step. Define the observation to be $\alpha$ when the HMM is in state $A$, and either $\alpha$ or $\beta$, each with probability $1/2$, when the HMM is in state $B$.

The probability that $Y_3$ is $\alpha$, given that $Y_2$ is $\alpha$ and $Y_1$ is $\beta$, is $1/2$: observing $\beta$ forces the hidden state to be $B$. But the probability that $Y_3$ is $\alpha$, given only that $Y_2$ is $\alpha$, is $5/6$: conditioning on $Y_2=\alpha$ gives posterior probabilities $2/3$ for state $A$ and $1/3$ for state $B$, so $\Pr[Y_3=\alpha \mid Y_2=\alpha] = \tfrac{2}{3}\cdot 1 + \tfrac{1}{3}\cdot\tfrac{1}{2} = \tfrac{5}{6}$. Since the two conditional probabilities differ, $(Y_n)$ is not Markov.
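These two conditional probabilities can be verified by exact enumeration. The sketch below follows the setup in the answer; since the hidden state never changes and the emissions are conditionally independent given the state, the joint law of any observation window depends only on which state is frozen:

```python
from fractions import Fraction

half = Fraction(1, 2)

# Emission probabilities: state A always emits 'a' (alpha);
# state B emits 'a' or 'b' (beta), each with probability 1/2.
emit = {'A': {'a': Fraction(1)}, 'B': {'a': half, 'b': half}}

def joint(obs):
    """Probability of an observation sequence, summing over the frozen
    hidden state (each state has initial probability 1/2)."""
    total = Fraction(0)
    for state in ('A', 'B'):
        p = half  # initial probability of this state
        for y in obs:
            p *= emit[state].get(y, Fraction(0))
        total += p
    return total

# P(Y3 = alpha | Y2 = alpha, Y1 = beta)
print(joint(['b', 'a', 'a']) / joint(['b', 'a']))  # 1/2
# P(Y3 = alpha | Y2 = alpha); by stationarity this equals
# P(Y2 = alpha, Y3 = alpha) / P(Y2 = alpha)
print(joint(['a', 'a']) / joint(['a']))            # 5/6
```

Using `Fraction` keeps the arithmetic exact, so the mismatch $1/2 \neq 5/6$ is not a rounding artifact.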