
Let $\{X_n\}$ be a stochastic process. True or false?

$\{X_n\}$ is a Markov chain if and only if for all $n\in \mathbb{N}$, for all $m\leq n$ and all indices $1\leq k_1<k_2<\cdots<k_m\leq n$,

$$P(X_{n+1}=x_{n+1}~|~X_{k_1}=x_{k_1},\ldots,X_{k_m}=x_{k_m})= P(X_{n+1}=x_{n+1}~|~X_{k_m}=x_{k_m})$$

for all $x_{k_1},\ldots,x_{k_m},~x_{n+1}$ in state space $\mathbb{X}.$

Some thoughts. I came upon this issue while working on DTMCs (my original problem was to show that $\{X_{nk}\}$ is Markov if $\{X_{n}\}$ is Markov). The Markov property is defined as $$P(X_{n+1}=x_{n+1}~|~X_{1}=x_{1},\ldots,X_{n}=x_{n})= P(X_{n+1}=x_{n+1}~|~X_{n}=x_{n})$$ meaning that the distribution of the future state depends on the previous one (and only on that). What happens, though, when we don't have knowledge of the previous state? Is knowledge of the last *observed* state enough?

Thanks in advance for the help!

1 Answer


Yes, it is true. Let's prove it by induction on $n$. The base case $n=1$ is trivial. Let's take a look at

$$P = P(X_{n+1}=x_{n+1}~|~X_{k_1}=x_{k_1},\ldots,X_{k_m}=x_{k_m})$$ If $k_m = n$, then we just use the definition of a Markov chain. So let $k_m < n$. Then, conditioning on $X_n$,

$$P = \int P(x_{n+1}~|~x_{k_1},\dots,x_{k_m},x_{n})\,P(x_{n}~|~x_{k_1},\ldots,x_{k_m})\,dx_n = \int P(x_{n+1}~|~x_{n})\,P(x_{n}~|~x_{k_m}) \,dx_n, $$ where the last step is provided by the induction hypothesis. From the definition of a Markov chain we have $P(x_{n+1}~|~x_{n}) = P(x_{n+1}~|~x_{k_m}, x_{n})$. So the integral is $$ \int P(x_{n+1}~|~x_{n})\,P(x_{n}~|~x_{k_m}) \,dx_n = \int P(x_{n+1}~|~x_{k_m},x_{n})\,P(x_{n}~|~x_{k_m}) \,dx_n = P(x_{n+1}~|~x_{k_m}). $$
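As a quick numerical sanity check of this claim (a sketch, not part of the proof): for a hypothetical two-state chain with a made-up transition matrix `T` and initial distribution `pi`, we can enumerate all paths of length 4 and verify that conditioning $X_3$ on both $X_0$ and $X_1$ (skipping $X_2$) gives the same distribution as conditioning on $X_1$ alone.

```python
import itertools

# Hypothetical two-state chain: transition matrix T and initial distribution pi
T = [[0.7, 0.3],
     [0.4, 0.6]]
pi = [0.5, 0.5]

def joint(path):
    """Joint probability of the path (x0, x1, x2, x3)."""
    p = pi[path[0]]
    for a, b in zip(path, path[1:]):
        p *= T[a][b]
    return p

def cond(target, given):
    """P(X_3 = target | X_i = v for (i, v) in given), marginalising the rest."""
    num = den = 0.0
    for path in itertools.product([0, 1], repeat=4):
        if all(path[i] == v for i, v in given.items()):
            den += joint(path)
            if path[3] == target:
                num += joint(path)
    return num / den

# Conditioning on X_0 and X_1 while skipping X_2 agrees with conditioning
# on X_1 alone, for every choice of observed values.
for x0 in (0, 1):
    for x1 in (0, 1):
        assert abs(cond(1, {0: x0, 1: x1}) - cond(1, {1: x1})) < 1e-12
print("check passed")
```

Since the state space here is finite, the integrals in the proof become sums over paths, which is exactly what `cond` computes by brute-force enumeration.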

  • 0
    Thank you very much. If the state space were countable, we would use sums instead of integrals, I guess. 2017-02-24
  • 1
    @NikolaosSkout, definitely. 2017-02-24