Let $\{X_n\}$ be a stochastic process. True or false?
$\{X_n\}$ is a Markov chain if and only if for all $n\in \mathbb{N}$, for all $m\leq n$ and all indices $1\leq k_1<k_2<\cdots<k_m\leq n$,
$$P(X_{n+1}=x_{n+1}~|~X_{k_1}=x_{k_1},\ldots,X_{k_m}=x_{k_m})= P(X_{n+1}=x_{n+1}~|~X_{k_m}=x_{k_m})$$
for all $x_{k_1},\ldots,x_{k_m},~x_{n+1}$ in state space $\mathbb{X}.$
Some thoughts. I came upon this issue while working on DTMCs (my original problem was to show that $\{X_{nk}\}$ is Markov if $\{X_{n}\}$ is Markov). The Markov property is defined as $$P(X_{n+1}=x_{n+1}~|~X_{1}=x_{1},\ldots,X_{n}=x_{n})= P(X_{n+1}=x_{n+1}~|~X_{n}=x_{n}),$$ meaning that the distribution of the next state depends on the current one (and only on that). But what happens when we don't have knowledge of the immediately preceding state? Is knowledge of the last *observed* state enough?
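To convince myself numerically, here is a small sanity check (a sketch, with a made-up 2-state transition matrix `P` and initial distribution `init` chosen only for illustration). It enumerates all length-4 paths of the chain and verifies, for $n=2$, $k_1=0$, $k_2=1$, that $P(X_3=j\mid X_0=i_0, X_1=i_1)=P(X_3=j\mid X_1=i_1)$, and that both equal the two-step transition probability $(P^2)_{i_1 j}$ given by Chapman–Kolmogorov:

```python
import itertools
import numpy as np

# Hypothetical 2-state chain: transition matrix (rows sum to 1) and initial law
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
init = np.array([0.5, 0.5])  # distribution of X_0

states = [0, 1]

def path_prob(path):
    """Joint probability of a path (x0, x1, ..., xk) under the chain."""
    p = init[path[0]]
    for a, b in zip(path, path[1:]):
        p *= P[a, b]
    return p

def cond_full(j, i0, i1):
    """P(X_3 = j | X_0 = i0, X_1 = i1), marginalizing over the unseen X_2."""
    num = sum(path_prob((i0, i1, x2, j)) for x2 in states)
    den = sum(path_prob((i0, i1, x2, x3)) for x2 in states for x3 in states)
    return num / den

def cond_last(j, i1):
    """P(X_3 = j | X_1 = i1), marginalizing over X_0 and X_2."""
    num = sum(path_prob((x0, i1, x2, j)) for x0 in states for x2 in states)
    den = sum(path_prob((x0, i1, x2, x3))
              for x0 in states for x2 in states for x3 in states)
    return num / den

P2 = P @ P  # two-step transition matrix (Chapman-Kolmogorov)
for i0, i1, j in itertools.product(states, repeat=3):
    assert abs(cond_full(j, i0, i1) - cond_last(j, i1)) < 1e-12
    assert abs(cond_last(j, i1) - P2[i1, j]) < 1e-12
print("generalized Markov property holds for this chain")
```

Of course this only checks one particular chain and one choice of indices, not the general claim, but it matches the intuition that conditioning on a subset of past states reduces to conditioning on the latest of them.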
Thanks in advance for the help!