Let $\{X_t\}_{t\geq0}$ be a stochastic process with state space $I=\{1,\ldots,n\}$ (that is, $I$ is the set of values that each $X_t$ may take), and define $c(t)=(P(X_t=1),\ldots,P(X_t=n))$, the row vector of marginal probabilities at time $t$.
Suppose the following model: $c'(t)=c(t)Q$, where $Q$ is an $n\times n$ matrix such that $Q_{ii}\leq 0$, $Q_{ij}\geq0$ for $i\neq j$, and $\sum_{j=1}^n Q_{ij}=0$ for every $i$ (i.e., $Q$ has the form of a generator matrix).
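As a quick numerical sketch of this model (the generator $Q$ below is a hypothetical example, and the matrix exponential is computed by a truncated Taylor series), the solution $c(t)=c(0)e^{tQ}$ of $c'(t)=c(t)Q$ does remain a probability vector:

```python
import numpy as np

def expm(A, terms=60):
    """Matrix exponential e^A via truncated Taylor series (fine for small A)."""
    out = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        out = out + term
    return out

# Hypothetical generator: rows sum to 0, off-diagonal >= 0, diagonal <= 0.
Q = np.array([[-1.0, 0.7, 0.3],
              [ 0.4, -0.9, 0.5],
              [ 0.2, 0.8, -1.0]])

c0 = np.array([1.0, 0.0, 0.0])   # start in state 1 with probability 1
t = 2.0
c_t = c0 @ expm(t * Q)           # solution of c'(t) = c(t) Q at time t

print(c_t, c_t.sum())            # entries >= 0, summing to 1
```

Since each row of $Q$ sums to $0$, every power $Q^k$ annihilates the all-ones vector, so the rows of $e^{tQ}$ sum to exactly $1$; this is the conservation-of-probability property of the model.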
I would like to know whether $\{X_t\}_{t\geq0}$ is a continuous-time Markov chain.
Intuition: I think this should be true: for a very small $h$, $$\frac{P(X_{t+h}=j)-P(X_t=j)}{h}\approx \frac{d}{dt}P(X_t=j)=\sum_{i=1}^n P(X_t=i)\,Q_{ij}\;\;\Rightarrow$$ $$P(X_{t+h}=j)\approx\sum_{i\neq j}P(X_t=i)(h\,Q_{ij})+(1+h\,Q_{jj})P(X_t=j).$$ So, if we observe the process at time steps of length $h$, $X_{t+h}$ seems to depend on the past only through $X_t$; that is, we would have a discrete-time Markov chain.
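The approximation above can be read in matrix form as $c(t+h)\approx c(t)(I+hQ)$, and $I+hQ$ is a stochastic matrix once $h$ is small enough, which is what makes it look like a one-step transition matrix of a discrete chain. A small check with a hypothetical generator $Q$ (same kind of matrix as in the model, chosen arbitrarily for illustration):

```python
import numpy as np

# Hypothetical generator Q (off-diagonal rates >= 0, rows summing to 0).
Q = np.array([[-1.0, 0.7, 0.3],
              [ 0.4, -0.9, 0.5],
              [ 0.2, 0.8, -1.0]])

h = 0.01
P_h = np.eye(3) + h * Q   # candidate one-step transition matrix

# Rows sum to 1 (because rows of Q sum to 0), and entries are
# nonnegative as soon as h <= 1 / max_i |Q_ii|.
print(P_h.sum(axis=1))
print((P_h >= 0).all())
```

Of course this only shows that $I+hQ$ behaves like a transition matrix for the marginal distributions; it does not by itself say anything about the conditional distributions of the process.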
Question: I would like a formal proof that $\{X_t\}_{t\geq0}$ is a continuous-time Markov chain.