
OK, I already posted this a minute ago, but my text deleted itself while I was editing it :-( So, next try:

Can you help me understand the notation my professor uses to describe Markov processes? Let $(S,\Sigma)$ be a measurable space. We denote the Markov property as $$\mathbb{P}(X_{n+1}=x_{n+1}\mid X_0=x_0,\dots,X_n=x_n)=P(x_n,x_{n+1}),$$ so apparently $P$ denotes the transition matrix, here depending only on $x_n$ and $x_{n+1}$. But then we spoke about the Chapman-Kolmogorov relation, which we denoted by $$P_{s+t}(x,B)=\int P_s (x,dy)P_t (y,B),$$ where $x\in S$ and $B\in \Sigma$. Have you seen this notation, and can you explain to me what it means? (Unfortunately, it was not explained in the lecture.)

Thank you very much!

1 Answer


This uses two conventions at once:

  • For every integer $t\geqslant0$, $P_t$ is the transition matrix of $t$ steps of the Markov chain with transition matrix $P$. Hence $P_0$ is the identity and $P_{t+1}=P_tP=PP_t$ for every $t\geqslant0$.
  • For every transition matrix $P$ and every $B$ in $\Sigma$, $P(x,B)=\sum\limits_{y\in B}P(x,y)$.

In the discrete setting, which seems to be the one considered here, the Chapman-Kolmogorov equation rather reads $P_{s+t}(x,B)=\sum\limits_{y\in S}P_s(x,y)P_t(y,B)$.
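In the discrete setting this identity can be checked numerically: $P_s$ and $P_t$ are just matrix powers of $P$, and $P_t(y,B)$ is a row sum over the states in $B$. A minimal sketch (the particular $3\times 3$ matrix $P$ and the choices of $x$, $B$, $s$, $t$ are arbitrary, chosen only for illustration):

```python
import numpy as np

# An arbitrary 3-state transition matrix (rows sum to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

s, t = 2, 3
P_s = np.linalg.matrix_power(P, s)      # s-step transition matrix
P_t = np.linalg.matrix_power(P, t)      # t-step transition matrix
P_st = np.linalg.matrix_power(P, s + t) # (s+t)-step transition matrix

x = 0          # starting state
B = [1, 2]     # a subset of the state space, playing the role of B

# P_t(y, B) = sum_{z in B} P_t(y, z), computed for every y at once.
PtB = P_t[:, B].sum(axis=1)

# Chapman-Kolmogorov: P_{s+t}(x, B) = sum_y P_s(x, y) P_t(y, B).
lhs = P_st[x, B].sum()
rhs = P_s[x, :] @ PtB
assert np.isclose(lhs, rhs)
```

The assertion passes because $P_{s+t}=P_sP_t$ as matrices, which is exactly the discrete Chapman-Kolmogorov equation.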

The integral notation is often used to describe Markov processes on continuous state spaces (but the Markov property as you wrote it in your post definitely applies to the discrete setting).

Edit: When the state space is continuous, each $P_t(x,\cdot)$ is actually a probability measure on $(S,\Sigma)$, which is defined, for every $B$ in $\Sigma$, by $$ P_t(x,B)=\mathrm P(X_t\in B\mid X_0=x). $$ In other words, $P_t(x,\cdot)$ is the distribution of $X_t$ conditionally on the event $[X_0=x]$ hence, for every measurable bounded function $\varphi$ on $S$, $$ \mathrm E(\varphi(X_t)\mid X_0=x)=\int_S\varphi(y)P_t(x,\mathrm dy). $$ The Chapman-Kolmogorov notation you recall in your post means that $$ \mathrm P(X_{t+s}\in B\mid X_0=x)=\int_S\mathrm P(X_{t+s}\in B\mid X_t=y)\,P_t(x,\mathrm dy), $$ since, by stationarity, $$ \mathrm P(X_{t+s}\in B\mid X_t=y)=\mathrm P(X_s\in B\mid X_0=y)=P_s(y,B). $$
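To make the continuous-state version concrete, here is a hedged numerical check under one specific assumption not in the post: take the process to be Brownian motion on $S=\mathbb R$, so that $P_t(x,\cdot)$ is the $\mathcal N(x,t)$ distribution (chosen only because its kernels are explicit). Then $\int P_s(x,\mathrm dy)P_t(y,B)$ becomes an ordinary integral of $P_t(y,B)$ against the Gaussian density of $P_s(x,\cdot)$, approximated below by a Riemann sum:

```python
import numpy as np
from math import erf, sqrt

def Phi(z):
    # Standard normal CDF.
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

Phi_vec = np.vectorize(Phi)

def P_kernel(t, x, a, b):
    # P_t(x, B) for B = [a, b] under the Gaussian kernel N(x, t)
    # (the Brownian-motion assumption made above).
    return Phi_vec((b - x) / sqrt(t)) - Phi_vec((a - x) / sqrt(t))

def density(t, x, y):
    # Density of P_t(x, dy) with respect to Lebesgue measure.
    return np.exp(-(y - x) ** 2 / (2 * t)) / np.sqrt(2 * np.pi * t)

s, t, x = 0.7, 1.3, 0.2
a, b = -1.0, 1.5                     # B = [a, b]

# Right-hand side: integral of P_t(y, B) P_s(x, dy), on a fine grid.
y = np.linspace(x - 10, x + 10, 20001)
dy = y[1] - y[0]
rhs = np.sum(density(s, x, y) * P_kernel(t, y, a, b)) * dy

# Left-hand side: P_{s+t}(x, B).
lhs = P_kernel(s + t, x, a, b)
assert abs(lhs - rhs) < 1e-5
```

The two sides agree because convolving $\mathcal N(x,s)$ with the $\mathcal N(\cdot,t)$ kernel gives $\mathcal N(x,s+t)$, which is Chapman-Kolmogorov for this particular family of kernels.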

  • Thank you for your answer! We are also treating Markov processes in continuous time; only the introduction was in the discrete-time setting. Hence it is essential that I understand the integral notation. What does $P(x,dy)$ mean? (2012-04-28)
  • About your edit: so the integral is just a Lebesgue integral with respect to the probability measure $P_t(x,\cdot)$? But why do we write $\int P_s(x,dy)P_t(y,B)$ and not $\int P_t(y,B)P_s(x,dy)$, as one usually would for a Lebesgue integral? (2012-04-28)
  • Or is that just in order to confuse me? :-) (2012-04-28)
  • Many people use $\int\varphi(y)\mathrm d\mu(y)$ and $\int\mathrm d\mu(y)\varphi(y)$ (as well as a few other variations) interchangeably for the (Lebesgue) integral of the function $\varphi$ with respect to the measure $\mu$. (2012-04-28)
  • About your first comment: continuous time is not an issue for the notation $P(x,dy)$; rather, it is the fact that the **space** is continuous. (2012-04-28)
  • If the transition functions are of the form $P_{s,t}(x,A)$, $s<t$, $x\in S$, $A\in\Sigma$, and the Chapman-Kolmogorov equation reads $$P_{s,t}(x,A)=\int P_{s,u}(x,dy)\,P_{u,t}(y,A)$$ for every $s<u<t$, $x\in S$, $A\in\Sigma$, what does $P_{s,t}(x,A)$ mean? (2015-01-12)
  • @rodie $$P(X_t\in A\mid X_s=x)$$ (2015-01-12)
  • Thank you for your answer! (2015-01-13)