$X$ is a (continuous-time) Markov chain with generator matrix $\Lambda$ and finite state space $G$. I know that for $g\colon G \to \Bbb R$, $ M_t = g(X_t) - g(X_0) - \int_0^t (\Lambda g)(X_s)\, ds $ is a martingale. Is it obvious how to extend this to $f\colon[0,T] \times G \to \Bbb R$ where $f$ is $C^1$ with respect to $t$, i.e. to show that $ M_t = f(t,X_t) - f(0,X_0) - \int_0^t \left(\frac{\partial f}{\partial u}(u, X_u) +(\Lambda f)(u,X_u)\right)\, du $ is also a martingale? (Here $\Lambda$ acts on the space variable only.) Any help would be appreciated.
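Not part of the question, but the claimed martingale property can be sanity-checked by Monte Carlo. The two-state generator and the choice $f(t,x)=t\,x$ below are my own toy example; with $M_0=0$, the sample mean of $M_T$ should be close to $0$:

```python
import random

random.seed(0)

# Toy 2-state chain (states 0 and 1) with generator
# Lambda = [[-a, a], [b, -b]], and f(t, x) = t * x, which is C^1 in t.
a, b, T = 1.0, 2.0, 1.0
LAM = [[-a, a], [b, -b]]

def sample_M(x0=0):
    """One sample of M_T = f(T,X_T) - f(0,X_0) - int_0^T (df/du + Lam f)(u,X_u) du."""
    t, x, integral = 0.0, x0, 0.0
    while t < T:
        rate = -LAM[x][x]                      # exponential holding rate in state x
        t_next = min(t + random.expovariate(rate), T)
        # On [t, t_next] the chain sits in x, so the integrand is
        # df/du(u, x) + (Lam f)(u, x) = x + u * LAM[x][1],  since f(u, y) = u * y.
        integral += x * (t_next - t) + LAM[x][1] * (t_next**2 - t**2) / 2.0
        t, x = t_next, (1 - x if t_next < T else x)
    return T * x - 0.0 * x0 - integral         # f(0, x0) = 0 here

N = 100_000
mean_M = sum(sample_M() for _ in range(N)) / N
print(mean_M)  # should be close to 0 (M is a martingale with M_0 = 0)
```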
Martingale associated to Markov chain
$f$ is bounded since it is continuous in $t$ and $G$ is finite. Thanks for the reference, I will try to find it. – 2012-06-14
2 Answers
If this is still of interest, such a process can be represented as a PDMP (piecewise-deterministic Markov process) with time as one of the state components. In his book, M. Davis gives an explicit characterization of the extended generator of a PDMP together with its domain, i.e. you will find both necessary and sufficient conditions on $f$.
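Not stated in this answer, but the reduction can be sketched concretely (the notation $Y$ and $\mathcal A$ is mine): the space-time process $Y_t := (t, X_t)$ is a time-homogeneous Markov process, and for $f$ that is $C^1$ in $t$ the extended generator acts as

```latex
(\mathcal{A} f)(t,x) = \frac{\partial f}{\partial t}(t,x) + (\Lambda f)(t,x),
\qquad
(\Lambda f)(t,x) := \sum_{y \in G} \Lambda_{xy}\, f(t,y),
```

so the martingale in the question is exactly the Dynkin martingale of $Y$, and Davis's characterization of the domain then tells you which $f$ are admissible.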
We have to show that for all $t\geqslant s$, $E[M_t\mid\mathcal F_s]=M_s.$ For $u\in[0,T]$ and $x\in G$, write $g_u(x):=f(u,x)$. By the Markov property, introducing the semi-group $\{P_t\}$, this reduces to showing that for every $x\in G$, \begin{align}P_{t-s}g_t(x)-g_s(x)&=\int_s^t P_{u-s}\bigl(\partial_u g_u+\Lambda g_u\bigr)(x)\,du\\ &=\int_0^{t-s}P_v\bigl(\partial_v g_{v+s}+\Lambda g_{v+s}\bigr)(x)\,dv. \end{align} Both sides vanish at $t=s$, so it suffices to show that their derivatives with respect to $t$ agree. Let $h\neq 0$. Then, by the semi-group property $P_{t+h-s}=P_{t-s}P_h$, \begin{align} P_{t+h-s}g_{t+h}-P_{t-s}g_t&=P_{t-s}\bigl(P_hg_{t+h}-g_{t+h}\bigr)+P_{t-s}\bigl(g_{t+h}-g_t\bigr)\\ &=P_{t-s}(P_h-I)g_{t+h}+P_{t-s}(g_{t+h}-g_t). \end{align} Dividing by $h$ and letting $h\to 0$, the first term converges to $P_{t-s}\Lambda g_t$ by the definition of $\Lambda$ and the continuity of the semi-group, and the second to $P_{t-s}\partial_t g_t$, which is exactly the derivative of the right-hand side.
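Not from the original answer, but the key identity $P_{t-s}g_t - g_s = \int_s^t P_{u-s}(\partial_u g_u+\Lambda g_u)\,du$ can be checked numerically. The two-state generator and the choice $f(t,x)=x\sin t$ below are my own toy example; $P_t=e^{t\Lambda}$ is computed by a Taylor series (fine for a small matrix) and the integral by Simpson's rule:

```python
import math

# Toy 2-state generator (states 0 and 1); any finite generator works.
a, b = 1.0, 2.0
LAM = [[-a, a], [b, -b]]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_vec(A, v):
    return [sum(A[i][k] * v[k] for k in range(2)) for i in range(2)]

def expm(A, terms=60):
    """Matrix exponential of a 2x2 matrix via its Taylor series."""
    result = [[1.0, 0.0], [0.0, 1.0]]
    term = [[1.0, 0.0], [0.0, 1.0]]
    for k in range(1, terms):
        term = [[x / k for x in row] for row in mat_mul(term, A)]   # A^k / k!
        result = [[result[i][j] + term[i][j] for j in range(2)] for i in range(2)]
    return result

def g(u):                # g_u as a vector: f(u, x) = x * sin(u)
    return [0.0, math.sin(u)]

def dg(u):               # time derivative of g_u
    return [0.0, math.cos(u)]

def P(t):                # semigroup P_t = exp(t * Lambda)
    return expm([[t * LAM[i][j] for j in range(2)] for i in range(2)])

def integrand(u, s):     # P_{u-s}(dg_u + Lambda g_u), as a vector over states
    v = [dg(u)[i] + mat_vec(LAM, g(u))[i] for i in range(2)]
    return mat_vec(P(u - s), v)

def simpson(fn, s, t, n=200):    # composite Simpson's rule, n even
    h = (t - s) / n
    total = [0.0, 0.0]
    for k in range(n + 1):
        w = 1 if k in (0, n) else (4 if k % 2 else 2)
        val = fn(s + k * h, s)
        total = [total[i] + w * val[i] for i in range(2)]
    return [x * h / 3 for x in total]

s, t = 0.2, 1.0
lhs = [mat_vec(P(t - s), g(t))[x] - g(s)[x] for x in (0, 1)]
rhs = simpson(integrand, s, t)
print(max(abs(lhs[x] - rhs[x]) for x in (0, 1)))  # should be ~0
```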