
Let us consider a discrete-time Markov process $X = (X_n)_{n\geq 0}$ with values in $\mathbb{R}$ and $X_0 = x$. For each $x\in\mathbb{R}$ there is a probability measure $\mathsf P_x$. If one wants to make $X$ a martingale, a change of measure can help: namely, an $\mathcal F_n$-measurable process $Z_n$ such that $$ \left.\frac{d\mathsf Q}{d\mathsf P}\right|_{\mathcal F_n} = Z_n $$ and $X$ is a $\mathsf Q$-martingale. I am a bit confused, since for a Markov process there is a family of measures $(\mathsf P_x)_{x\in\mathbb{R}}$ rather than a single one. How does one define a change of measure for a Markov process?

I guess that here $Z_n = f(X_n)$ - but I am looking for the formal definition of change of measure.
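To make the question concrete, here is a minimal numerical sketch of the kind of change of measure I have in mind (my own illustrative example, not a definition): under $\mathsf P_x$, let $X$ be a random walk with steps $+1$ w.p. $p$ and $-1$ w.p. $1-p$, and let $Z_n$ be the likelihood ratio that re-weights each step to be symmetric. Then $Z_n$ is indeed a function of $(n, X_n)$, and $\mathsf E_{\mathsf P_x}[Z_n X_n] = \mathsf E_{\mathsf Q}[X_n] = x$, which can be checked by exact enumeration of paths:

```python
from itertools import product

def q_expectation(x0: float, p: float, n: int) -> float:
    """Compute E_Q[X_n] = E_P[Z_n * X_n] by exact enumeration of all 2^n paths.

    Under P each step is +1 w.p. p, -1 w.p. 1-p. The density Z_n multiplies,
    per step, (0.5/p) for an up-move and (0.5/(1-p)) for a down-move, so that
    under Q the steps are symmetric and X is a martingale.
    """
    total = 0.0
    for steps in product([+1, -1], repeat=n):
        prob = 1.0  # P-probability of this path
        z = 1.0     # Radon-Nikodym derivative Z_n along this path
        x = x0
        for s in steps:
            prob *= p if s == +1 else (1 - p)
            z *= (0.5 / p) if s == +1 else (0.5 / (1 - p))
            x += s
        total += prob * z * x
    return total

print(q_expectation(0.0, 0.7, 5))  # ≈ 0.0 = x0, the martingale property under Q
```

Note that along each path $\mathrm{prob}\cdot z = (1/2)^n$, so the weighted sum is exactly the expectation of a symmetric walk started at $x_0$.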

1 Answer