
Let $(X_{s},\mathcal{F}_{s})$ be a stochastic process adapted to a given filtration. I was told that, in order to prove that $X$ is Markov, it suffices to prove that for every nonnegative, Borel-measurable function $f$, $$E[f(X_{t}) \vert \mathcal{F}_{s}]=g(X_{s}) \quad a.s.$$ for $\textbf{some}$ Borel function $g$. I am confused about the arbitrariness of the function $g$: shouldn't we expect that $g=P_{s,t}f$, where $P_{s,t}$ is a transition function?

So I'm wondering if the above characterization for Markov processes is correct. I know this may be a stupid question. Any help or comment is greatly appreciated.
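For intuition in the finite-state, discrete-time case, the function $g$ is indeed the transition operator applied to $f$, i.e. $g = P_{s,t}f$, where $P_{s,t}$ is the $(t-s)$-step transition matrix. A minimal numerical sketch (the chain and the numbers below are invented for illustration):

```python
import numpy as np

# Hypothetical two-state chain: in discrete time the transition
# "function" P_{s,t} is the matrix power P^(t-s), and
# E[f(X_t) | X_s = i] = (P^(t-s) f)(i), i.e. g = P_{s,t} f.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
f = np.array([5.0, -1.0])

s, t = 2, 5
P_st = np.linalg.matrix_power(P, t - s)
g = P_st @ f  # the Borel function g, tabulated on the state space

print(g)
```

Here `g[i]` is the value of $g$ at state $i$, so $E[f(X_t)\mid\mathcal{F}_s] = g(X_s)$ holds with this particular $g$.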

  • What is your definition of a Markov process? Anyway, this statement should be read as "for any $f$ there exists $g$"; does that help? (2012-08-18)
  • @NateEldredge Is it too strong of a requirement to demand $g = f$? (2012-08-18)
  • @Sasha: Yes, that definitely won't hold in general. (It holds iff $f$ is harmonic. In particular, for a finite-state irreducible Markov chain it will fail for every non-constant $f$.) (2012-08-18)
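The comment above can be checked numerically: demanding $g = f$ means $Pf = f$, i.e. $f$ is harmonic, and for an irreducible finite-state chain only constant $f$ satisfy this. A sketch with a made-up two-state chain:

```python
import numpy as np

# Irreducible two-state chain (hypothetical example).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# One-step characterization: E[f(X_{t+1}) | X_t = i] = (P f)(i), so g = P f.
# g = f would require P f = f (f harmonic).
f_nonconstant = np.array([1.0, 0.0])
f_constant = np.array([2.0, 2.0])

print(np.allclose(P @ f_nonconstant, f_nonconstant))  # fails for non-constant f
print(np.allclose(P @ f_constant, f_constant))        # constant f is harmonic
```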

1 Answer


Suppose that for $t\geq s$ and each nonnegative Borel function $f$ we have some Borel function $g$ (which may depend on $s$, $t$, and $f$) such that $$E[f(X_{t}) \,\vert\, \mathcal{F}_{s}]=g(X_{s}) \quad \mbox{a.s.}\tag1$$ Conditioning on $X_s$ in (1) gives $$E[f(X_{t}) \,\vert\, X_s]=g(X_{s}) \quad \mbox{a.s.}\tag2$$ Comparing (1) and (2), we deduce $$E[f(X_{t}) \,\vert\, \mathcal{F}_{s}]=E[f(X_{t}) \,\vert\, X_s] \quad \mbox{a.s.}\tag3 $$ Equation (3) is one of the equivalent ways of expressing the Markov property. Notice that the arbitrary function $g$ has dropped out of the picture.
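For completeness, the conditioning step from (1) to (2) is the tower property: since $\sigma(X_s)\subseteq\mathcal{F}_s$ and $g(X_s)$ is $\sigma(X_s)$-measurable, $$E[f(X_{t}) \,\vert\, X_s]=E\bigl[\,E[f(X_{t}) \,\vert\, \mathcal{F}_{s}]\,\big\vert\, X_s\bigr]=E[g(X_{s}) \,\vert\, X_s]=g(X_{s}) \quad \mbox{a.s.}$$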