I'm studying Stochastic Processes by Richard F. Bass. Within this book I encountered the definition of a Markov process, which is given as follows:
We are given a separable metric space $S$ endowed with its Borel $\sigma$-field and a measurable space $(\Omega, \mathcal{F})$ together with a filtration $\{\mathcal{F}_t\}$. Then
Definition 19.1 A Markov process $(X_t , \Bbb{P}^x)$ is a stochastic process $X : [0, \infty) \times \Omega \to S$ and a family of probability measures $\{\Bbb{P}^x : x \in S\}$ on $(\Omega, \mathcal{F})$ satisfying the following.
- For each $t$, $X_t$ is $\mathcal{F}_t$ measurable.
- For each $t$ and each Borel subset $A$ of $S$, the map $x \mapsto \Bbb{P}^x (X_t \in A)$ is Borel measurable.
- For each $s, t \geq 0$, each Borel subset $A$ of $S$, and each $x \in S$, we have $ \Bbb{P}^x(X_{s+t} \in A \mid \mathcal{F}_s) = \Bbb{P}^{X_s} (X_t \in A), \quad \Bbb{P}^x\text{-a.s.}$
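(To fix ideas, here is the example I keep in mind; it is my own illustration rather than something stated at this point in the book. Take $S = \Bbb{R}^d$, $\Omega = C([0,\infty), \Bbb{R}^d)$, $X_t(\omega) = \omega(t)$, $\mathcal{F}_t = \sigma(X_s : s \leq t)$, and let $\Bbb{P}^x$ be Wiener measure started at $x$, so that for $t > 0$ we have $ \Bbb{P}^x(X_t \in A) = \int_A (2\pi t)^{-d/2} e^{-|y-x|^2/2t} \, dy.$ Then $x \mapsto \Bbb{P}^x(X_t \in A)$ is Borel measurable and the displayed identity is the usual Markov property of Brownian motion, so all three conditions hold.)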
Okay, this definition was fine. Then in the next chapter, the author begins with the setting where $(X_t , \Bbb{P}^x)$ is a Markov process with respect to $\mathcal{F}_t^{00} = \sigma(X_s : s \leq t)$ whose sample paths are càdlàg with probability 1 under $\Bbb{P}^x$ for every $x \in S$. With $ \mathcal{F}_t^{0} = \sigma \left(\mathcal{F}_t^{00} \cup \{ A \subset \Omega : A \text{ is } \Bbb{P}^x\text{-null for all } x \in S\} \right) \quad \text{and} \quad \mathcal{F}_t = \mathcal{F}_{t+}^{0} = \bigcap_{\epsilon > 0} \mathcal{F}_{t+\epsilon}^{0},$
he proved the following property:
Theorem 20.6 Let $(X_t , \Bbb{P}^x)$ be a Markov process and suppose that $ \Bbb{E}^x[ f(X_{s+t}) \mid \mathcal{F}_s] = \Bbb{E}^{X_s} [ f(X_t)], \quad \Bbb{P}^x\text{-a.s.} $ holds for every bounded Borel measurable function $f$. Suppose $Y$ is bounded and measurable with respect to $\mathcal{F}_{\infty} = \bigvee_{s \geq 0} \mathcal{F}_s $. Then $ \Bbb{E}^x[Y \circ \theta_s \mid \mathcal{F}_s] = \Bbb{E}^{X_s} Y, \quad \Bbb{P}^x\text{-a.s.}$
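(The way I read this theorem: it upgrades the one-time-point Markov property to functionals of the whole future. For instance, if $Y = f(X_{t_1}, \dots, X_{t_n})$ with $f$ bounded and Borel and $t_1 < \dots < t_n$, then $Y \circ \theta_s = f(X_{s+t_1}, \dots, X_{s+t_n})$, and the conclusion reads $ \Bbb{E}^x[f(X_{s+t_1}, \dots, X_{s+t_n}) \mid \mathcal{F}_s] = \Bbb{E}^{X_s}[f(X_{t_1}, \dots, X_{t_n})], \quad \Bbb{P}^x\text{-a.s.}$)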
Up to this point there was still no problem. The author then states the Blumenthal 0-1 law, as follows:
Proposition 20.8 Let $(X_t , \Bbb{P}^x)$ be a Markov process with respect to $\{\mathcal{F}_t\}$. If $A \in \mathcal{F}_0$, then for each $x$, $\Bbb{P}^x(A)$ is equal to $0$ or $1$.
Proof. Suppose $A \in \mathcal{F}_0$. Under $\Bbb{P}^x$, $X_0 = x$ a.s., and then $\Bbb{P}^x(A) = \Bbb{E}^{X_0}\mathbf{1}_A = \Bbb{E}^x[\mathbf{1}_A \circ \theta_0 \mid \mathcal{F}_0] = \mathbf{1}_A \circ \theta_0 = \mathbf{1}_A \in \{0, 1\}, \quad \Bbb{P}^x - \text{a.s.} $ since $\mathbf{1}_A \circ \theta_0$ is $\mathcal{F}_0$ measurable. (Here the first equality uses $X_0 = x$ a.s., and the second is Theorem 20.6 applied with $s = 0$ and $Y = \mathbf{1}_A$.) Our result follows because $\Bbb{P}^x(A)$ is a real number and not random.
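(As an aside, and as my own illustration of why one wants the right-continuous filtration $\mathcal{F}_t = \mathcal{F}_{t+}^0$ here: for Brownian motion started at $0$, the event $\{\inf\{t > 0 : X_t > 0\} = 0\}$ belongs to $\mathcal{F}_0$, so by Proposition 20.8 its probability is $0$ or $1$; since $\Bbb{P}^0(X_t > 0) = 1/2$ for every $t > 0$, that probability is at least $1/2$, hence it equals $1$.)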
Coming back to the proof: I was puzzled by the claim that $X_0 = x$ a.s. under $\Bbb{P}^x$. Clearly there is no such assumption in Definition 19.1 above, so I tried to prove it from the definition. I managed to circumvent this seemingly unjustified claim in the case where there is some $x_0 \in S$ such that $\Bbb{P}^{x}(X_0 = x_0) > 0$, but my approach turned out to be inadequate for a general proof.
So here is my question: Is there an argument which either avoids or proves the claim $\Bbb{P}^x (X_0 = x) = 1$? Or is there a counterexample, so that we should simply accept this as part of the definition and add it to the incomplete Definition 19.1?