
I am reading the book Continuous Martingales and Brownian Motion by Revuz and Yor and couldn't understand the "regular conditional distribution" in the following proposition. I will quote relevant definitions first.

Definition: The space $C(\mathbb{R}^+,\mathbb{R}^d)$ is denoted by $\mathbf{W}$, where $d \geq 1$. If $w(s), s \geq 0$ denote the coordinate mappings, we set $\mathscr{B}_t = \sigma(w(s),s \leq t)$.

Definition: Given two predictable functions $f$ and $g$ with values in $d \times r$ matrices and $d$-vectors, a solution of the stochastic differential equation $e(f,g)$ is a pair $(X,B)$ of adapted processes defined on a filtered probability space $(\Omega,\mathscr{F}_t,\mathbb{P})$ such that

  1. $B$ is a standard $\{\mathscr{F}_t\}$ Brownian motion in $\mathbb{R}^r$.
  2. $X_t = X_0 + \int_0^t f(s,X) \, d B_s + \int_0^t g(s,X) \, d s$ for every $t$.

Definition: There is uniqueness in law for $e(f,g)$ if, whenever $(X,B)$ and $(X',B')$ are two solutions (with possibly different Brownian motions $B$ and $B'$) such that $X_0 \stackrel{d}{=} X_0'$, the laws of $X$ and $X'$ are equal.

Now we come to the proposition and its proof:

Proposition: There is uniqueness in law if, for every $x \in \mathbb{R}^d$, whenever $(X,B)$ and $(X',B')$ are two solutions such that $X_0 = x$ and $X_0' = x$ a.s., then the laws of $X$ and $X'$ are equal.

Proof: Let $P$ be the law of $(X,B)$ on the canonical space $C(\mathbb{R}^+,\mathbb{R}^{d+r})$. Since this is a Polish space, there is a regular conditional distribution $P(\omega,\cdot)$ for $P$ with respect to $\mathscr{B}_0$. For almost every $\omega$ the last $r$ coordinate mappings $\beta^i$ still form a $\text{BM}^r$ under $P(\omega,\cdot)$ and the integral $$ \int_0^t f(s,\xi) d \beta_s + \int_0^t g(s,\xi) d s $$ where $\xi$ stands for the vector of the first $d$ coordinate mappings, makes sense...

I stop quoting the proof here, as my question has already arisen:

Question: My understanding is that the regular conditional distribution $P(\cdot, \cdot)$ satisfies, for any measurable $A \subset \mathbf{W}$, $$ P(\omega, A) = \mathbb{E}^P [ \mathbf{1}_A \mid \mathscr{B}_0] (\omega) \text{ for almost all } \omega. $$ Then how can we deduce that "for almost every $\omega$ the last $r$ coordinate mappings $\beta^i$ still form a $\text{BM}^r$ under $P(\omega,\cdot)$"? As far as I know, for any $t_1 < t_2 < \cdots < t_k$ and $\Gamma_j \in \mathscr{B}(\mathbb{R}^r)$, $j = 1,\cdots,k$, $$ P\Big(\omega,\bigcap_j \{\beta_{t_j} \in \Gamma_j\}\Big) = \mathbb{E}^P \big[ \mathbf{1}_{\bigcap_j \{\beta_{t_j} \in \Gamma_j\}} \mid \mathscr{B}_0 \big] (\omega) = P\Big(\bigcap_j \{\beta_{t_j} \in \Gamma_j\} \,\Big|\, \mathscr{B}_0\Big) (\omega), $$ but I can't see why $$ P\Big(\bigcap_j \{\beta_{t_j} \in \Gamma_j\} \,\Big|\, \mathscr{B}_0\Big) (\omega) = P\Big(\bigcap_j \{\beta_{t_j} \in \Gamma_j\}\Big) $$ holds, so that we can conclude that $\beta$ is a $\text{BM}^r$ under $P(\omega,\cdot)$. Maybe my understanding of the regular conditional distribution is completely wrong? Any hint will be greatly appreciated!

1 Answer


First note that $B$ is independent of $\mathcal F_0 = \mathcal B_0$, by the definition of a standard Brownian motion: $B_0 = 0$ and the increments $B_t - B_0$ are independent of $\mathcal F_0$. This seems to be the main observation you are missing.

Recall that if $X$ is an integrable random variable independent of a $\sigma$-algebra $\mathcal G$, then $E[X\mid\mathcal G] = E[X]$ almost surely.
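As a quick numerical sanity check of this fact (not part of the answer's argument), the following Monte Carlo sketch estimates $P(X > 0 \mid Y \in I)$ for several intervals $I$ (a crude stand-in for conditioning on $\sigma(Y)$) and compares it with the unconditional $P(X > 0) = 1/2$; the standard normal choices for $X$ and $Y$ and the bins are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustration: if X is independent of sigma(Y), then
# E[1_{X in A} | sigma(Y)] = P(X in A) almost surely.
n = 200_000
X = rng.standard_normal(n)   # the random variable being conditioned
Y = rng.standard_normal(n)   # independent of X; generates the sigma-algebra

p_uncond = np.mean(X > 0)    # Monte Carlo estimate of P(X > 0) = 1/2

# Conditional estimates P(X > 0 | Y in (a, b]) on a few bins of Y,
# approximating conditioning on sigma(Y).
bins = [(-np.inf, -1.0), (-1.0, 0.0), (0.0, 1.0), (1.0, np.inf)]
p_cond = [np.mean(X[(Y > a) & (Y <= b)] > 0) for a, b in bins]

print(p_uncond, p_cond)  # all estimates should be close to 0.5
```

Since $X$ is independent of $Y$, every conditional estimate agrees with the unconditional one up to Monte Carlo error.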

So, let $\beta: C(\Bbb R_+, \Bbb R^{d+r}) \to C(\Bbb R_+, \Bbb R^{r})$ denote the last $r$ coordinate mappings and let $A$ be a Borel set in $C(\Bbb R_+, \Bbb R^{r})$. Then the preceding observations tell us that $$P(\omega, \beta^{-1}(A)) = E_P[1_{\beta^{-1}(A)} | \mathcal B_0](\omega) = P[B\in A| \mathcal B_0](\omega) = P(B\in A) \;\;\;\text{ a.s.}$$
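To connect this chain of equalities back to the proposition, one can also simulate a Brownian motion together with an independent $\mathscr F_0$-measurable initial value and check that conditioning on an $\mathscr F_0$-event leaves the law of $B_T$ unchanged; the discretization, sample sizes, and distributions below are illustrative assumptions, a sketch rather than a proof:

```python
import numpy as np

rng = np.random.default_rng(1)

# Sketch: the driving Brownian motion B is independent of F_0, so
# conditioning on an F_0-event (here {X0 > 0}) should not change the
# law of B_T, which is N(0, T).
n, steps, T = 100_000, 50, 1.0
dt = T / steps
X0 = rng.standard_normal(n)                    # F_0-measurable initial values
dB = rng.normal(0.0, np.sqrt(dt), (n, steps))  # increments, independent of X0
B1 = dB.sum(axis=1)                            # B_T for each simulated path

# Law of B_T given the F_0-event {X0 > 0} vs. the unconditional law.
m_all, v_all = B1.mean(), B1.var()
m_cond, v_cond = B1[X0 > 0].mean(), B1[X0 > 0].var()
print(m_all, m_cond, v_all, v_cond)  # means near 0, variances near T = 1
```

Both the conditional and unconditional samples show mean $\approx 0$ and variance $\approx T$, consistent with $P(B \in A \mid \mathcal B_0) = P(B \in A)$ a.s.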

  • Ah! Since the probability space in question is now the Wiener space, I was careless enough to overlook it. Thanks very much! (2017-01-31)