I am reading the book Continuous Martingales and Brownian Motion by Revuz and Yor and could not understand the "regular conditional distribution" in the following proposition. I will quote the relevant definitions first.
Definition: The space $C(\mathbb{R}^+,\mathbb{R}^d)$ is denoted by $\mathbf{W}$, where $d \geq 1$. If $w(s), s \geq 0$ denote the coordinate mappings, we set $\mathscr{B}_t = \sigma(w(s),s \leq t)$.
Definition: Given two predictable functions $f$ and $g$ with values in $d \times r$ matrices and $d$-vectors, a solution of the stochastic differential equation $e(f,g)$ is a pair $(X,B)$ of adapted processes defined on a filtered probability space $(\Omega,\mathscr{F}_t,\mathbb{P})$ such that
- $B$ is a standard $\{\mathscr{F}_t\}$ Brownian motion in $\mathbb{R}^r$.
- $X_t = X_0 + \int_0^t f(s,X) \, d B_s + \int_0^t g(s,X) \, d s$ for every $t$.
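To get a concrete feel for what a solution of $e(f,g)$ looks like, here is a minimal Euler–Maruyama sketch (my own illustration, not from the book) with the hypothetical choice $d = r = 1$, $f(s,X) = \sigma$ constant and $g(s,X) = -\theta X_s$, i.e. an Ornstein–Uhlenbeck process:

```python
import numpy as np

rng = np.random.default_rng(0)

def euler_maruyama(x0, sigma=0.5, theta=1.0, T=1.0, n=1000):
    """Approximate one path of X_t = X_0 + int_0^t f(s,X) dB_s + int_0^t g(s,X) ds
    for the hypothetical coefficients f = sigma, g(s,X) = -theta * X_s."""
    dt = T / n
    x = np.empty(n + 1)
    x[0] = x0
    for k in range(n):
        # Brownian increment B_{(k+1)dt} - B_{k dt} ~ N(0, dt)
        dB = rng.normal(0.0, np.sqrt(dt))
        x[k + 1] = x[k] + sigma * dB - theta * x[k] * dt
    return x

path = euler_maruyama(1.0)
```

Of course, the definition above is about exact solutions on a filtered probability space; the discretization is only meant to make the pair $(X,B)$ tangible.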
Definition: There is uniqueness in law for $e(f,g)$ if, whenever $(X,B)$ and $(X',B')$ are two solutions, with possibly different Brownian motions $B$ and $B'$, such that $X_0 \stackrel{d}{=} X_0'$, the laws of $X$ and $X'$ are equal.
Now we come to the proposition and its proof:
Proposition: There is uniqueness in law if, for every $x \in \mathbb{R}^d$, whenever $(X,B)$ and $(X',B')$ are two solutions such that $X_0 = x$ and $X_0' = x$ a.s., then the laws of $X$ and $X'$ are equal.
Proof: Let $P$ be the law of $(X,B)$ on the canonical space $C(\mathbb{R}^+,\mathbb{R}^{d+r})$. Since this is a Polish space, there is a regular conditional distribution $P(\omega,\cdot)$ for $P$ with respect to $\mathscr{B}_0$. For almost every $\omega$ the last $r$ coordinate mappings $\beta^i$ still form a $\text{BM}^r$ under $P(\omega,\cdot)$ and the integral $$ \int_0^t f(s,\xi) d \beta_s + \int_0^t g(s,\xi) d s $$ where $\xi$ stands for the vector of the first $d$ coordinate mappings, makes sense...
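As a toy numerical picture of a regular conditional distribution (again my own sketch, not from the book): take a pair $(U, W)$ with $W$ independent of $U$, where $U$ plays the role of a $\mathscr{B}_0$-measurable coordinate and $W$ the role of an increment $\beta_t - \beta_0$. Binning a sample by the value of $U$ estimates the kernel $P(u,\cdot)$, and independence shows up as the binned distributions all looking alike:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400_000
u = rng.uniform(-1, 1, size=n)   # stand-in for the B_0-measurable coordinate
w = rng.normal(0, 1, size=n)     # stand-in for an increment beta_t - beta_0, independent of u

# Empirical kernel P(u, .): distribution of w restricted to a thin slab of u-values.
# Since w is independent of u, every slab should give mean ~ 0 and variance ~ 1.
for lo in (-1.0, -0.5, 0.0, 0.5):
    sl = w[(u >= lo) & (u < lo + 0.5)]
    print(f"u in [{lo}, {lo + 0.5}): mean={sl.mean():+.3f}, var={sl.var():.3f}")
```

This is only a finite-sample caricature of the measure-theoretic statement, but it is the phenomenon the proof is exploiting when it says the $\beta^i$ are still a $\text{BM}^r$ under $P(\omega,\cdot)$.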
I stop quoting the proof here, since my question already arises at this point:
Question: My understanding is that the regular conditional distribution $P(\cdot, \cdot)$ satisfies, for every measurable $A \subset \mathbf{W}$, $$ P(\omega, A) = \mathbb{E}^P [ \mathbf{1}_A \mid \mathscr{B}_0] (\omega) \quad \text{for almost all } \omega. $$ How, then, can we deduce that "for almost every $\omega$ the last $r$ coordinate mappings $\beta^i$ still form a $\text{BM}^r$ under $P(\omega,\cdot)$"? As far as I can tell, for any $t_1 < t_2 < \cdots < t_k$ and $\Gamma_j \in \mathscr{B}(\mathbb{R}^r)$, $j = 1,\dots,k$, $$ P\Big(\omega, \bigcap_j \{\beta_{t_j} \in \Gamma_j\}\Big) = \mathbb{E}^P \Big[ \mathbf{1}_{\bigcap_j \{\beta_{t_j} \in \Gamma_j\}} \,\Big|\, \mathscr{B}_0\Big] (\omega) = P\Big(\bigcap_j \{\beta_{t_j} \in \Gamma_j\} \,\Big|\, \mathscr{B}_0\Big) (\omega). $$ But I cannot see why $$ P\Big(\bigcap_j \{\beta_{t_j} \in \Gamma_j\} \,\Big|\, \mathscr{B}_0\Big) (\omega) = P\Big(\bigcap_j \{\beta_{t_j} \in \Gamma_j\}\Big) $$ should hold, which is what would let us conclude that $\beta$ is a $\text{BM}^r$ under $P(\omega,\cdot)$. Maybe my understanding of the regular conditional distribution is completely wrong? Any hint would be greatly appreciated!