
Assume $\xi$ and $\eta$ are random variables. Then, for any $A\in \mathcal{B}(\mathbb R)$ and $B\in \mathcal{B}(\mathbb R^2)$, do we have the following equality? $$\int_{A}E[I_B(\xi,\eta)\mid \xi=x]\,P_\xi(\mathrm dx)=\int_{A}E[I_B(x,\eta)\mid \xi=x]\,P_\xi(\mathrm dx)$$ If so, can we replace the indicator function by a more general function?

I have been learning about conditional expectation recently and am not yet familiar with it. Maybe this question is trivial, but I don't know which definition or property would prove it.

Thank you!

  • It is a really tough concept! I'll have to look it through at some point to see if my top comment makes sense or not in relation to did's answer. (2012-12-06)

2 Answers


Let me advise the most extreme caution when manipulating the expression on the RHS, so much so that it is unclear how one could even define it correctly. To see the problem, let us first recall briefly how conditional expectations like the one on the LHS are defined.

For every integrable random variable $\zeta$, one knows that there exists a measurable function $v$ such that $\mathbb E(\zeta\mid\xi)=v(\xi)$ almost surely. This function $v$ is defined uniquely up to Borel sets of probability zero for $\mathbb P_\xi$ by the condition that, for every Borel set $A$, $\mathbb E(\zeta\,;\,\xi\in A)=\mathbb E(v(\xi)\,;\,\xi\in A)$, and $v(\xi)$ is called, by an abuse of language, the conditional expectation of $\zeta$ conditionally on $\xi$ (or, more appropriately, a version of the conditional expectation of $\zeta$ conditionally on $\xi$, since every measurable function $\bar v$ such that $[v\ne\bar v]$ has measure zero for $\mathbb P_\xi$ also fulfills the conditions above).

In particular, if $\zeta=\mathbf 1_{(\xi,\eta)\in B}$, the LHS of the identity written in the question is $\mathbb E(v(\xi)\,;\,\xi\in A)$. One sees that this does not depend on the choice of $v$ since, if $v$ and $\bar v$ are measurable and $[v\ne\bar v]$ has measure zero for $\mathbb P_\xi$, then $[v(\xi)\ne\bar v(\xi)]$ has measure zero for $\mathbb P$, hence $\mathbb E(v(\xi)\,;\,\xi\in A)$ and $\mathbb E(\bar v(\xi)\,;\,\xi\in A)$ coincide.
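To make this version-independence concrete, here is a small numerical sketch (my own illustration, with made-up distributions, not part of the argument above): $\xi$ takes finitely many values, $v$ and $\bar v$ differ only on a $\mathbb P_\xi$-null point, and the two truncated expectations coincide exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

# xi takes values in {0, 1, 2}; it never hits 5, so {5} is a P_xi-null set.
xi = rng.choice([0, 1, 2], size=100_000, p=[0.5, 0.3, 0.2])

def v(x):
    # one version of the conditional expectation (only values at 0, 1, 2 matter)
    return np.where(x == 0, 10.0, np.where(x == 1, 20.0, 30.0))

def v_bar(x):
    # another version: agrees with v except on the P_xi-null set {5}
    return np.where(x == 5, -999.0, v(x))

A = [0, 2]                       # a Borel set, here a finite set of atoms
in_A = np.isin(xi, A)

lhs = (v(xi) * in_A).mean()      # empirical E(v(xi); xi in A)
lhs_bar = (v_bar(xi) * in_A).mean()

assert lhs == lhs_bar            # the choice of version does not matter
print(lhs, lhs_bar)
```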

So far, so good. Now, to the RHS. How to interpret $\mathbb P((x,\eta)\in B\mid\xi=x)$? Honestly, I do not know!

In the preceding case, we saw that $\mathbb P((\xi,\eta)\in B\mid\xi=x)$ is nothing more than a shorthand for $v(x)$ where the measurable function $v$ is such that $\mathbb P((\xi,\eta)\in B\mid\xi)=v(\xi)$ almost surely, and we explained why, pointwise, $v(x)$ is certainly not unique (except if $\mathbb P_\xi$ has an atom at $x$) and why this indetermination of $v$ does not matter.

But in the present case, should we understand that $\mathbb P((x,\eta)\in B\mid\xi=x)=v_x(x)$ where $v_x$ is such that $\mathbb P((x,\eta)\in B\mid\xi)=v_x(\xi)$ almost surely? But then each $v_x(x)$ can be anything, to the point that the function $x\mapsto v_x(x)$ may even be non-measurable...

  • David Williams, *Probability with martingales*. (2012-12-07)

Let me try and follow the path of the book I mentioned in the comments. This is not an answer (it was too long to fit in a comment), but hopefully it will shed some light on the problem.

Let $(\Omega,\mathcal{F},P)$ be a probability space and $(M,\mathcal{B})$ a measurable space. Let $T:\Omega\to M$ be $(\mathcal{F},\mathcal{B})$-measurable and let $X\in L(P)$. Then we define the conditional expectation of $X$ given $T=t$ to be any measurable function $\varphi: M\to\mathbb{R}$ satisfying $ \int_B\varphi(t)\, P_T(\mathrm dt)=\int_{T^{-1}(B)}X\,\mathrm dP,\quad B\in\mathcal{B}.\qquad (*) $ We will write $E[X\mid T=t]:=\varphi(t)$ for any measurable solution to $(*)$. Then $E[X\mid T]=\varphi(T)$ whenever $\varphi(t)=E[X\mid T=t]$ for all $t\in M$, and $\varphi(t)$ is unique for $P_T$-a.a. $t$.
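On a finite probability space the defining relation $(*)$ can be checked directly: there, $\varphi(t)=E[X;\,T=t]/P(T=t)$ at every atom of $P_T$. The following sketch (my own, with arbitrary made-up weights) verifies $(*)$ for several sets $B$.

```python
import numpy as np

# Finite probability space Omega = {0,...,5} with weights P
P = np.array([0.1, 0.2, 0.1, 0.25, 0.15, 0.2])
X = np.array([1.0, 4.0, -2.0, 3.0, 0.5, 2.0])   # an integrable random variable
T = np.array([0, 0, 1, 1, 2, 2])                 # T maps Omega into M = {0, 1, 2}

# phi(t) = E[X; T=t] / P(T=t): the elementary conditional expectation
phi = {t: (P[T == t] * X[T == t]).sum() / P[T == t].sum() for t in np.unique(T)}

# Check (*): for every B subset of M,
#   int_B phi dP_T  ==  int_{T^{-1}(B)} X dP
for B in [{0}, {1}, {0, 2}, {0, 1, 2}]:
    mask = np.isin(T, list(B))
    lhs = sum(phi[t] * P[T == t].sum() for t in B)
    rhs = (P[mask] * X[mask]).sum()
    assert abs(lhs - rhs) < 1e-12

print("(*) holds for all tested B")
```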

Let us turn to regular conditional distributions. Let $(L,\mathcal{A})$ be another measurable space and $S:\Omega\to L$ be $(\mathcal{F},\mathcal{A})$-measurable. A regular conditional distribution of $S$ given $T$ is a Markov kernel $P_S^T$ on $(L,\mathcal{A}\mid M,\mathcal{B})$ such that $P_S^T(A\mid t)$ is a conditional distribution of $S$ given $T$, i.e.

1) $P_S^T(\cdot\mid t)$ is a probability measure on $(L,\mathcal{A})$ for all $t\in M$,

2) $P_S^T(A\mid \cdot)$ is $\mathcal{B}$-measurable for all $A\in\mathcal{A}$,

3) $P(S\in A,T\in B)=\int_B P_S^T(A\mid t)\,P_T(\mathrm dt)$ for all $A\in\mathcal{A}$ and $B\in\mathcal{B}$.

If $\psi:L\times M\to \mathbb{R}$ is $\mathcal{A}\otimes\mathcal{B}$-measurable and $\psi(S,T)\in L(P)$, then

$E[\psi(S,T)]=\int_M \int_L \psi(s,t)\;P_S^T(\mathrm ds\mid t)\, P_T(\mathrm dt),$

$E[\psi(S,T)\mid T=t]=\int_L\psi(s,t)\, P_S^T(\mathrm ds\mid t),$

$E[\psi(S,T)\mid T=t]=E[\psi(S,t)\mid T=t]$

provided that in the last two equalities we use the regular conditional distribution $P_S^T$ to compute the conditional expectations.

Isn't it exactly the last equality we are looking for?
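For what it's worth, the disintegration identities above are easy to verify numerically in the discrete case, where $P_S^T(s\mid t)=P(S=s,T=t)/P(T=t)$. A sketch with an arbitrary made-up joint law (my own illustration, not from the book):

```python
import numpy as np

# Joint law of (S, T) on L = {0,1,2}, M = {0,1}: rows index s, columns index t
joint = np.array([[0.10, 0.15],
                  [0.20, 0.05],
                  [0.25, 0.25]])
p_T = joint.sum(axis=0)          # marginal P_T
kernel = joint / p_T             # P_S^T(s | t): each column sums to 1

def psi(s, t):
    return (s + 1.0) * (t - 0.5)  # any bounded measurable psi

S_vals, T_vals = range(3), range(2)

# E[psi(S,T)] computed directly from the joint law ...
direct = sum(psi(s, t) * joint[s, t] for s in S_vals for t in T_vals)

# ... and via the disintegration  int_M int_L psi(s,t) P_S^T(ds|t) P_T(dt).
# inner[t] is exactly E[psi(S,T) | T = t] = E[psi(S,t) | T = t].
inner = {t: sum(psi(s, t) * kernel[s, t] for s in S_vals) for t in T_vals}
disintegrated = sum(inner[t] * p_T[t] for t in T_vals)

assert abs(direct - disintegrated) < 1e-12
print(direct, disintegrated)
```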

  • @Danielsen: I think did's recommendation is perfect for you :) (2012-12-07)