
Assume $\xi$ and $\eta$ are random variables. For any $A\in \mathcal{B}(\mathbb R)$ and $B\in \mathcal{B}(\mathbb R^2)$, does the following equality hold? $$\int_{A}E[I_B(\xi,\eta)\mid \xi=x]\,P_\xi(\mathrm dx)=\int_{A}E[I_B(x,\eta)\mid \xi=x]\,P_\xi(\mathrm dx)$$ If so, can we replace the indicator function with a more general function?

I have been learning about conditional expectation recently and am not very familiar with it. Maybe this question is trivial, but I don't know which definition or property can be used to prove it.

Thank you!

  • In the case that no one comes up with an answer, you can have a look at _Probability With a View Towards Statistics, Volume II_ by J. Hoffmann-Jørgensen. He has an excellent treatment of regular conditional probabilities/distributions, and I'm pretty sure that he proves $E[\psi(\xi,\eta)\mid\xi=x]=E[\psi(x,\eta)\mid \xi=x]$ a.s. for suitable $\psi$. 2012-12-06
  • @StefanHansen Not sure about this, for the reasons explained in my answer. 2012-12-06
  • @StefanHansen Thanks for your advice! Regular conditional probability/distribution is quite a tough concept for me. I'll try to find the answer in the book. 2012-12-06
  • It is a really tough concept! I'll have to look it through at some point to see whether my top comment makes sense in relation to did's answer. 2012-12-06

2 Answers


Let me advise the most extreme caution when manipulating the expression on the RHS, so much so that it is unclear how one could even define it correctly. To see the problem, let us first recall briefly how conditional expectations like the one on the LHS are defined.

For every integrable random variable $\zeta$, one knows that there exists a measurable function $v$ such that $\mathbb E(\zeta\mid\xi)=v(\xi)$ almost surely. This function $v$ is defined uniquely up to Borel sets of probability zero for $\mathbb P_\xi$ by the condition that, for every Borel set $A$, $\mathbb E(\zeta\,;\,\xi\in A)=\mathbb E(v(\xi)\,;\,\xi\in A)$, and $v(\xi)$ is called, by an abuse of language, the conditional expectation of $\zeta$ conditionally on $\xi$ (or, more appropriately, a version of the conditional expectation of $\zeta$ conditionally on $\xi$, since every measurable function $\bar v$ such that $[v\ne\bar v]$ has measure zero for $\mathbb P_\xi$ also fulfills the conditions above).

In particular, if $\zeta=\mathbf 1_{(\xi,\eta)\in B}$, the LHS of the identity written in the question is $\mathbb E(v(\xi)\,;\,\xi\in A)$. One sees that this does not depend on the choice of $v$ since, if $v$ and $\bar v$ are measurable and $[v\ne\bar v]$ has measure zero for $\mathbb P_\xi$, then $[v(\xi)\ne\bar v(\xi)]$ has measure zero for $\mathbb P$, hence $\mathbb E(v(\xi)\,;\,\xi\in A)$ and $\mathbb E(\bar v(\xi)\,;\,\xi\in A)$ coincide.
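To see concretely why the choice of version does not matter, here is a minimal discrete sketch (a toy example, not from the answer; the distribution of $\xi$ and the two versions $v$, $\bar v$ are made up): the two versions disagree only on a $\mathbb P_\xi$-null set, so $\mathbb E(v(\xi)\,;\,\xi\in A)$ is unaffected.

```python
# Toy discrete illustration (hypothetical numbers): xi takes values
# 0, 1, 2 with probabilities 1/2, 1/2, 0, so {2} is a P_xi-null set.
# Two versions v and v_bar of the same conditional expectation may
# disagree there without changing E(v(xi); xi in A).

p_xi = {0: 0.5, 1: 0.5, 2: 0.0}   # distribution of xi

def v(x):                          # one version
    return {0: 1.0, 1: 3.0, 2: 7.0}[x]

def v_bar(x):                      # another version, differing on the null set {2}
    return {0: 1.0, 1: 3.0, 2: -100.0}[x]

def expect_on(A, f):               # E(f(xi); xi in A) = sum over A of f(x) P_xi({x})
    return sum(f(x) * p_xi[x] for x in p_xi if x in A)

A = {1, 2}
print(expect_on(A, v), expect_on(A, v_bar))   # both give 1.5
```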

So far, so good. Now, to the RHS. How to interpret $\mathbb P((x,\eta)\in B\mid\xi=x)$? Honestly, I do not know!

In the preceding case, we saw that $\mathbb P((\xi,\eta)\in B\mid\xi=x)$ is nothing more than a shorthand for $v(x)$ where the measurable function $v$ is such that $\mathbb P((\xi,\eta)\in B\mid\xi)=v(\xi)$ almost surely, and we explained why, pointwise, $v(x)$ is certainly not unique (except if $\mathbb P_\xi$ has an atom at $x$) and why this indetermination of $v$ does not matter.

But in the present case, should we understand that $\mathbb P((x,\eta)\in B\mid\xi=x)=v_x(x)$ where $v_x$ is such that $\mathbb P((x,\eta)\in B\mid\xi)=v_x(\xi)$ almost surely? But then each $v_x(x)$ can be anything, to the point that the function $x\mapsto v_x(x)$ may even be non measurable...

  • Thank you for your answer. The LHS is very clear, as you say. But would it be more accurate to say $$E(\xi\mid\xi\in A)=\{v(\xi)\mid\xi\in A\}$$ I think each side is a set in $\mathbb R$. For the RHS, I think we can still say there is a $v'(\xi)$, as in the case of the LHS. If we can prove $v(\xi)=v'(\xi)$ a.e., then the equality holds. Am I right? 2012-12-06
  • Wow wow wow! Hold on a minute... In any sense one can imagine, $E(\xi\mid\xi\in A)$ is a **number**, not a set (right, everything is a set ultimately, including numbers, but you see what I mean). And I cannot even fathom what you mean by the RHS of the identity in your comment. What you might want to do now is to find a competent introduction to conditional expectations and to study it. 2012-12-06
  • Yes, you are right, the RHS is meaningless. I have asked another question just now, [A problem of regular distribution](http://math.stackexchange.com/questions/252844/a-problem-of-regular-distribution), which is the origin of this question. Thank you very much! 2012-12-07
  • By the way, could you recommend some material about regular conditional probabilities/distributions? The book mentioned by StefanHansen is hard for me to find. All I have learned about the concept is from Shiryaev's _Probability_, which I find very hard. Thank you very much! 2012-12-07
  • David Williams, *Probability with Martingales*. 2012-12-07

Let me try to follow the path of the book I mentioned in the comments. This is not an answer (it was too long to fit in a comment), but hopefully it will shed some light on the problem.

Let $(\Omega,\mathcal{F},P)$ be a probability space and $(M,\mathcal{B})$ a measurable space. Let $T:\Omega\to M$ be $(\mathcal{F},\mathcal{B})$-measurable and let $X\in L(P)$. We define a conditional expectation of $X$ given $T=t$ to be any measurable function $\varphi: M\to\mathbb{R}$ satisfying $$ \int_B\varphi(t)\, P_T(\mathrm dt)=\int_{T^{-1}(B)}X\,\mathrm dP,\quad B\in\mathcal{B}.\qquad (*) $$ We write $E[X\mid T=t]:=\varphi(t)$ for any measurable solution of $(*)$. Then $E[X\mid T]=\varphi(T)$ whenever $\varphi(t)=E[X\mid T=t]$ for all $t\in M$, and $\varphi(t)$ is unique for $P_T$-a.a. $t$.
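On a finite sample space, the defining relation $(*)$ can be solved explicitly by $\varphi(t)=E[X\,;\,T=t]/P(T=t)$ whenever $P(T=t)>0$. A minimal sketch (all numbers and names are made up for illustration):

```python
# Discrete toy example of the defining relation (*): phi(t) built by
# elementary division satisfies
#   sum over B of phi(t) P_T({t}) = E[X; T in B]  for every B.

P = {0: 0.2, 1: 0.3, 2: 0.5}          # probabilities of the outcomes omega
T = {0: 'a', 1: 'a', 2: 'b'}          # the measurable map T
X = {0: 1.0, 1: 5.0, 2: 2.0}          # the integrable variable X

p_T = {}                               # image measure P_T
for w, p in P.items():
    p_T[T[w]] = p_T.get(T[w], 0.0) + p

def phi(t):                            # a version of E[X | T = t]
    return sum(X[w] * P[w] for w in P if T[w] == t) / p_T[t]

# check (*) for B = {'a'}
B = {'a'}
lhs = sum(phi(t) * p_T[t] for t in B)              # integral of phi over B
rhs = sum(X[w] * P[w] for w in P if T[w] in B)     # E[X; T in B]
print(lhs, rhs)                                    # the two agree
```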

Let us turn to regular conditional distributions. Let $(L,\mathcal{A})$ be another measurable space and $S:\Omega\to L$ be $(\mathcal{F},\mathcal{A})$-measurable. A regular conditional distribution of $S$ given $T$ is a Markov kernel $P_S^T$ on $(L,\mathcal{A}\mid M,\mathcal{B})$ such that $P_S^T(A\mid t)$ is a conditional distribution of $S$ given $T$, i.e.

1) $P_S^T(\cdot\mid t)$ is a probability measure on $(L,\mathcal{A})$ for all $t\in M$,

2) $P_S^T(A\mid \cdot)$ is $\mathcal{B}$-measurable for all $A\in\mathcal{A}$,

3) $P(S\in A,T\in B)=\int_B P_S^T(A\mid t)\,P_T(\mathrm dt)$ for all $A\in\mathcal{A}$ and $B\in\mathcal{B}$.

If $\psi:L\times M\to \mathbb{R}$ is $\mathcal{A}\otimes\mathcal{B}$-measurable and such that $\psi(S,T)\in L(P)$, then

$$E[\psi(S,T)]=\int_M \int_L \psi(s,t)\;P_S^T(\mathrm ds\mid t)\, P_T(\mathrm dt),$$

$$E[\psi(S,T)\mid T=t]=\int_L\psi(s,t)\, P_S^T(\mathrm ds\mid t),$$

$$E[\psi(S,T)\mid T=t]=E[\psi(S,t)\mid T=t]$$

provided that in the last equality we use the regular conditional distribution $P_S^T$ to compute the conditional expectations.
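As a finite-state sanity check of the first identity (a toy sketch; the joint law, $\psi$, and all numbers are made up), one can build the kernel by elementary division and verify the iterated integral numerically:

```python
# Finite-state sketch of the disintegration identity above.
# Joint distribution of (S, T) on {0,1} x {0,1}; the Markov kernel is
# P_S^T(s | t) = P(S=s, T=t) / P(T=t).

joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}  # P(S=s, T=t)

p_T = {t: sum(p for (s, u), p in joint.items() if u == t) for t in (0, 1)}
kernel = {(s, t): joint[(s, t)] / p_T[t] for (s, t) in joint}  # P_S^T(ds | t)

def psi(s, t):                         # any bounded measurable psi works
    return (s + 1) * (t + 2)

# E[psi(S,T)] computed directly from the joint law ...
lhs = sum(psi(s, t) * p for (s, t), p in joint.items())
# ... equals the iterated integral over the kernel and P_T.
rhs = sum(p_T[t] * sum(psi(s, t) * kernel[(s, t)] for s in (0, 1))
          for t in (0, 1))

print(lhs, rhs)                        # the two agree up to rounding
```

The inner sum is exactly $E[\psi(S,t)\mid T=t]$ computed with the kernel, which is the content of the last displayed equality.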

Isn't the last equality exactly the one we are looking for?

  • I think did's answer is right: the RHS is meaningless. Could you recommend some material about regular conditional probabilities/distributions? The book you mentioned is hard for me to find, and I want to study this concept more thoroughly. Thank you! 2012-12-07
  • @Danielsen: I think did's recommendation is perfect for you :) 2012-12-07