
From http://isfaserveur.univ-lyon1.fr/~stephane.loisel/prerequis_esp_cond.pdf:

Recall the definition of conditional probability associated with Bayes’ Rule

$P(A|B) \equiv \frac{P(A \cap B)}{P(B)}$

For a discrete random variable $X$ we have $P(A) = \sum_x P(A, X = x) = \sum_x P(A|X = x)P(X = x) $
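The law of total probability above can be checked numerically. Here is a minimal sketch with a hypothetical discrete example (the distributions `p_x` and `p_a_given_x` are made up for illustration): $X$ takes values in $\{0,1,2\}$ and the event $A$ has a different conditional probability for each value of $X$.

```python
# Hypothetical example: X takes values 0, 1, 2, and the probability of
# the event A depends on which value X takes.
p_x = {0: 0.2, 1: 0.3, 2: 0.5}          # P(X = x)
p_a_given_x = {0: 0.1, 1: 0.5, 2: 0.9}  # P(A | X = x)

# Law of total probability: P(A) = sum_x P(A | X = x) * P(X = x)
p_a = sum(p_a_given_x[x] * p_x[x] for x in p_x)
print(p_a)  # 0.2*0.1 + 0.3*0.5 + 0.5*0.9 = 0.62
```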

and the resulting formula for conditional expectation $ E(Y |X = x) = \int_\Omega Y (\omega)P(d\omega|X = x) = \frac{\int_{X=x} Y (\omega)P(d\omega)}{P(X = x)} = \frac{E(Y \, 1_{(X=x)})}{P(X = x)} $
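On a finite sample space the identity $E(Y|X=x) = E(Y\,1_{(X=x)})/P(X=x)$ can be verified directly, since every integral is a finite sum. A minimal sketch, with a made-up choice of $\Omega$, $X$, and $Y$ (exact rationals avoid floating-point noise):

```python
from fractions import Fraction

# Hypothetical finite sample space Omega = {0,...,5} with uniform P.
# X(w) = w % 2, Y(w) = w.  Check E(Y | X = 0) = E(Y * 1_{X=0}) / P(X = 0).
omega = range(6)
P = Fraction(1, 6)                 # uniform probability of each outcome
X = lambda w: w % 2
Y = lambda w: w

# Right-hand side: integrate Y over the set {X = 0}, divide by P(X = 0).
num = sum(Y(w) * P for w in omega if X(w) == 0)   # E(Y * 1_{X=0})
den = sum(P for w in omega if X(w) == 0)          # P(X = 0)
rhs = num / den

# Left-hand side: integrate Y against the conditional measure
# P(dw | X = 0) = 1_{X=0}(w) P(dw) / P(X = 0).
lhs = sum(Y(w) * (P / den) for w in omega if X(w) == 0)

print(lhs, rhs)  # both equal (0 + 2 + 4)/3 = 2
```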

I was wondering why $ \int_\Omega Y (\omega)P(d\omega|X = x) = \frac{\int_{X=x} Y (\omega)P(d\omega)}{P(X = x)}$ in the last equation?

Thanks and regards!

1 Answer


I will give you an argument that works under two additional hypotheses:

  1. $Y\in L^1$;

  2. $\mathbb{P}(X=x)\neq 0$.

We first establish this result for characteristic functions. If $Y=1_{A}$ for some measurable set $A$ we have that $ \begin{eqnarray} \int_{\Omega} 1_A(\omega)\ \mathbb{P}(d\omega|X=x) &=&\mathbb{P}(A|X=x) \\ &=& \frac{\mathbb{P}(A\cap \{X=x\})}{\mathbb{P}(X=x)}=\frac{1}{\mathbb{P}(X=x)}\int_{\{X=x\}} 1_A(\omega) \ d\mathbb{P}(\omega). \end{eqnarray} $

By linearity of the integral, the formula extends to simple functions. Now suppose $Y\in L^1$ and $Y(\omega)\geq 0$ for all $\omega\in\Omega$. Then there is a monotone increasing sequence of simple functions converging pointwise to $Y$, and by the monotone convergence theorem the formula also holds for $Y$. For general $Y\in L^1$, split it into its positive and negative parts and argue similarly.
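The base case of the argument, $\int_\Omega 1_A\,\mathbb{P}(d\omega|X=x) = \mathbb{P}(A\cap\{X=x\})/\mathbb{P}(X=x)$, can also be checked on a finite space. A minimal sketch with hypothetical choices of $\Omega$, $X$, and $A$:

```python
from fractions import Fraction

# Indicator base case on a hypothetical finite space: take Y = 1_A and
# check that the integral of 1_A against P(.|X=1) equals
# P(A intersect {X=1}) / P(X=1).
omega = range(8)
P = Fraction(1, 8)              # uniform probability of each outcome
X = lambda w: w % 2             # condition on the event {X = 1}
A = {1, 2, 3, 5}                # a measurable set

p_x1 = sum(P for w in omega if X(w) == 1)                   # P(X = 1)
lhs = sum(P / p_x1 for w in omega if w in A and X(w) == 1)  # int 1_A dP(.|X=1)
rhs = sum(P for w in omega if w in A and X(w) == 1) / p_x1  # P(A n {X=1})/P(X=1)

print(lhs, rhs)  # both equal 3/4
```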

  • @Tim: nice observation. The hypothesis $Y\in L^1$ can be replaced by the condition that either the positive or the negative part of $Y$ is integrable. (2011-10-31)