In http://isfaserveur.univ-lyon1.fr/~stephane.loisel/prerequis_esp_cond.pdf, the author recalls the definition of conditional probability associated with Bayes’ Rule,
$P(A \mid B) \equiv \frac{P(A \cap B)}{P(B)}.$
For a discrete random variable $X$, we have $P(A) = \sum_x P(A, X = x) = \sum_x P(A \mid X = x)\,P(X = x)$,
and the resulting formula for conditional expectation is $E(Y \mid X = x) = \int_\Omega Y(\omega)\,P(d\omega \mid X = x) = \frac{\int_{\{X=x\}} Y(\omega)\,P(d\omega)}{P(X = x)} = \frac{E\left(Y\,1_{\{X=x\}}\right)}{P(X = x)}$.
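To make sure I am reading these formulas correctly, here is a small example I made up myself (it is not in the PDF): let $X$ be the result of the first of two independent fair coin tosses ($1$ for heads, $0$ for tails) and let $Y$ be the total number of heads. Then the discrete formula gives
$$P(Y \ge 1) = P(Y \ge 1 \mid X = 0)\,P(X = 0) + P(Y \ge 1 \mid X = 1)\,P(X = 1) = \tfrac12 \cdot \tfrac12 + 1 \cdot \tfrac12 = \tfrac34,$$
and the conditional-expectation formula gives
$$E(Y \mid X = 1) = \frac{E\left(Y\,1_{\{X=1\}}\right)}{P(X = 1)} = \frac{2 \cdot \tfrac14 + 1 \cdot \tfrac14}{\tfrac12} = \tfrac32,$$
which matches the direct computation $E(Y \mid X = 1) = 1 + \tfrac12$.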
I was wondering why $\int_\Omega Y(\omega)\,P(d\omega \mid X = x) = \frac{\int_{\{X=x\}} Y(\omega)\,P(d\omega)}{P(X = x)}$ holds in the last equation.
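For what it is worth, on a discrete (countable) sample space I can get this equality directly from the definition of the conditional measure $P(\cdot \mid X = x) = P(\,\cdot \cap \{X = x\})/P(X = x)$, namely
$$\int_\Omega Y(\omega)\,P(d\omega \mid X = x) = \sum_{\omega \in \Omega} Y(\omega)\,\frac{P(\{\omega\} \cap \{X = x\})}{P(X = x)} = \frac{\sum_{\omega:\,X(\omega) = x} Y(\omega)\,P(\{\omega\})}{P(X = x)} = \frac{\int_{\{X=x\}} Y(\omega)\,P(d\omega)}{P(X = x)},$$
but I do not see how to justify the step in the general (not necessarily discrete) setting.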
Thanks and regards!