
In my previous question, I asked why $ E(Y |X = x) = \int_\Omega Y (\omega)P(d\omega|X = x) = \frac{\int_{X=x} Y (\omega)P(d\omega)}{P(X = x)} = \frac{E(Y \, 1_{(X=x)})}{P(X = x)} $ when $X$ is a discrete random variable and $P(X = x) \neq 0$.

Now I would like to consider the case where $X$ is a continuous random variable and its density at $x$ is not $0$, i.e. $f_X(x) \neq 0$, so that $f_{Y\mid X}(y \mid x)$ and $E(Y\mid X=x)$ can be defined. Is there a relation similar to the one above, $ E(Y \mid X = x) = \int_\Omega Y(\omega)\,P(d\omega\mid X = x) = \frac{\int_{\mathbb{R}} y\, f_{X,Y}(x,y)\, dy}{f_X(x)} = \,(?), $ that comes close to representing $E(Y \mid X = x)$ in terms of some expectation?

Thanks and regards!

  • Great. Now, you have everything you need to answer your own question. – 2011-11-08

1 Answer


Note that if $X$ is continuous, then conditioning on $\{X=x\}$ in the classical sense is no longer possible, since $\mathbb P(X=x)=0$ for all $x \in \mathbb R$.

Instead, $\mathbb{E}[Y\mid X=x]$ is defined via conditional expectations given $\sigma$-algebras: recall that the conditional expectation of $Y$ given $X$ is a random variable (denoted by $\mathbb{E}[Y\mid X]$) which is measurable with respect to $\sigma(X)$ (and has certain other properties, spelled out below). From the measurability we can deduce the existence of a measurable function $h$ (which depends on $Y$) such that $ \mathbb{E}[Y\mid X] = h(X). $ From here, we define $ \mathbb{E}[Y\mid X=x] = h(x), \quad \text{for all} \ x\in \mathbb R. $
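For reference, the "certain other properties" can be made explicit; the following is a standard characterization, sketched here assuming $Y$ is integrable: $\mathbb{E}[Y\mid X]$ is the almost surely unique $\sigma(X)$-measurable random variable satisfying $ \mathbb{E}\bigl[Y\,\mathbf{1}_A\bigr] = \mathbb{E}\bigl[\mathbb{E}[Y\mid X]\,\mathbf{1}_A\bigr] \quad \text{for all } A \in \sigma(X), $ or, equivalently, $ \mathbb{E}[Y\,g(X)] = \mathbb{E}\bigl[\mathbb{E}[Y\mid X]\,g(X)\bigr] \quad \text{for all bounded measurable } g. $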

If you want to involve the probability densities $f_X$, $f_Y$ and $f_{X,Y}$, first note that $ f_X(x)= \int_\mathbb{R} f_{X,Y}(x,y)\; dy. $ Then we can write: $ \mathbb{E}[XY]= \int_\mathbb{R} \int_\mathbb{R} x\cdot y \cdot f_{X,Y}(x,y) \; dy \; dx = \int_\mathbb{R}\underbrace{\biggl( \int_\mathbb{R} y \cdot \frac{f_{X,Y}(x,y)}{f_X(x)} \; dy \biggr)}_{=h(x)} \cdot x \cdot f_X(x)\;dx. $ Maybe this answers your question.
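To connect this back to the defining property above, the same computation goes through with any bounded measurable test function $g$ in place of the factor $x$; the following is a sketch under the assumption that the joint density $f_{X,Y}$ exists and $f_X(x)>0$ wherever it is used: $ \mathbb{E}[g(X)\,Y] = \int_\mathbb{R}\int_\mathbb{R} g(x)\, y\, f_{X,Y}(x,y)\; dy\; dx = \int_\mathbb{R} g(x)\underbrace{\biggl( \int_\mathbb{R} y\,\frac{f_{X,Y}(x,y)}{f_X(x)}\; dy \biggr)}_{=h(x)} f_X(x)\; dx = \mathbb{E}[g(X)\,h(X)]. $ Since $h(X)$ is $\sigma(X)$-measurable and satisfies this identity for every such $g$, it is (a version of) $\mathbb{E}[Y\mid X]$, and hence $ \mathbb{E}[Y\mid X=x] = h(x) = \int_\mathbb{R} y\, f_{Y\mid X}(y\mid x)\; dy = \frac{\int_\mathbb{R} y\, f_{X,Y}(x,y)\; dy}{f_X(x)}, $ which is the continuous analogue of the discrete formula in the question.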