
In my previous question, I asked why $$ E(Y |X = x) = \int_\Omega Y (\omega)P(d\omega|X = x) = \frac{\int_{X=x} Y (\omega)P(d\omega)}{P(X = x)} = \frac{E(Y \, 1_{(X=x)})}{P(X = x)} $$ when $X$ is a discrete random variable and $P(X = x) \neq 0$.

Now I would like to consider the case when $X$ is a continuous random variable and its density at $x$ is nonzero, i.e. $f_X(x) \neq 0$, so that $f_{Y\mid X}(y \mid x)$ and $E(Y\mid X=x)$ can be defined. Is there a relation similar to the one above, $$ E(Y |X = x) = \int_\Omega Y (\omega)P(d\omega|X = x) = \frac{\int_{\mathbb{R}} y f_{X,Y}(x,y)\, dy}{f_X(x)} = (?), $$ that comes close to representing $E(Y |X = x)$ in terms of some expectation?

Thanks and regards!

  • 1
    While we are at it, you might make up your mind between $w$ (double-u) and $\omega$ (omega). // And you know this stuff is explained competently and in detail in tons of well-written textbooks, many of them available online, don't you? (2011-11-03)
  • 0
    Didier: Thanks! I corrected it. I guess so, but I fail to find where it is explained. (2011-11-03)
  • 2
    Are you kidding? Maybe the ones suggested on [this previous question](http://math.stackexchange.com/q/27789) or on [that one](http://math.stackexchange.com/q/36941) would be a good start! Same remark already [here](http://math.stackexchange.com/q/36952). Note that the explanations [there](http://math.stackexchange.com/q/35899) basically answer your present question, but you did not bother to follow the lead formulated as a comment. Seven months later, the result is that you seem to be still bogged down in the same elementary definition problems. Oh well... (2011-11-03)
  • 0
    It is not "seem to be". It is "are". Sorry, I can't see how the explanations in the fourth link "basically answer your present question". PS: I edited my post. (2011-11-06)
  • 0
    The fourth link explains that all this is based on the **definition** of conditional expectations and how to proceed to prove what you want here, as well as in many other questions you asked about conditional expectations and distributions. You are still going in circles. // Let me try once more: take any two random variables $X$ and $Y$ with $Y$ integrable. What does it mean to say that $E(Y\mid X)=u(X)$? If I give you the distribution of $(X,Y)$, say with a density $f$, how do you prove or disprove that $E(Y\mid X)=u(X)$? (2011-11-06)
  • 0
    I have a different understanding of "What does it mean to say that $E(Y∣X)=u(X)$?" (1) If you mean proving the existence of $u$ such that $E(Y∣X)=u(X)$, then this can be proved using Problem 13.3 of Billingsley's *Probability and Measure* (see the first quote here: http://math.stackexchange.com/questions/78501/when-can-a-measurable-mapping-be-factorized). (2) If you mean some previously defined $u$ given by $u(x):=E(Y|X=x)$ under the elementary definition of conditional expectation, then $E(Y|X)=u(X)$ a.e. is proved in Section 9.6 of Williams' *Probability with Martingales*. Correct me if I am wrong. (2011-11-08)
  • 0
    "If I give you the distribution of (X,Y), say with a density f, how do you prove or disprove that E(Y∣X)=u(X)?" Do you mean the same thing as (2) in my previous comment?2011-11-08
  • 0
    The question is: if provided with the distribution of (X,Y), how does one prove or disprove that E(Y|X)=X^3+5, say? If you have no idea about *that*, all the rest is pointless. (2011-11-08)
  • 0
    $X^3+5$ is measurable wrt $\sigma(X)$. $\forall A \in \sigma(X)$, see if $\int_A (X^3+5)\,dP=\int_A Y \, dP$. How pointless is it? (2011-11-08)
  • 0
    Great. Now, you have everything you need to answer your own question. (2011-11-08)

1 Answer


Note that if $X$ is continuous, then conditioning on $\{X=x\}$ in the classical sense is no longer possible, since $\mathbb P(X=x)=0$ for all $x \in \mathbb R$.

Instead, $\mathbb{E}[Y|X=x]$ is defined via conditional expectation given a $\sigma$-algebra: recall that the conditional expectation of $Y$ given $X$ is a random variable (denoted by $\mathbb{E}[Y|X]$) which is measurable with respect to $\sigma(X)$ and satisfies $\int_A \mathbb{E}[Y|X]\; d\mathbb{P} = \int_A Y \; d\mathbb{P}$ for all $A \in \sigma(X)$. From the measurability we can deduce the existence of a measurable function $h$ (which depends on $Y$) such that $$ \mathbb{E}[Y|X] =h(X). $$ From here, we define $$ \mathbb{E}[Y|X=x] =h(x), \quad \text{for all} \ x\in \mathbb R. $$
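For instance, if $Y$ is integrable and independent of $X$, the constant function $h \equiv \mathbb{E}[Y]$ does the job: it is trivially $\sigma(X)$-measurable, and for every $A=\{X\in B\}\in\sigma(X)$, $$ \int_A \mathbb{E}[Y]\; d\mathbb{P} = \mathbb{E}[Y]\,\mathbb{P}(A) = \mathbb{E}\bigl[Y\,\mathbf{1}_A\bigr] = \int_A Y \; d\mathbb{P}, $$ using independence in the middle equality. Hence $\mathbb{E}[Y|X]=\mathbb{E}[Y]$ almost surely, and $\mathbb{E}[Y|X=x]=\mathbb{E}[Y]$ for all $x$.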

If you want to involve the probability densities $f_X$, $f_Y$ and $f_{X,Y}$, first note that $$ f_X(x)= \int_\mathbb{R} f_{X,Y}(x,y)\; dy. $$ Then we can write $$ \mathbb{E}[XY]= \int_\mathbb{R} \int_\mathbb{R} x\cdot y \cdot f_{X,Y}(x,y) \; dy \; dx = \int_\mathbb{R}\underbrace{\biggl( \int_\mathbb{R} y \cdot \frac{f_{X,Y}(x,y)}{f_X(x)} \; dy \biggr)}_{=h(x)} \cdot x \cdot f_X(x)\;dx, $$ which identifies $$ h(x) = \int_\mathbb{R} y \cdot \frac{f_{X,Y}(x,y)}{f_X(x)} \; dy = \int_\mathbb{R} y \, f_{Y\mid X}(y\mid x)\; dy, \qquad f_{Y\mid X}(y\mid x) := \frac{f_{X,Y}(x,y)}{f_X(x)}. $$ Maybe this answers your question.
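The identification of $h$ can also be verified directly against the defining property, rather than only through $\mathbb{E}[XY]$ (a sketch, using nothing but the definition of the joint density): for any Borel set $B\subseteq\mathbb{R}$ and $A=\{X\in B\}\in\sigma(X)$, $$ \int_A h(X)\; d\mathbb{P} = \int_B h(x)\,f_X(x)\; dx = \int_B \int_\mathbb{R} y\, f_{X,Y}(x,y)\; dy\; dx = \mathbb{E}\bigl[Y\,\mathbf{1}_{\{X\in B\}}\bigr] = \int_A Y\; d\mathbb{P}, $$ so $h(X)=\mathbb{E}[Y|X]$ almost surely. As a concrete instance (a toy density chosen here only for illustration), $f_{X,Y}(x,y)=x+y$ on $[0,1]^2$ gives $f_X(x)=x+\tfrac12$ and $h(x)=\bigl(\tfrac{x}{2}+\tfrac13\bigr)\big/\bigl(x+\tfrac12\bigr)$ for $0\le x\le 1$.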