
This comes from the book "A Probability Path". I'm just working through the problems trying to get a grasp of conditional expectations.

Suppose $X,Y$ are random variables with finite second moments such that for some decreasing function $f$ we have $E(X|Y)=f(Y)$. Show that $\operatorname{Cov}(X,Y)\leq 0$.

I can't figure out how to relate $E[X\mid Y]$ to $E[XY]$, which was my approach. Could I get a hint for this, or for how the decreasing function affects $E[X]$, $E[Y]$, and $E[XY]$? Is there some bounding argument involved?

  • $E[X\mid Y]$ is the minimum-mean-square-error (MMSE) estimate of $X$ given the value of $Y$, and it is given to be a decreasing function. The _linear_ MMSE estimator of $X$ given the value of $Y$ is a straight line (passing through the mean point $(\mu_Y, \mu_X)$) whose slope has the same _sign_ as $\operatorname{Cov}(X,Y)$. So intuitively one expects that $\operatorname{Cov}(X,Y) \le 0$, so that the straight line is roughly "parallel" to $f(y)$. I think @Henry's answer formalizes this idea as $\operatorname{Cov}(Y, f(Y)) \leq 0$. (2011-12-02)
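Not part of the original comment, but a small numerical illustration of this intuition: under an assumed toy model (none of this is specified in the post) with $Y \sim N(0,1)$ and the decreasing choice $f(y) = e^{-y}$, the LMMSE slope $\operatorname{Cov}(X,Y)/\operatorname{Var}(Y)$ indeed comes out negative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model (an assumption, not from the post): E[X|Y] = f(Y) = exp(-Y),
# a decreasing function, plus noise with conditional mean zero.
n = 500_000
y = rng.standard_normal(n)
x = np.exp(-y) + rng.standard_normal(n)

# The slope of the linear MMSE line has the same sign as Cov(X, Y).
cov_xy = np.cov(x, y)[0, 1]
slope = cov_xy / np.var(y)
print(cov_xy, slope)  # both negative (about -1.65 each, since Var(Y) = 1)
```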

1 Answer


You could try something like the Law of total expectation to get $E[XY]=E\big[E[XY\mid Y]\big]=E\big[Y\,E[X\mid Y]\big]=E[Y\,f(Y)]$.
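Spelling out the reduction this hint points at: since $E[X]=E[E[X\mid Y]]=E[f(Y)]$,

$$\operatorname{Cov}(X,Y)=E[XY]-E[X]\,E[Y]=E[Y\,f(Y)]-E[f(Y)]\,E[Y]=\operatorname{Cov}(Y,f(Y)).$$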

Added: So we now want to show $\operatorname{Cov}(Y,f(Y))\leq 0$. Let $Z$ be independent of $Y$ but with the same distribution as $Y$. Then $(Y-Z)(f(Y)-f(Z)) \le 0$, since $f$ is decreasing: whenever $Y > Z$ we have $f(Y) \le f(Z)$, and vice versa, so the two factors never have the same sign.

So $E[(Y-Z)(f(Y)-f(Z))]\le 0$. Since $E[Y]=E[Z]$ and $E[f(Y)]=E[f(Z)]$, this is the same as $E\big[\big((Y-E[Y])-(Z-E[Z])\big)\big((f(Y)-E[f(Y)])-(f(Z)-E[f(Z)])\big)\big]\le 0$. Expanding, the two cross terms vanish because $Y$ and $Z$ are independent, and the two remaining terms are equal because $Y$ and $Z$ are identically distributed, so $2\,E[(Y-E[Y])(f(Y)-E[f(Y)])]\le 0$, which is $\operatorname{Cov}(Y,f(Y))\leq 0$.
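Not part of the original answer, but a quick Monte Carlo sanity check of this identity and its sign, assuming purely for illustration that $Y \sim N(0,1)$ and taking the decreasing choice $f(y) = -y^3$:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(t):
    # Any decreasing function works; -t**3 is chosen purely for illustration.
    return -t**3

n = 1_000_000
y = rng.standard_normal(n)          # Y ~ N(0,1) (an assumed toy distribution)
z = rng.standard_normal(n)          # independent copy Z with the same law

lhs = np.mean((y - z) * (f(y) - f(z)))   # estimates E[(Y-Z)(f(Y)-f(Z))]
cov = np.cov(y, f(y))[0, 1]              # estimates Cov(Y, f(Y))
print(lhs, 2 * cov)                      # both <= 0, and approximately equal
```

Here $E[(Y-Z)(f(Y)-f(Z))] = 2\operatorname{Cov}(Y,f(Y)) = -2\,E[Y^4] = -6$ for standard normal $Y$, so both printed values should be close to $-6$.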

  • How do you prove then that $\operatorname{Cov}(Y,f(Y))\leq 0$ for decreasing $f$? (2011-12-02)