
In a set of lecture notes with properties on conditional variance, I found this inequality:

$$ E[\operatorname{Var}(y\mid x)] \ \ge\ E[\operatorname{Var}(y\mid x,z)] $$

The intuition is clear: as you condition on more information, the expected variance gets smaller. However, I cannot find a rigorous derivation of the inequality. I don't think it is a hard one, but I cannot find the trick that leads to the result.
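As a sanity check, here is a minimal Monte Carlo sketch (assuming a toy joint distribution of my own choosing, with NumPy) that estimates both sides of the inequality by grouping on the conditioning variables:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model: x, z are independent coin flips, y depends on both plus noise.
n = 200_000
x = rng.integers(0, 2, n)
z = rng.integers(0, 2, n)
y = x + 2 * z + rng.normal(0, 1, n)

def expected_cond_var(y, keys):
    """Estimate E[Var(y | keys)] by grouping samples on the key tuple."""
    groups = {}
    for i, k in enumerate(zip(*keys)):
        groups.setdefault(k, []).append(y[i])
    # E[Var(y|K)] = sum over k of P(K = k) * Var(y | K = k)
    return sum(len(v) / len(y) * np.var(v) for v in groups.values())

ev_x = expected_cond_var(y, [x])       # estimates E[Var(y | x)]
ev_xz = expected_cond_var(y, [x, z])   # estimates E[Var(y | x, z)]
print(ev_x >= ev_xz)  # prints True on this toy model
```

For this model the estimates come out near $2$ and $1$ respectively, consistent with the inequality, though of course a simulation is no substitute for a proof.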

  • Hint: $\sigma(X,Z)\supset \sigma(X)$. (2017-01-29)
  • Thank you, but either I don't get your hint or it sounds to me just like the intuition behind it: more info, less variance. (2017-01-30)

2 Answers


Hint: By definition of conditional variance, it suffices to show: $$ E[ E(Y\mid \mathcal F)^2] \ge E[ E(Y\mid\mathcal G)^2]\qquad\text{whenever ${\mathcal G}\subset{\mathcal F}.$} $$ This in turn follows from setting $U:=E(Y\mid \mathcal G)$ and $V:=E(Y\mid\mathcal F)-E(Y\mid \mathcal G)$ in the following identity (which you should prove):
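Spelling out why this suffices: by the definition of conditional variance and the tower property,
$$ E[\operatorname{Var}(Y\mid \mathcal F)] = E[E(Y^2\mid\mathcal F)] - E[E(Y\mid\mathcal F)^2] = E[Y^2] - E[E(Y\mid\mathcal F)^2], $$
so comparing $E[\operatorname{Var}(Y\mid\mathcal G)]$ with $E[\operatorname{Var}(Y\mid\mathcal F)]$ reduces to comparing $E[E(Y\mid\mathcal F)^2]$ with $E[E(Y\mid\mathcal G)^2]$.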

Claim: If $U$ and $V$ are square integrable and $U$ is $\mathcal G$-measurable and $E(V\mid\mathcal G)=0$, then $$ E[(U+V)^2] = E[U^2] + E[V^2]. $$
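For completeness, a sketch of the Claim: expand the square and kill the cross term,
$$ E[(U+V)^2] = E[U^2] + 2\,E[UV] + E[V^2], \qquad E[UV] = E[E(UV\mid\mathcal G)] = E[U\,E(V\mid\mathcal G)] = 0, $$
where pulling $U$ out uses that it is $\mathcal G$-measurable. Note that the hypothesis $E(V\mid\mathcal G)=0$ holds for the stated choice of $V$, since the tower property (using $\mathcal G\subset\mathcal F$) gives $E(E(Y\mid\mathcal F)\mid\mathcal G) = E(Y\mid\mathcal G)$. With $U+V = E(Y\mid\mathcal F)$ the identity yields $E[E(Y\mid\mathcal F)^2] = E[E(Y\mid\mathcal G)^2] + E[V^2] \ge E[E(Y\mid\mathcal G)^2]$.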

  • Thank you for the answer. However, I cannot understand it because my measure theory knowledge is quite limited. Any good reference in order to understand, or an easier/different approach? (2017-02-07)
  • @Jack The Claim is valid if you rephrase it in terms of conditional expectations: replace conditioning on $\cal G$ by conditioning on $x$. Do you see how to modify the rest of the argument? (2017-02-07)

This follows from the conditional version of the law of total variance:

$$ \operatorname{Var}(Y\mid X) = E[\operatorname{Var}(Y\mid X,Z)\mid X] + \operatorname{Var}(E[Y\mid X,Z]\mid X) \ \ge\ E[\operatorname{Var}(Y\mid X,Z)\mid X]. $$

Take the expectation of both sides.
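Spelled out, the last step uses the tower property:
$$ E[\operatorname{Var}(Y\mid X)] \ \ge\ E\bigl[\,E[\operatorname{Var}(Y\mid X,Z)\mid X]\,\bigr] = E[\operatorname{Var}(Y\mid X,Z)]. $$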