7

Given random variables $X$ and $Y$, is it always true that

$$E(XY)^2 \le E(X^2)E(Y^2)$$

Is it easy to prove?

  • You may want to look up the Cauchy-Schwarz inequality.
  • A technicality: you need the LHS to be well-defined in the first place, which it might not be even though the RHS always is.
  • If $X$ or $Y$ follows the Cauchy distribution, the RHS may not be defined either.
  • @cardinal: Cauchy-Schwarz implies that the left side is well-defined whenever the right side is finite.
  • @Robert: Of course that's true, but that's not the point! (The RHS need *not* be finite and, yet, in certain such cases CS will still hold and in others the RHS will be well-defined and the LHS will not.)
  • @Mario: As long as $X$ and $Y$ are *random variables* (i.e., measurable), the RHS is *always* well-defined. It simply need not be finite (your Cauchy example, for instance; see the sketch after these comments).
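To illustrate the Cauchy point from the comments above, here is a minimal sketch (my addition, using NumPy; the sample size and seed are arbitrary) showing that the running estimate of $E(X^2)$ for Cauchy samples never settles down, consistent with the second moment being infinite:

```python
# Running estimate of E(X^2) for Cauchy samples: it does not converge,
# because the second moment of the Cauchy distribution is infinite.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_cauchy(10**6)                        # heavy-tailed samples
running = np.cumsum(x**2) / np.arange(1, x.size + 1)  # running E(X^2) estimates

for n in (10**3, 10**4, 10**5, 10**6):                # the estimate keeps jumping
    print(n, running[n - 1])
```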

3 Answers

19

I'm going to assume you mean $$(E(XY))^2 \le E(X^2)E(Y^2).$$ One way to prove this is to realize it's a special case of the Cauchy--Schwarz inequality.

Here's another. Let
$$ f(t) = E((tX+Y)^2) = E(X^2)\,t^2 + 2E(XY)\,t + E(Y^2) = at^2 + bt + c, $$
where $t$ is a "constant", i.e. not random. Since $(tX+Y)^2 \ge 0$, we have $f(t) = E((tX+Y)^2) \ge 0$ for every real value of $t$. Now recall that a quadratic $at^2 + bt + c$ with real coefficients and $a \ge 0$ is non-negative for all real $t$ only if its discriminant satisfies $b^2 - 4ac \le 0$. Here
$$ b^2 - 4ac = 4(E(XY))^2 - 4E(X^2)E(Y^2), $$
so
$$ 4\bigl((E(XY))^2 - E(X^2)E(Y^2)\bigr) \le 0. $$
Divide both sides by $4$ and there you have it.
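As a sanity check on the discriminant argument, here is a minimal sketch (my addition, not part of the answer; the distributions, sample size, and seed are arbitrary choices) that estimates $a = E(X^2)$, $b = 2E(XY)$ and $c = E(Y^2)$ from simulated square-integrable random variables and confirms $b^2 - 4ac \le 0$ up to sampling error:

```python
# Monte-Carlo check of the discriminant argument (illustrative only).
import numpy as np

rng = np.random.default_rng(42)
n = 10**6
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(size=n)        # make Y correlated with X

a = np.mean(x**2)                       # estimate of E(X^2)
b = 2.0 * np.mean(x * y)                # estimate of 2*E(XY)
c = np.mean(y**2)                       # estimate of E(Y^2)

print("b^2 - 4ac =", b**2 - 4*a*c)                           # should be <= 0
print("(E[XY])^2 =", (0.5*b)**2, " E[X^2]E[Y^2] =", a*c)
```

The printed discriminant is strictly negative here; equality in the original inequality holds exactly when one of $X$, $Y$ is almost surely a scalar multiple of the other.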

  • This is a nice proof of Cauchy-Schwarz. I had forgotten about that one.
  • A concise proof; makes a lot of sense. ^_^
3

The expectation of a product of square-integrable random variables, $\langle X, Y\rangle = E(XY)$, is an inner product, so the Cauchy-Schwarz inequality applies and gives exactly that inequality. Hence the answer is yes.

See http://en.wikipedia.org/wiki/Cauchy%E2%80%93Schwarz_inequality#Probability_theory
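In case it helps to see the identification spelled out (my notation, not the answerer's): for square-integrable $X$ and $Y$, set
$$ \langle X, Y\rangle := E(XY), \qquad \|X\|^2 := \langle X, X\rangle = E(X^2), $$
and the Cauchy-Schwarz bound $|\langle X, Y\rangle| \le \|X\|\,\|Y\|$, once squared, is exactly
$$ (E(XY))^2 \le E(X^2)\,E(Y^2). $$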

0

This is known as the Cauchy-Schwarz inequality for random variables.