2

$X,Y$ are two random variables taking values in $[1,+\infty)$. Do we have the following inequality?

$$ E\left[\frac{Y}{X}\right]\geq\frac{E[Y]}{E[X]} $$

  • 0
    What did you try? The very first impulse, on a question like this one, should be to check one or two examples. (Additionally, you probably mean *the inverse* of the RHS.) (2012-12-19)

2 Answers

6

The first inequality (the one in the revised version, with the inequality sign reversed) has no chance to be true in general; for a simple counterexample, consider $Y=\frac12X$ with $\mathbb P(X\geqslant2)=1$. The second inequality (the one in the revised version) has no chance to be true in general either: the case $Y=X^2$ would imply that $\mathbb E(X^2)\leqslant\mathbb E(X)^2$ and contradict the Cauchy-Schwarz inequality for every nondegenerate $X$.

If the random variables $X$ and $Y$ are independent and positive, then by independence, $\mathbb E\left(\frac{Y}X\right)=\mathbb E(Y)\,\mathbb E\left(\frac1X\right)$ and, in full generality, $\mathbb E\left(\frac1X\right)\geqslant\frac1{\mathbb E\left(X\right)}$ because the functions $x\mapsto x$ and $x\mapsto\frac1x$ are respectively nondecreasing and nonincreasing on $x\gt0$ (or by Jensen's inequality, since $x\mapsto\frac1x$ is also convex on $x\gt0$, but this is somewhat overkill).
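To spell out the monotonicity argument (this is the association inequality made explicit in the comments below: $\mathbb E(g(X)h(X))\leqslant\mathbb E(g(X))\,\mathbb E(h(X))$ whenever $g$ is nonincreasing and $h$ is nondecreasing), apply it with $g:x\mapsto\frac1x$ and $h:x\mapsto x$: $$ 1=\mathbb E\left(\frac1X\cdot X\right)\leqslant\mathbb E\left(\frac1X\right)\mathbb E(X), \qquad\text{hence}\qquad \mathbb E\left(\frac1X\right)\geqslant\frac1{\mathbb E(X)}. $$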

Finally, for all positive integrable independent random variables $X$ and $Y$, $$ \mathbb E\left(\frac{Y}X\right)\geqslant\frac{\mathbb E(Y)}{\mathbb E\left(X\right)}. $$
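For readers who like to check such claims numerically, here is a minimal Monte Carlo sketch (the shifted exponential is an arbitrary choice of distribution on $[1,+\infty)$; any nondegenerate choice behaves the same way):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6

# Independent X, Y on [1, +inf): the inequality should hold.
X = 1 + rng.exponential(size=n)  # arbitrary nondegenerate choice
Y = 1 + rng.exponential(size=n)
print((Y / X).mean(), ">=", Y.mean() / X.mean())  # first value exceeds second

# Dependent case Y = X^2 (the counterexample above): the inequality reverses.
Y2 = X ** 2
print((Y2 / X).mean(), "<", Y2.mean() / X.mean())  # E[X] < E[X^2]/E[X]
```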

  • 0
    I was not aware of Jensen's inequality before, thank you very much! (2012-12-19)
  • 0
    @Did Could you clarify what you mean when you say "because the functions x↦x and x↦1/x are respectively nondecreasing and nonincreasing on x>0"? If X is 0 with probability 1/2 and 1 otherwise, and the function g is a quarter circle with g(0)=1, g(1)=0, then E[g(X)]=1/2 < g(1/2). (2014-01-14)
  • 0
    @iMath And so what? If $g$ is nonincreasing and $h$ is nondecreasing then $E(g(X)h(X))\leqslant E(g(X))E(h(X))$. Try $g:x\mapsto1/x$ and $h:x\mapsto x$. (2014-01-14)
  • 0
    @Did So you're using that inequality; I don't see how Jensen is overkill but that inequality is not. (2014-01-14)
  • 0
    @iMath Nondecreasingness/nonincreasingness vs convexity, take your pick. (2014-01-14)
  • 0
    @Did, fair enough, it's a nice inequality. Do you happen to know of a list of expectation inequalities such as this? Or just other useful forms of writing expectation (an example might be the well-known $E[X]=\sum_{r=1}^{\infty} P(X \geq r)$)? (2014-01-14)
  • 0
    @iMath I doubt such a list exists in a satisfyingly complete form. (2014-01-14)
4

Let $Z=Y/X$. We then have $\mathrm{E}[XZ]=\mathrm{E}[X]\,\mathrm{E}[Z]+\mathrm{Cov}[X,Z]$. Divide both sides by $\mathrm{E}[X]$, and you get $$ \frac{\mathrm{E}[Y]}{\mathrm{E}[X]} =\frac{\mathrm{E}[XZ]}{\mathrm{E}[X]} =\frac{\mathrm{E}[X]\,\mathrm{E}[Z]+\mathrm{Cov}[X,Z]}{\mathrm{E}[X]} =\mathrm{E}\left[\frac{Y}{X}\right]+\frac{\mathrm{Cov}[X,Z]}{\mathrm{E}[X]}. $$ From this we see that the inequality holds if and only if the covariance between $X$ and $Z$ is nonpositive, which is the case when $X$ and $Y$ are independent. However, if the joint distribution of $(X,Y)$ is arbitrary, so is the joint distribution of $(X,Z)$, and the covariance can be either positive or negative: if it is positive, the opposite inequality holds.


Example: If $X$ and $Y$ are independent and $X$ is not constant (i.e. $\mathrm{Var}[X]>0$), then $$ \mathrm{Cov}[X,Z]=\mathrm{Cov}\left[X,\frac{Y}{X}\right] =\mathrm{E}[Y]\cdot\mathrm{Cov}\left[X,\frac{1}{X}\right]<0. $$

One general result that might be used to show this is $$ \mathrm{Cov}[U,V] =\mathrm{E}\big[\mathrm{Cov}[U,V|W]\big] +\mathrm{Cov}\big[\mathrm{E}[U|W],\mathrm{E}[V|W]\big] $$ for any joint distribution $(U,V,W)$: e.g. $U=X$, $V=Y/X$, $W=X$.
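Carrying this out with $W=X$ (a short sketch, assuming $X$ and $Y$ independent as in the example above): given $X$, the first argument is constant, so the conditional covariance term vanishes, and $\mathrm{E}[Y/X\mid X]=\mathrm{E}[Y]/X$, which gives $$ \mathrm{Cov}\left[X,\frac{Y}{X}\right] =\mathrm{Cov}\left[\mathrm{E}[X\mid X],\,\mathrm{E}\left[\frac{Y}{X}\,\Big|\,X\right]\right] =\mathrm{Cov}\left[X,\frac{\mathrm{E}[Y]}{X}\right] =\mathrm{E}[Y]\cdot\mathrm{Cov}\left[X,\frac{1}{X}\right]. $$ The last covariance is negative whenever $\mathrm{Var}[X]>0$, since $x\mapsto x$ and $x\mapsto 1/x$ move in opposite directions.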


Example: Let $X$ have any distribution with $\mathrm{Var}[X]>0$, and let $Y=X^2$. Then $$ \mathrm{E}\left[\frac{Y}{X}\right]=\mathrm{E}[X] <\mathrm{E}[X]+\frac{\mathrm{Var}[X]}{\mathrm{E}[X]} =\frac{\mathrm{E}[X^2]}{\mathrm{E}[X]} =\frac{\mathrm{E}[Y]}{\mathrm{E}[X]} $$ which is a counterexample to the question. This is just the same example as stated in the previous answer by did.
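Both the decomposition above and this counterexample are easy to confirm numerically (a minimal sketch; the shifted exponential is again an arbitrary nondegenerate distribution on $[1,+\infty)$):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10**6

X = 1 + rng.exponential(size=n)  # any nondegenerate choice on [1, +inf)
Y = X ** 2                       # the dependent counterexample
Z = Y / X                        # here Z = X, so Cov[X, Z] = Var[X] > 0

# Decomposition E[Y]/E[X] = E[Z] + Cov[X, Z]/E[X] from the answer above:
lhs = Y.mean() / X.mean()
rhs = Z.mean() + np.cov(X, Z)[0, 1] / X.mean()
print(lhs, "~=", rhs)        # equal up to Monte Carlo error

# Positive covariance, so the question's inequality is reversed:
print(Z.mean() < lhs)        # True
```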

  • 2
    Why is the first covariance nonnegative? (2012-12-19)
  • 0
    @did: Because I was experiencing a moment of temporary insanity... I had erroneously written Var instead of Cov, and so automatically declared it nonnegative, and then forgot to remove the inequality when I corrected Var to Cov. Will fix it now. (2012-12-19)
  • 0
    The first inequality in your post seems to be an equality. More importantly, why is the covariance of $X$ and $Z$ negative when $X$ and $Y$ are independent? Note that the fact that $X$ and $Y$ are independent does not imply that $X$ and $Z$ are independent. Next sentence: how do you know that the covariance can be either positive or negative? *Riddles in the dark...* (2012-12-20)
  • 0
    @did: Corrected the inequality to $E[XZ]=\ldots$: a remnant of the previous error. If $X$ and $Y$ are independent, the covariance between $X$ and $Z=Y/X$ is governed by the fact that $X$ and $1/X$ are negatively related. If $(X,Y)$ has an arbitrary joint distribution (i.e. not necessarily independent), then so does $(X,Z)$, and $\mathrm{Cov}[X,Z]$ can then take on any value. (2012-12-20)