
$X,Y$ are two random variables taking values in $[1,+\infty)$. Does the following inequality hold?

$ E\left[\frac{Y}{X}\right]\geq\frac{E[Y]}{E[X]} $

  • What did you try? The very first impulse, on a question like this one, should be to check one or two examples. (Additionally, you probably mean *the inverse* of the RHS.) — 2012-12-19

2 Answers

6

The first inequality (the one in the revised version, with the inequality sign reversed) has no chance to be true in general: for a simple counterexample, consider $Y=\frac12X$ with $\mathbb P(X\geqslant2)=1$. The second inequality (the one in the revised version) has no chance to be true in general either: the case $Y=X^2$ would imply that $\mathbb E(X^2)\leqslant\mathbb E(X)^2$, contradicting the Cauchy–Schwarz inequality for every nondegenerate $X$.
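Not part of the original answer, but the $Y=X^2$ counterexample is easy to check numerically. A minimal Monte Carlo sketch in Python, using a uniform $X$ on $[1,3]$ as one illustrative (assumed) choice of nondegenerate distribution:

```python
import random

random.seed(0)
n = 100_000

# Assumed example distribution: X uniform on [1, 3] (nondegenerate, in [1, +inf)),
# and Y = X^2 as in the counterexample above.
xs = [random.uniform(1.0, 3.0) for _ in range(n)]
ys = [x * x for x in xs]

e_ratio = sum(y / x for x, y in zip(xs, ys)) / n   # E[Y/X] = E[X] = 2 here
rhs = (sum(ys) / n) / (sum(xs) / n)                # E[Y]/E[X] = E[X^2]/E[X] = 13/6

# The claimed inequality E[Y/X] >= E[Y]/E[X] fails for this pair:
assert e_ratio < rhs
```

Here $\mathbb E(X)=2$ while $\mathbb E(X^2)/\mathbb E(X)=\frac{13/3}{2}\approx2.17$, so the gap is visible well beyond sampling noise.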

If the random variables $X$ and $Y$ are independent and positive, then by independence, $\mathbb E\left(\frac{Y}X\right)=\mathbb E(Y)\mathbb E\left(\frac1X\right)$ and, in full generality, $\mathbb E\left(\frac1X\right)\geqslant\frac1{\mathbb E\left(X\right)}$ because the functions $x\mapsto x$ and $x\mapsto\frac1x$ are respectively nondecreasing and nonincreasing on $x\gt0$ (or by Jensen's inequality, since $x\mapsto\frac1x$ is also convex on $x\gt0$, but this is somewhat overkill).

Finally, for every positive integrable independent random variables $X$ and $Y$, $ \mathbb E\left(\frac{Y}X\right)\geqslant\frac{\mathbb E(Y)}{\mathbb E\left(X\right)}. $
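As a sanity check (not in the original answer), the independent case can be simulated. A short Python sketch, with shifted exponentials as an assumed pair of independent distributions on $[1,+\infty)$:

```python
import random

random.seed(1)
n = 200_000

# Assumed independent distributions on [1, +inf): X = 1 + Exp(1), Y = 1 + Exp(1/2),
# so E[X] = 2 and E[Y] = 3.
xs = [1.0 + random.expovariate(1.0) for _ in range(n)]
ys = [1.0 + random.expovariate(0.5) for _ in range(n)]

lhs = sum(y / x for x, y in zip(xs, ys)) / n   # E[Y/X]
rhs = (sum(ys) / n) / (sum(xs) / n)            # E[Y]/E[X]

# For independent positive X, Y the answer's inequality should hold:
assert lhs >= rhs
```

With these choices $\mathbb E(Y)/\mathbb E(X)=1.5$, while $\mathbb E(Y/X)=\mathbb E(Y)\,\mathbb E(1/X)\approx1.79$, so the inequality holds with a comfortable margin.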

  • @iMath I doubt such a list exists in a satisfyingly complete form. — 2014-01-14
4

Let $Z=Y/X$. We then have $\mathrm{E}[XZ]=\mathrm{E}[X]\,\mathrm{E}[Z]+\mathrm{Cov}[X,Z]$. Divide both sides by $\mathrm{E}[X]$, and you get $ \frac{\mathrm{E}[Y]}{\mathrm{E}[X]} =\frac{\mathrm{E}[XZ]}{\mathrm{E}[X]} =\frac{\mathrm{E}[X]\,\mathrm{E}[Z]+\mathrm{Cov}[X,Z]}{\mathrm{E}[X]} =\mathrm{E}\left[\frac{Y}{X}\right]+\frac{\mathrm{Cov}[X,Z]}{\mathrm{E}[X]}. $ From this we see that the inequality holds if the covariance between $X$ and $Z$ is negative, which will be the case if $X$ and $Y$ are independent. However, if the joint distribution of $(X,Y)$ is arbitrary, the joint distribution of $(X,Z)$ is arbitrary, and the covariance can be either positive or negative: if it is positive, the opposite inequality holds.
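Not in the original answer, but the covariance identity above holds exactly for sample means and covariances, so it can be verified directly. A Python sketch using the dependent pair $Y=X^2$ (so $Z=X$ and $\mathrm{Cov}[X,Z]=\mathrm{Var}[X]>0$, giving the opposite inequality), with a uniform $X$ as an assumed illustrative distribution:

```python
import random

random.seed(3)
n = 100_000

def mean(v):
    return sum(v) / len(v)

def cov(us, vs):
    mu, mv = mean(us), mean(vs)
    return mean([(u - mu) * (v - mv) for u, v in zip(us, vs)])

# Assumed example: X uniform on [1, 3] and Y = X^2, so Z = Y/X = X.
xs = [random.uniform(1.0, 3.0) for _ in range(n)]
zs = list(xs)                # Z = Y/X = X
ys = [x * x for x in xs]

# The identity E[Y]/E[X] = E[Z] + Cov[X, Z]/E[X] holds exactly in-sample:
identity_rhs = mean(zs) + cov(xs, zs) / mean(xs)
assert abs(mean(ys) / mean(xs) - identity_rhs) < 1e-9

assert cov(xs, zs) > 0                   # positive covariance here ...
assert mean(zs) < mean(ys) / mean(xs)    # ... so the opposite inequality holds
```

The positive-covariance branch reproduces the $Y=X^2$ counterexample: $\mathrm{E}[Y/X]=\mathrm{E}[X]<\mathrm{E}[Y]/\mathrm{E}[X]$.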


Example: If $X$ and $Y$ are independent and $X$ is not constant (i.e. $\mathrm{Var}[X]>0$), then $ \mathrm{Cov}[X,Z]=\mathrm{Cov}\left[X,\frac{Y}{X}\right] =\mathrm{E}[Y]\cdot\mathrm{Cov}\left[X,\frac{1}{X}\right]<0. $

One general result that might be used to show this is $ \mathrm{Cov}[U,V] =\mathrm{E}\big[\mathrm{Cov}[U,V|W]\big] +\mathrm{Cov}\big[\mathrm{E}[U|W],\mathrm{E}[V|W]\big] $ for any joint distribution $(U,V,W)$: e.g. $U=X$, $V=Y/X$, $W=X$.
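The negative-covariance claim for independent $X$ and $Y$ can likewise be checked numerically (this check is mine, not the answer's). A Python sketch with assumed independent uniform distributions, confirming both the sign of $\mathrm{Cov}[X,Y/X]$ and the factorization $\mathrm{Cov}[X,Y/X]=\mathrm{E}[Y]\cdot\mathrm{Cov}[X,1/X]$:

```python
import random

random.seed(2)
n = 200_000

def cov(us, vs):
    mu, mv = sum(us) / len(us), sum(vs) / len(vs)
    return sum((u - mu) * (v - mv) for u, v in zip(us, vs)) / len(us)

# Assumed independent distributions on [1, +inf): X ~ U[1, 4], Y ~ U[1, 2].
xs = [random.uniform(1.0, 4.0) for _ in range(n)]
ys = [random.uniform(1.0, 2.0) for _ in range(n)]

lhs = cov(xs, [y / x for x, y in zip(xs, ys)])      # Cov[X, Y/X]
rhs = (sum(ys) / n) * cov(xs, [1 / x for x in xs])  # E[Y] * Cov[X, 1/X]

assert lhs < 0 and rhs < 0        # X and 1/X are negatively related
assert abs(lhs - rhs) < 0.02      # factorization, up to Monte Carlo noise
```

Here the two sides agree at roughly $-0.23$, consistent with the exact values for these distributions.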


Example: Let $X$ have any distribution with $\mathrm{Var}[X]>0$, and let $Y=X^2$. Then $ \mathrm{E}\left[\frac{Y}{X}\right]=\mathrm{E}[X] <\mathrm{E}[X]+\frac{\mathrm{Var}[X]}{\mathrm{E}[X]} =\frac{\mathrm{E}[X^2]}{\mathrm{E}[X]} =\frac{\mathrm{E}[Y]}{\mathrm{E}[X]} $ which is a counterexample to the question. This is just the same example as stated in the previous answer by did.

  • @did: Corrected the identity to $E[XZ]=\ldots$: a remnant of the previous error. If $X$ and $Y$ are independent, the covariance between $X$ and $Z=Y/X$ is governed by the fact that $X$ and $1/X$ are negatively related. If $(X,Y)$ has an arbitrary joint distribution (i.e. not independent), then $(X,Z)$ has an arbitrary joint distribution, and $\mathrm{Cov}[X,Z]$ can then take any value. — 2012-12-20