
Suppose $X_1,X_2,\ldots,X_n$ is a random sample from a distribution with positive values such that $E(X)=\text{Var}(X)=1$. Denote the order statistics of this random sample by $Y_1,Y_2,\ldots,Y_n$. How can we show that

  1. $E\left(\displaystyle\sum_{i=1}^n \frac{Y_i}{X_i}\right)\geq n$

  2. $E\left(\displaystyle\sum_{i=1}^n Y_iX_i\right)\leq n+n^2$
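
Before the proofs, here is a quick Monte Carlo sanity check of both claims; the lognormal sampling distribution below is only an illustrative assumption, chosen so that $E(X)=\text{Var}(X)=1$ and $E(1/X)$ is finite (so the first expectation is defined).

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 5, 200_000

# Lognormal with mu = -ln(2)/2, sigma^2 = ln(2): E(X) = Var(X) = 1, E(1/X) = 2.
sigma = np.sqrt(np.log(2))
X = rng.lognormal(mean=-np.log(2) / 2, sigma=sigma, size=(reps, n))
Y = np.sort(X, axis=1)   # order statistics of each sample

print("E[sum Y_i/X_i] ~", (Y / X).sum(axis=1).mean(), ">= n =", n)
print("E[sum Y_i*X_i] ~", (Y * X).sum(axis=1).mean(), "<= n + n^2 =", n + n**2)
```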

3 Answers


Here is a direct way to establish the second inequality, although Robert's exact result implies it.

Since the $X_i$ (and hence the $Y_k$) are positive, $$\sum_{k=1}^n Y_k X_k \leqslant \left(\sum_{k=1}^n Y_k\right)\left(\sum_{i=1}^n X_i\right) = S^2,$$ where $S = \sum_{i=1}^n X_i$. Taking the expectation, $$\mathbb{E}\left(\sum_{k=1}^n Y_k X_k\right) \leqslant \mathbb{E}\left(S^2\right) = n^2+n,$$ where $\mathbb{E}\left(S^2\right) = \text{Var}(S)+\mathbb{E}(S)^2 = n + n^2$ was used.
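
A small numerical illustration of the two steps above (the pointwise bound by $S^2$, and $\mathbb{E}(S^2)=n+n^2$); the Exponential(1) sampling distribution is just an illustrative assumption with $E(X)=\text{Var}(X)=1$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 5, 200_000

# Exponential(1) has E(X) = Var(X) = 1.
X = rng.exponential(scale=1.0, size=(reps, n))
Y = np.sort(X, axis=1)   # order statistics
S = X.sum(axis=1)

# Pointwise bound: sum_k Y_k X_k <= (sum_k Y_k)(sum_i X_i) = S^2 for every sample.
assert np.all((Y * X).sum(axis=1) <= S**2 + 1e-12)

print("E[S^2] ~", (S**2).mean(), "vs n + n^2 =", n + n**2)
```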


Let $S = X_1 + \ldots + X_n = Y_1 + \ldots + Y_n$. We have $\text{Var}(S) = E[S] = n$. Given $(Y_1,\ldots, Y_n)$, $(X_1,\ldots,X_n)$ is a random permutation of $(Y_1,\ldots,Y_n)$. Then $E[Y_i X_i \mid Y_1,\ldots,Y_n] = Y_i (Y_1 + \ldots + Y_n)/n$, so $E[Y_i X_i ] = E[Y_i (Y_1 + \ldots + Y_n)/n] = E[Y_i S]/n$. Summing over $i$ and using $\sum_{i=1}^n Y_i = S$, $$E \left[ \sum_{i=1}^n Y_i X_i \right] = E \left[S^2/n \right] = (\text{Var}(S) + E[S]^2)/n = (n + n^2)/n = 1 + n.$$
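
A hedged numerical check that the expectation is exactly $1+n$ (not merely below $n+n^2$); the Exponential(1) distribution below is an illustrative choice with mean and variance 1.

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 5, 500_000

X = rng.exponential(scale=1.0, size=(reps, n))   # E(X) = Var(X) = 1
Y = np.sort(X, axis=1)                           # order statistics

est = (Y * X).sum(axis=1).mean()
print("E[sum Y_i X_i] ~", est, "vs exact value 1 + n =", 1 + n)
```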

EDIT: Similarly, $$E\left[\left. \sum_{i=1}^n \frac{Y_i}{X_i} \right| Y_1, \ldots,Y_n\right] = \sum_{i=1}^n E\left[\left.\frac{Y_i}{X_i} \right| Y_1,\ldots,Y_n\right] = \frac{1}{n} \sum_{i=1}^n \sum_{j=1}^n \frac{Y_i}{Y_j}.$$ Now note that $\displaystyle \sum_{i=1}^n \sum_{j=1}^n \frac{Y_i}{Y_j} = \sum_{i=1}^n \sum_{j=1}^n \frac{X_i}{X_j}$, since $(Y_1,\ldots,Y_n)$ is a permutation of $(X_1,\ldots,X_n)$, so $$E \left[ \sum_{i=1}^n \frac{Y_i}{X_i} \right] = \frac{1}{n} E \left[\sum_{i=1}^n \sum_{j=1}^n \frac{Y_i}{Y_j}\right] = \frac{1}{n} \sum_{i=1}^n \sum_{j=1}^n E\left[\frac{X_i}{X_j}\right].$$ For the $n$ terms where $i = j$, $X_i/X_j = 1$. For the other $n^2-n$ terms, $X_i$ and $X_j$ are independent. Thus $$E \left[ \sum_{i=1}^n \frac{Y_i}{X_i} \right] = 1 + (n-1)\, E[X]\, E[1/X].$$ By Jensen's inequality and the convexity of $1/x$, $E[1/X] \ge 1/E[X] = 1$, which gives the desired inequality. Moreover, this estimate is best possible, as can be seen by taking $$X = \begin{cases} 1 + \sqrt{(1-p)/p} & \text{with probability } p,\\ 1 - \sqrt{p/(1-p)} & \text{with probability } 1-p, \end{cases}$$ for $0 < p < 1/2$ (so that $X > 0$). Then $E[X] = \text{Var}(X) = 1$, while $E[1/X] \to 1$ as $p \to 0^+$.
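
As a sanity check on the sharpness example, the small script below evaluates $E[X]$, $\text{Var}(X)$ and $E[1/X]$ for this two-point distribution at a few values of $p$; it is only an illustration of the limit $E[1/X]\to 1$.

```python
import numpy as np

def two_point_moments(p):
    # X = 1 + sqrt((1-p)/p) w.p. p,  X = 1 - sqrt(p/(1-p)) w.p. 1-p  (0 < p < 1/2)
    a, b = 1 + np.sqrt((1 - p) / p), 1 - np.sqrt(p / (1 - p))
    mean = p * a + (1 - p) * b
    var = p * a**2 + (1 - p) * b**2 - mean**2
    inv_mean = p / a + (1 - p) / b
    return mean, var, inv_mean

for p in [0.2, 0.05, 0.001]:
    mean, var, inv_mean = two_point_moments(p)
    print(f"p={p}: E[X]={mean:.3f}, Var(X)={var:.3f}, E[1/X]={inv_mean:.4f}")

# E[X] and Var(X) stay equal to 1 while E[1/X] -> 1 as p -> 0+,
# so 1 + (n-1) E[X] E[1/X] comes arbitrarily close to n.
```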


The first one follows easily from the rearrangement inequality, since $(X_1,\ldots,X_n)$ is a permutation of $(Y_1,\ldots,Y_n)$: $${Y_1\over X_1}+\cdots+{Y_n\over X_n}\geq {Y_1\over Y_1}+\cdots+{Y_n\over Y_n}=n.\tag1$$ Now take expectations in (1).
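
Since (1) holds sample by sample, a short numerical check of the pointwise bound is easy; the Exponential(1) draws below are an illustrative assumption (any positive distribution would do for this almost-sure statement).

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 5, 100_000

X = rng.exponential(scale=1.0, size=(reps, n))   # positive samples
Y = np.sort(X, axis=1)                           # order statistics

# Rearrangement bound (1): sum_i Y_i / X_i >= n for every single sample.
ratio_sums = (Y / X).sum(axis=1)
assert np.all(ratio_sums >= n - 1e-9)
print("min over samples of sum Y_i/X_i:", ratio_sums.min(), ">= n =", n)
```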

  • This shows that $\mathbb{P}\left(\sum_{k=1}^n \frac{Y_k}{X_k} \geqslant n \right) = 1$. I guess the OP either implicitly assumes that the expectation is defined, or understands that if it diverges, it diverges to $+\infty$.