
Let $X$ be a continuous random variable with values ranging from 0 to 1.

Let $X_{kn}$ be the random variable representing the $k$th smallest order statistic of $n$ draws from $X$. Note that $X_{kn}$ is not a sample, but the marginal distribution of the statistic.

I begin with a simple specific case below. Then I will present the more complicated general case.

I need to prove:

$$ \frac12\mathbb{E}(X_{22}+ X) < \frac13\mathbb{E}(X_{24}+X_{34}+X_{44}) $$

In other words, the average of the expectations of $X$ and of the maximum of 2 draws of $X$ is less than the average of the expectations of the top 3 order statistics of 4 draws of $X$.
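
For example, when $X$ is uniform on $[0,1]$ we have $\mathbb{E}(X_{kn}) = \frac{k}{n+1}$, so the two sides are

$$ \frac12\left(\frac23 + \frac12\right) = \frac{7}{12} \qquad\text{and}\qquad \frac13\left(\frac25 + \frac35 + \frac45\right) = \frac35, $$

and $\frac{7}{12} < \frac35$, so the inequality does hold in the uniform case.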

Any help would be tremendously appreciated.

Also, the more general case is below:

Let $c$ denote some constant integer $\geq 2$.

Let $i$ denote some integer $ \geq 1$.

Prove that $$\frac1c\mathbb{E}\left(X_{c^i,c^i}+ (c-1)X\right) < \frac1{c^{i+1}-c^i+1}\mathbb{E}\left(\sum_{k=c^i}^{c^{i+1}}X_{k,c^{i+1}}\right) $$ And, preferably, show that the gap between the two sides increases with $i$.
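
One can also compare the two sides numerically. Below is a rough Monte Carlo sketch in Mathematica (the helper name checkInequality is ad hoc, and the uniform distribution in the last line is only an illustration):

(* Monte Carlo sketch: estimate both sides of the inequality above for a given
   distribution of X, constants c and i, and m simulated samples. *)
checkInequality[dist_, c_, i_, m_: 10^5] :=
 Module[{n = c^(i + 1), rows, lhs, rhs},
  rows = Sort /@ RandomVariate[dist, {m, n}];  (* each row: a sorted sample of size c^(i+1) *)
  lhs = (Mean[Max /@ RandomVariate[dist, {m, c^i}]] +
      (c - 1) Mean[RandomVariate[dist, m]])/c;  (* (E[X_{c^i,c^i}] + (c-1) E[X]) / c *)
  rhs = Mean[Flatten[rows[[All, c^i ;; n]]]];   (* average of the top c^(i+1)-c^i+1 order statistics *)
  {lhs, rhs, lhs < rhs}]

checkInequality[UniformDistribution[{0, 1}], 2, 1]

The third element of the returned triple indicates whether the estimated left-hand side is below the estimated right-hand side.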

Also note: I posted a similar problem on stats.SE last week (still unanswered). This one is different enough to warrant its own question, though.

  • 1
    The $k$-th smallest in a sample of size $n$ is usually denoted $X_{k:n}$. The sum $\sum_{k=1}^n X_{k:n} = \sum_{i=1}^n X_i$, where the $X_i$ are now i.i.d. Thus your inequality translates into $m_{c^i}/c + \frac{1-c}{c} m_1 < m_1$, where $m_k$ is the moment of order $k$ of the distribution followed by $X$.2011-08-14
  • 0
    Thanks for the help... if I understand what you wrote, then that would imply that the right side of the inequality is the average of all the order statistics, when it is really only subset of them. Did I misunderstand? I also edited my question with a more specific example since the subscripts can be really hard to read.2011-08-14
  • 0
    Yes, I missed that the sum is partial, sorry. It seems like you meant to write $X_{c^i:c^i}$ instead of $X^{c^i}$.2011-08-14
  • 0
    No, I meant $X^{c^i}$. It's just that it's the equivalent of $X_{c^i, c^i}$, so I thought it would be simpler in the example. In the example, $c=2$ and $i=1$, so $c^i = 2$ and $X_{22} = X^{c^i} = X^2$.2011-08-14
  • 1
    No, $X_{2:2}$ is not equivalent to $X^2$ in distribution. Consider uniform. Then $X_{2:2}$ has $pdf_{X_{2:2}}(x) = 2x$, while $X^2$ has $pdf_{X^2}(x) = \frac{1}{2\sqrt{x}}$.2011-08-14
  • 0
    Oh! I was confusing $F(x)$ with $X$... where the cdf of $X_{22}$ is $F(x)^2$ when the cdf of $X$ is $F(x)$. Oops... will correct.2011-08-14
  • 0
    @JandR: I tried to make your post easier to read, please check that no mathematical error slipped in with my modifications. // The RHS of the inequality you want to prove in the general case is dubious because the denominator is **not** the number of terms of the sum in the numerator. Hence you might wish to replace the denominator by $c^{i+1}-c^i+1$.2011-08-14
  • 0
    @Didier -- thank you for your very nice edits, and for pointing out the error (which I fixed per your suggestion). That is helpful to know even though this has been disproved.2011-08-14

1 Answer


I have found a counterexample to your purported inequality $\frac{1}{2} \left( \mathbb{E}(X_{2:2}) + \mathbb{E}(X) \right) < \frac{1}{3} \left( \mathbb{E}(X_{2:4}) + \mathbb{E}(X_{3:4}) + \mathbb{E}(X_{4:4}) \right) = \frac{1}{3}\left( 4 \mathbb{E}(X) - \mathbb{E}(X_{1:4}) \right)$.

Consider $X$ following a $\mathrm{Beta}(\alpha, 2)$ distribution. Then $\mathbb{E}(X) = \frac{\alpha}{2+\alpha}$, $\mathbb{E}(X_{2:2}) = \frac{\alpha ( 4 \alpha+3)}{(2\alpha+1)(2\alpha+3)}$, and

$$ \mathbb{E}(X_{1:4}) = \frac{\alpha ^4 \left(6912 \alpha ^5+34810 \alpha ^4+69601 \alpha ^3+68919 \alpha ^2+33734 \alpha +6524\right)}{(\alpha +2) (2 \alpha +1) (2 \alpha +3) (3 \alpha +1) (3 \alpha +2) (3 \alpha +4) (4 \alpha +1) (4 \alpha +3) (4 \alpha +5)} $$

With these results, your inequality is satisfied only when $\alpha > 0.51$.

In case you find it useful, here is the Mathematica code performing these computations:

In[40]:= ineq = 
  With[{dist = 
     BetaDistribution[a, 
      2]}, (Expectation[x, 
        x \[Distributed] OrderDistribution[{dist, 2}, 2]] + 
       Mean[dist])/2 < 
    1/3 (4 Mean[dist] - 
       Expectation[x, 
        x \[Distributed] OrderDistribution[{dist, 4}, 1]])];

In[41]:= Reduce[ineq && a > 0, a, Reals] // N

Out[41]= a > 0.512761
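
For a concrete spot check below that threshold, the same definitions can be evaluated numerically at, say, $\alpha = 1/4$ (an arbitrary value below $0.512761$):

With[{dist = BetaDistribution[1/4, 2]},
 N[{(Expectation[x, x \[Distributed] OrderDistribution[{dist, 2}, 2]] + Mean[dist])/2,
   1/3 (4 Mean[dist] - Expectation[x, x \[Distributed] OrderDistribution[{dist, 4}, 1]])}]]

By the Reduce result above, the first value (the left-hand side) should be no smaller than the second at this $\alpha$.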
  • 0
    Oh no!! Thanks so much for figuring this out and pointing it out. I need to process this for a bit. The actual distributions I need to prove it on will all be iterations of $F(x) = Max2(Min2(x))$, or $Max2(Min2(Max2(Min2(x))))$, or $Max2(Min2(Max2(Min2(Max2(Min2(x))))))$, etc., where $Max2(x) = x^2$ and $Min2(x) = 1-(1-x)^2$. So the cdfs of my $X$s will look like $(1-(1-x)^2)^2$ and $(1-(1-(1-(1-x)^2)^2)^2)^2$, etc. Hopefully I can still prove it in that case (a sketch for this cdf appears after these comments)....2011-08-14
  • 0
    PS, I really appreciate your answer and will vote it up if I ever get enough points to be able to :)2011-08-14
  • 0
    Ok, I have added a new constraint on $X$ in the question.2011-08-14
  • 0
    @JandR: Please **DO NOT DO THAT**. Now it seems Sasha's post does not answer your question. Much better to add a paragraph **Edit** at the end of your post, describing the modified version of the question, so that the original version (to which Sasha answered) is still there for all to see.2011-08-14
  • 0
    OK, it has been undone.2011-08-14
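
As a follow-up to the cdfs mentioned in the comments, here is a small sketch (the helper names Max2, Min2, cdfX, cdfOrd and meanOrd are ad hoc) that computes both sides of the specific inequality for $X$ with cdf $F(x) = (1-(1-x)^2)^2$, using $\mathbb{E}(Y) = \int_0^1 (1 - F_Y(x))\,dx$ for a $[0,1]$-valued variable and the binomial formula for the cdf of the $k$th order statistic:

(* Sketch: both sides of the specific inequality for X with cdf (1-(1-x)^2)^2, i.e. Max2(Min2(x)).
   Uses E[Y] = Integrate[1 - CDF_Y[x], {x, 0, 1}] and
   P(X_{k:n} <= x) = Sum_{j=k}^{n} Binomial[n, j] F(x)^j (1 - F(x))^(n - j). *)
Max2[x_] := x^2; Min2[x_] := 1 - (1 - x)^2;
cdfX[x_] := Max2[Min2[x]];
cdfOrd[k_, n_, x_] := Sum[Binomial[n, j] cdfX[x]^j (1 - cdfX[x])^(n - j), {j, k, n}];
meanX = Integrate[1 - cdfX[x], {x, 0, 1}];
meanOrd[k_, n_] := Integrate[1 - cdfOrd[k, n, x], {x, 0, 1}];
N[{(meanOrd[2, 2] + meanX)/2, (meanOrd[2, 4] + meanOrd[3, 4] + meanOrd[4, 4])/3}]

Comparing the two printed values shows whether the specific inequality survives for this particular cdf.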