
Consider $n$ independent and identically distributed random variables $\{X_i\}_{i=1,\dots,n}$ with support on some interval $[a,b]$, and their $n$th order statistic $\max_{i \in \{1,\dots,n\}} X_i$. The entropy of the maximum is

$$ - \int_a^b F^n(x) \ln F^n(x) dx ,$$ where $F(x)= \Pr (X \le x) $. It seems natural that the entropy should be decreasing in $n$ (just think about $n$ very large). Is this a known result?

I did in fact prove that the entropy is monotone, but the proof turned out to be lengthy and messy. I would expect that there is a simple argument. Does anyone know of one?

  • The integral quoted does not correspond to the [Shannon entropy](http://en.wikipedia.org/wiki/Entropy_%28information_theory%29) of the maximal order statistic, I am afraid. By definition the entropy is $S_Z = - \int \ln(f_Z(z))\, f_Z(z) \,\mathrm{d} z$, and for the $\max$, $f_{X_{n:n}}(x) = \left(F_X(x)^n\right)^\prime$. (2012-07-15)
  • If $X$ has a density, then, as Sasha said, the cumulative distribution of the maximum is $F^n(x)$ and its density is $f(x) = nF^{n-1}(x)F'(x)$. (2012-07-15)
  • Now $\ln f(x) = \ln n + \ln F'(x) + (n-1)\ln F(x)$. (2012-07-15)
  • Sorry, I did use the wrong definition. –S (2012-07-15)

1 Answer


No, the entropy is not monotone. For example, fix an integer $N \ge 2$ and consider $F_X(x) = x^{1/N}$ on $[0,1]$. Then $\max(X_1,\ldots,X_N)$ is uniform on $[0,1]$. The entropy of $\max(X_1,\ldots,X_n)$ increases as a function of $n$ for $1 \le n \le N$, reaching $0$ at $n=N$, then decreases after that.
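
A quick sanity check of this example, using the density of the maximum from the comments ($f_n = nF^{n-1}F'$): with $F_X(x) = x^{1/N}$, the maximum of $n$ draws has density $f_n(x) = \tfrac{n}{N}\, x^{n/N - 1}$ on $[0,1]$, so its differential entropy is

$$ h(n) = -\int_0^1 \frac{n}{N}\, x^{n/N - 1} \ln\!\left( \frac{n}{N}\, x^{n/N - 1} \right) dx = 1 - \frac{N}{n} - \ln\frac{n}{N}, $$

which is increasing in $n$ for $n < N$, equals $0$ at $n = N$, and is decreasing for $n > N$.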