This should be a comment to Craig's answer. I agree that a decrease by $\log(n!)$ seems to be the right answer, almost obvious from the "information" point of view - but we don't have a proof yet.
Assuming we have $n$ iid random variables $x_i$, we know that the joint entropy of $(x_1, x_2, \dots, x_n)$ is $n H(x)$. The question amounts to computing the joint entropy of the ordered variables $(y_1, y_2, \dots, y_n)$, where $y_1 = x_{(1)}$ is the first order statistic, etc. This is in principle doable, but I don't see how a general result could be obtained.
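To fix notation (the symbol $\Delta_n$ is just shorthand introduced here), the quantity to evaluate is the entropy decrease

$$\Delta_n \;=\; H(x_1, \dots, x_n) - H(y_1, \dots, y_n) \;=\; n H(x) - H\big(x_{(1)}, \dots, x_{(n)}\big),$$

and the conjecture is that $\Delta_n = \log(n!)$.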
To give an example: assume $x_i$ is uniform in $[0,M]$ (continuous variables are trickier with regard to entropy interpretations, but here the math is a little simpler, and as long as we don't change scales we are allowed to compare entropies; the same could also be done with discrete variables). We have a joint (differential) entropy of $H_0 = n \log(M)$. The ordered variable $y$ can equivalently be specified by the differences $(z_1, z_2, \dots, z_n)$, where $z_i = y_i - y_{i-1}$ (with $y_0 := 0$); this is a linear map with unit Jacobian, so the joint entropy is unchanged. We know that, for large $n$, the $z_i$ are asymptotically independent and exponential with mean $M/n$; the joint entropy then tends to $H_1 = n\,[1 - \log(n/M)] = n - n \log(n) + n \log(M)$. The decrease of entropy that results from ordering is therefore about $H_0 - H_1 = n \log(n) - n \approx \log(n!)$, by Stirling's approximation.
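As a quick numerical sanity check of this heuristic (not a proof), the sketch below compares the approximate decrease $H_0 - H_1 = n \log(n) - n$ with the exact $\log(n!)$; the value of $M$ and the list of $n$ values are arbitrary choices for illustration.

```python
import math

# Heuristic check: for x_i iid Uniform[0, M], H0 = n*log(M) is the joint
# differential entropy, and the exponential-spacings approximation gives
# H1 = n*(1 + log(M/n)).  The resulting decrease H0 - H1 = n*log(n) - n
# should approach log(n!) as n grows (Stirling's approximation).
M = 10.0  # arbitrary scale; it cancels in the difference
for n in [10, 100, 1000, 10000]:
    H0 = n * math.log(M)                  # joint entropy of (x_1, ..., x_n)
    H1 = n * (1.0 + math.log(M / n))      # approx. joint entropy of the spacings z_i
    decrease = H0 - H1                    # equals n*log(n) - n
    log_n_factorial = math.lgamma(n + 1)  # exact log(n!)
    print(f"n={n:6d}  H0-H1={decrease:12.2f}  log(n!)={log_n_factorial:12.2f}  "
          f"ratio={decrease / log_n_factorial:.4f}")
```

The ratio tends to $1$ as $n$ grows; note also that $M$ cancels in the difference, as it should, since ordering does not depend on the scale.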