I'm trying to determine a formulation for the expected number of rounds for a game of chance based on dice.
The details of the game are as follows:

- The game starts with $N$ players, each of whom has (access to) a fair die.
- In each round, every remaining player rolls their die once.
- The mean of all the rolls is computed.
- Every player whose roll is strictly less than the mean is eliminated from the game.
- Rounds are repeated until there is only one player left.
My question is: how does one determine the expected number of rounds in terms of $N$?
My thinking is that in any given round, roughly half of the players should roll below the mean and be eliminated. This repeated halving of the number of players suggests $\text{E}(\text{Rounds}) \approx \log_2 N$, or more precisely:
$$ \lim_{N \to \infty} \frac{\text{E}(\text{Rounds})}{\log_2 N} = 1 $$
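As a quick sanity check (my own calculation, independent of the simulation), the $N = 2$ case can be worked out exactly: with two players, the lower roll is always strictly below the mean of the two rolls, so one player is eliminated unless the rolls tie, which happens with probability $1/6$. The number of rounds is therefore geometric with success probability $5/6$:

$$ \text{E}(\text{Rounds} \mid N = 2) = \frac{1}{5/6} = \frac{6}{5} = 1.2 > 1 = \log_2 2 $$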
I wasn't able to formulate a definitive relation, so I ran a simple Monte Carlo simulation of the game for player counts ranging from 2 to 200, with one million simulated games per player count. The results can be found here:
https://gist.github.com/anonymous/f0db85f06343070045b78f7494f19565
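For reference, a minimal sketch of such a simulation (assuming a standard six-sided die; this is a simplified version, not the exact code from the gist) looks like:

```python
import random

def play_game(n, die_sides=6):
    """Simulate one game with n players; return the number of rounds
    until only one player remains."""
    players = n
    rounds = 0
    while players > 1:
        rolls = [random.randint(1, die_sides) for _ in range(players)]
        mean = sum(rolls) / players
        # Players rolling strictly below the mean are eliminated.
        # Note: the maximum roll is never below the mean, so at least
        # one player always survives the round.
        players = sum(1 for r in rolls if r >= mean)
        rounds += 1
    return rounds

def expected_rounds(n, trials=10_000):
    """Monte Carlo estimate of E(Rounds) for n players."""
    return sum(play_game(n) for _ in range(trials)) / trials

if __name__ == "__main__":
    for n in (2, 10, 50, 200):
        print(n, expected_rounds(n))  # estimate should exceed log2(n)
```

Note that a round can eliminate nobody (if all remaining players roll the same value), which is why the loop condition is on the player count rather than a fixed round schedule.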
The simulation results and $\log_2 N$ are plotted below, in red and blue respectively:
What I'm having difficulty explaining is the consistent over-estimate: the simulated curve sits above the $\log_2 N$ curve for every value of $N$.


