I am familiar with the issue of 'how should one round .5?', and I am familiar with the conventional solutions, but I don't understand why there isn't a single correct answer.
When you're formulating a rounding rule, you want to associate each number with its nearest integer (or nearest tenth, hundredth, etc.) as accurately as possible. Such a rule should therefore produce each result equally often (for uniformly distributed inputs). Consider a uniform distribution over the single-digit decimals between 0 and 1: 0.0, 0.1, 0.2, ..., 0.9. There are 10 possible numbers. For the rounded results to come out even (five rounding down, five rounding up), 0.5 needs to be rounded up (or away from zero, if you include negative numbers). In this example, there is no choice about it. Right?
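To make the counting concrete, here is a quick Python sketch of what I mean (just an illustration using the standard decimal module, not part of my actual question):

```python
from decimal import Decimal, ROUND_HALF_UP, ROUND_HALF_EVEN
from collections import Counter

# The ten single-digit decimals 0.0 .. 0.9
values = [Decimal(k) / 10 for k in range(10)]

for mode in (ROUND_HALF_UP, ROUND_HALF_EVEN):
    # Round each value to an integer under the given tie-breaking rule
    rounded = [v.quantize(Decimal("1"), rounding=mode) for v in values]
    print(mode, Counter(rounded))
```

Rounding half up gives a 5/5 split between 0 and 1 on this set, while rounding half to even gives 6/4, which is what leads me to think rounding up is the one correct rule here.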
What am I missing?
Thanks