Prove that entropy is maximized when probability is $1/n$

How can it be proven that the entropy of a die roll is maximized when the probability of each of its $6$ faces is equal, $1/6$?
Tags: computer-science, entropy
2 Answers
Surprise is defined as $-\log p(X = x)$. A good way to think of entropy is as the "expected surprise". In this sense, it's easy to see that the uniform distribution maximizes the expected surprise: every outcome is equally unpredictable.
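As a quick numerical sanity check of this answer (a sketch I'm adding, not part of the original), the "expected surprise" of a fair die versus a loaded die can be compared directly:

```python
import math

def entropy(p):
    """Expected surprise: H(p) = -sum_i p_i * ln(p_i) (in nats)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

fair = [1 / 6] * 6                           # uniform die
loaded = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]      # one face is more likely

# The fair die attains the maximum ln(6); the loaded die falls short.
print(entropy(fair))    # ln 6 ≈ 1.7918
print(entropy(loaded))
```

Any deviation from uniformity lowers the entropy, consistent with the claim being proved.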
Comment: not sure how to do it... can you help? – 2012-10-26
The entropy is given by $-\sum_i p_i\ln p_i$. Use Jensen's inequality with the (concave) logarithm function.
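Spelling out the Jensen step hinted at above (my own expansion of the hint): since $\ln$ is concave, Jensen's inequality gives $\sum_i p_i \ln x_i \le \ln\left(\sum_i p_i x_i\right)$, and applying this with $x_i = 1/p_i$ yields

```latex
H(p) = -\sum_{i=1}^{n} p_i \ln p_i
     = \sum_{i=1}^{n} p_i \ln\frac{1}{p_i}
     \le \ln\!\left(\sum_{i=1}^{n} p_i \cdot \frac{1}{p_i}\right)
     = \ln n,
```

with equality iff all the $1/p_i$ are equal, i.e. $p_i = 1/n$ for every $i$. For the die, $n = 6$ and the maximum entropy is $\ln 6$.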