
I know that the maximum possible Shannon entropy for an alphabet $X$ is $\log|X|$, where the Shannon entropy is:

$$H(X) = - \sum_{x \in X} \; p(x) \log p(x)$$

but how is this upper bound derived?

  • No, it's $\log|X|$ if $p(x)>0$ for all $x\in X$; otherwise it's $\log|\{x : p(x)>0\}|$. Write $H(X) = \sum_{x \in X} p(x) \log(1/p(x))$, and then use Jensen's inequality (see the sketch after these comments). (2012-07-09)
  • Good point, stupid fingers: I will update the question to fix the $\log|X|$ issue. I was trying to differentiate and set the derivative to zero; I think I see how this approach works, thank you. (2012-07-09)
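
For completeness, here is a sketch of the Jensen's inequality argument mentioned in the comment above (assuming a finite alphabet and the usual convention $0 \log 0 = 0$):

$$H(X) = \sum_{x:\,p(x)>0} p(x) \log\frac{1}{p(x)} \;\le\; \log\!\left(\sum_{x:\,p(x)>0} p(x)\cdot\frac{1}{p(x)}\right) = \log\bigl|\{x : p(x)>0\}\bigr| \;\le\; \log|X|,$$

where the first inequality is Jensen's inequality applied to the concave function $\log$. Equality holds throughout exactly when $p$ is uniform on $X$, i.e. $p(x) = 1/|X|$ for every $x$, so the uniform distribution attains the maximum $\log|X|$. The differentiation approach from the second comment also works: maximizing $H$ subject to the constraint $\sum_x p(x) = 1$ with a Lagrange multiplier forces $\log p(x)$ to be the same constant for every $x$, which again gives the uniform distribution.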
