For an RV $X$ taking values in $\{1,2,\ldots\}$, I need to prove that the entropy is at most the expected value: $H(X)\leq E(X)$. I tried to bound the log but I'm not quite there. Any hint is appreciated.
Thanks
As requested, I'm answering my own question:
It is known that the KL divergence is nonnegative (Gibbs' inequality): $D(P\|Q)\geq 0$, i.e. $\sum_i p_i \log \frac{p_i}{q_i}\geq 0$, so:
$\sum_i p_i \log p_i \geq \sum_i p_i \log q_i$
$-\sum_i p_i \log p_i \leq \sum_i p_i \log q_i^{-1}$
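(As a quick numerical sanity check of this step, here is a short Python sketch; the helper name `kl_divergence` and the use of base-2 logs are my choices for illustration, not part of the original claim:)

```python
import numpy as np

rng = np.random.default_rng(0)

def kl_divergence(p, q):
    """D(P||Q) in bits: sum_i p_i * log2(p_i / q_i), with 0 * log(0/q) = 0."""
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

# Nonnegativity should hold for any pair of distributions on the same support.
for _ in range(1000):
    p = rng.random(20); p /= p.sum()   # random distribution on {1, ..., 20}
    q = rng.random(20); q /= q.sum()
    assert kl_divergence(p, q) >= 0
print("D(P||Q) >= 0 held in all trials")
```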
Choosing the probability distribution $q_i = 2^{-i}$, with logs taken base $2$ (so that the entropy is in bits), yields:
$-\sum_i p_i \log p_i \leq \sum_i p_i \log 2^i=\sum_i p_i \cdot i=EX$, where the last equality holds because $X$ takes the value $i$ with probability $p_i$. Therefore $H(X)\leq EX$.
PS: $q$ is indeed a probability distribution, since $0\leq q_i\leq 1$ and $\sum_{i=1}^\infty 2^{-i}=1$.
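(And here is a minimal end-to-end check of the inequality itself, under the same base-2 convention; `entropy_and_mean` is a hypothetical helper, and the infinite distribution $q_i = 2^{-i}$ is truncated and renormalized for the computation:)

```python
import numpy as np

def entropy_and_mean(p):
    """Return (H(X) in bits, E[X]) for X with P(X = i) = p[i-1], i = 1, 2, ..."""
    i = np.arange(1, len(p) + 1)
    mask = p > 0
    H = -np.sum(p[mask] * np.log2(p[mask]))
    return H, np.sum(i * p)

# q_i = 2^{-i}, truncated at i = 59 and renormalized: the equality case.
q = 2.0 ** -np.arange(1, 60)
q /= q.sum()
print(entropy_and_mean(q))            # ~ (2.0, 2.0): H(X) = E(X) = 2

# A generic distribution on {1, ..., 10}: the inequality is strict.
rng = np.random.default_rng(1)
p = rng.random(10)
p /= p.sum()
H, EX = entropy_and_mean(p)
print(H, EX, H <= EX)                 # H(X) <= E(X) holds
```

Note that equality $H(X)=EX$ holds exactly when $p_i = 2^{-i}$ (the Geometric(1/2) case), since $D(P\|Q)=0$ iff $P=Q$; that is what the first example above illustrates.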