
Let $p_{ij}=P(X=i,Y=j)$ be the joint distribution, let $p_i=P(X=i)=\sum_j p_{ij}$ and $q_j=P(Y=j)=\sum_i p_{ij}$ be the marginal distributions, and let $p_{i|j}=\frac{p_{ij}}{q_j}$ be the conditional distribution. The conditional entropy is then defined as $$E[h(X|Y)]=-\sum_j q_j \sum_i p_{i|j} \ln p_{i|j}.$$ Show that $E[h(X|Y)]\leq h(X)$, where $h(X)$ is the entropy of $X$. I'm at a loss as to where to even begin.
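Not a proof, but as a quick sanity check of the claim, here is a minimal numerical sketch (Python with NumPy, my own illustration rather than anything from the question): it draws a random joint distribution, computes $h(X)$ and $E[h(X|Y)]$ exactly as defined above, and verifies the inequality. The `entropy` helper is a hypothetical name introduced just for this snippet.

```python
import numpy as np

# Hypothetical sanity check: draw a random joint distribution p_ij
# and verify that E[h(X|Y)] <= h(X).
rng = np.random.default_rng(0)
p = rng.random((4, 5))
p /= p.sum()                      # joint distribution p_ij; rows index X, columns index Y

p_x = p.sum(axis=1)               # marginal p_i = sum_j p_ij
q_y = p.sum(axis=0)               # marginal q_j = sum_i p_ij

def entropy(dist):
    """Shannon entropy in nats, ignoring zero-probability terms."""
    dist = dist[dist > 0]
    return -np.sum(dist * np.log(dist))

h_x = entropy(p_x)

# E[h(X|Y)] = -sum_j q_j sum_i p_{i|j} ln p_{i|j}
h_x_given_y = 0.0
for j in range(p.shape[1]):
    p_cond = p[:, j] / q_y[j]     # conditional distribution p_{i|j}
    h_x_given_y += q_y[j] * entropy(p_cond)

print(f"h(X)       = {h_x:.6f}")
print(f"E[h(X|Y)]  = {h_x_given_y:.6f}")
assert h_x_given_y <= h_x + 1e-12   # conditioning should not increase entropy
```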

  • Check if this recent [post #69859](http://math.stackexchange.com/questions/69859/relative-entropy-is-non-negative) is of interest... (2011-10-05)
  • They're similar, but that post deals with relative entropy and this is conditional. (2011-10-05)
  • It is called "information can’t hurt". (2017-08-30)

1 Answer