
We know the mutual information formula: $I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)$

But how can we prove $I(X;Y) \le \min (H(X),H(Y))$? Can you give a proof or a link to one?

Thanks!

2 Answers


By the equations you've written, \begin{align*} H(X) \geq H(X) - H(X \mid Y) &= I(X;Y) \\ H(Y) \geq H(Y) - H(Y \mid X) &= I(X;Y) \\ \end{align*} since $H(X \mid Y)$ and $H(Y \mid X)$ are non-negative. Thus $I(X;Y) \leq H(X)$ and $I(X;Y) \leq H(Y) \Rightarrow I(X;Y) \leq \min(H(X),H(Y))$.
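The bound can also be checked numerically. Below is a small sketch (not from the original answer) that draws an arbitrary joint distribution, computes the entropies in bits, and verifies $I(X;Y) \le \min(H(X), H(Y))$ via the identity $I(X;Y) = H(X) + H(Y) - H(X,Y)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary joint distribution p(x, y) over a 4x5 alphabet.
p_xy = rng.random((4, 5))
p_xy /= p_xy.sum()

p_x = p_xy.sum(axis=1)  # marginal of X
p_y = p_xy.sum(axis=0)  # marginal of Y

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_x = entropy(p_x)
H_y = entropy(p_y)
H_xy = entropy(p_xy.ravel())

# I(X;Y) = H(X) + H(Y) - H(X,Y)
I_xy = H_x + H_y - H_xy

print(I_xy <= min(H_x, H_y) + 1e-12)  # True
```

Repeating this with other random joints (or degenerate ones such as $X = Y$, where the bound is tight) never violates the inequality.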


@Daniel Xiang,

Yes, from your tip, $H(X|Y) \ge 0$ and $H(Y|X) \ge 0$,

so we may assume $\min (H(X),H(Y)) = H(X)$.

In the equality case $I(X;Y) = H(X)$ we get $H(X|Y) = 0$, while still $H(Y|X) \ge 0$,

so we can write

\begin{align*}
H(X|Y) &= - \sum_{x,y} p(x,y)\log p(x|y) = - \sum_{x,y} p(x,y)\log \frac{p(x,y)}{p(y)} = 0 \;\Rightarrow\; p(x,y) = p(y) \\
H(Y|X) &= - \sum_{x,y} p(x,y)\log p(y|x) = - \sum_{x,y} p(x,y)\log \frac{p(x,y)}{p(x)} = - \sum_{x,y} p(y)\log \frac{p(y)}{p(x)} \ge 0 \;\Rightarrow\; p(x) \ge p(y)
\end{align*}

How can we say that when $p(x,y) = p(y)$, we always have $p(x) \ge p(y)$?
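One way to see what $H(X|Y) = 0$ means concretely: it forces $p(x|y) = 1$ whenever $p(x,y) > 0$, i.e. $X$ is a deterministic function of $Y$, so $p(x)$ is a sum of the masses $p(y)$ of all $y$ mapping to $x$. A minimal numerical sketch, using an assumed toy mapping $f(y) = y \bmod 2$ and an assumed marginal for $Y$:

```python
import numpy as np

# Equality case H(X|Y) = 0: X = f(Y) deterministically.
# Toy example: Y takes values 0..3, f(y) = y % 2.
p_y = np.array([0.1, 0.2, 0.3, 0.4])
f = np.array([0, 1, 0, 1])  # f(y) for y = 0, 1, 2, 3

# p(x) aggregates the mass of every y with f(y) = x,
# so p(x) >= p(y) = p(x, y) for each such y.
p_x = np.array([p_y[f == x].sum() for x in range(2)])

for y, py in enumerate(p_y):
    x = f[y]
    print(p_x[x] >= py)  # True for every pair with p(x, y) > 0
```

Here `p_x` comes out as `[0.4, 0.6]`, each entry at least as large as every individual $p(y)$ contributing to it, which is the pointwise inequality being asked about (it holds only for pairs $(x,y)$ with $p(x,y) > 0$, not for all pairs).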