What is the simplest proof that mutual information is always non-negative? i.e., $I(X;Y)\ge0$
Mutual Information Always Non-negative
In addition, Jensen's inequality requires the coefficients in the convex combination to sum to 1. Since $p(x,y)$ is a probability distribution, it satisfies this condition. – 2018-08-10
1 Answer
By definition,
$$I(X;Y) = -\sum_{x \in X} \sum_{y \in Y} p(x,y) \log\left(\frac{p(x)p(y)}{p(x,y)}\right).$$
The negative logarithm is convex and $\sum_{x \in X} \sum_{y \in Y} p(x,y) = 1$, so applying Jensen's inequality gives
$$I(X;Y) \geq -\log\left( \sum_{x \in X} \sum_{y \in Y} p(x,y) \, \frac{p(x)p(y)}{p(x,y)} \right) = -\log\left( \sum_{x \in X} \sum_{y \in Y} p(x)p(y)\right) = -\log(1) = 0.$$
Q.E.D.
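As a numerical sanity check of the inequality, here is a short NumPy sketch (`mutual_information` is a helper defined below, not a library function) that computes $I(X;Y)$ from a joint probability table and verifies it is non-negative for random joints, and zero for an independent pair:

```python
import numpy as np

def mutual_information(p_xy):
    """Mutual information I(X;Y) in nats, from a joint probability table p_xy."""
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal p(x)
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal p(y)
    mask = p_xy > 0                         # skip zero cells (0 log 0 = 0 by convention)
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x * p_y)[mask])))

rng = np.random.default_rng(0)
for _ in range(1000):
    p = rng.random((3, 4))
    p /= p.sum()                            # normalize to a valid joint distribution
    assert mutual_information(p) >= -1e-12  # non-negative, up to float rounding

# Independence achieves the bound: p(x,y) = p(x)p(y) gives I(X;Y) = 0.
p_ind = np.outer([0.2, 0.8], [0.5, 0.3, 0.2])
print(mutual_information(p_ind))            # 0 up to floating-point rounding
```

The `mask` handles cells with $p(x,y)=0$, matching the usual convention $0\log 0 = 0$ used in the sums above.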