Let $p=(p_1,\dotsc,p_r)$ and $q=(q_1,\dotsc,q_r)$ be two different probability distributions. Define the relative entropy $$h(p\|q) = \sum_{i=1}^r p_i (\ln p_i - \ln q_i).$$ Show $h(p\|q)\geq 0$. I'm given the hint that I should show $-x\ln x$ is concave, and then show that for any concave function $f$ the inequality $f(y)-f(x)\leq (y-x)f'(x)$ holds. I've rewritten the relative entropy as $$h(p\|q)=\sum_{i=1}^r p_i \ln \left(\frac{p_i}{q_i}\right)= -\sum_{i=1}^r p_i \ln \left(\frac{q_i}{p_i}\right),$$ which sort of looks like $-x\ln x$, and I did show that $-x\ln x$ is concave, but I don't really understand what I'm supposed to do, or even whether this hint is helpful.
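As a sanity check before attempting the proof, I verified the inequality numerically on random distributions (a small Python sketch; the function names `rel_entropy` and `random_dist` are just mine):

```python
import random
from math import log

def rel_entropy(p, q):
    """h(p||q) = sum_i p_i * (ln p_i - ln q_i)."""
    return sum(pi * (log(pi) - log(qi)) for pi, qi in zip(p, q))

def random_dist(r):
    """Draw a random probability distribution on r outcomes
    (weights bounded away from 0 so every log is defined)."""
    w = [random.uniform(0.1, 1.0) for _ in range(r)]
    s = sum(w)
    return [x / s for x in w]

# h(p||q) should be nonnegative for every pair of distributions ...
for _ in range(1000):
    p, q = random_dist(5), random_dist(5)
    assert rel_entropy(p, q) >= 0

# ... and exactly zero when p == q, since each ln p_i - ln p_i = 0.
assert rel_entropy(p, p) == 0.0
```

Every trial came out nonnegative, so the claim at least looks right; I just don't see how the concavity hint produces it.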

  • This is called [Gibbs' inequality](http://en.wikipedia.org/wiki/Gibbs%27_inequality). (2011-10-04)

1 Answer