
I am wondering if someone knows of a lower bound on the differential entropy of a mixture of two zero-mean Gaussians:

$$h(X)=-\int_{-\infty}^{\infty} f_X(x)\log f_X(x)dx$$

where

$$f_X(x)=\frac{1-\epsilon}{\sqrt{2\pi}\sigma_1}e^{-x^2/2\sigma_1^2}+\frac{\epsilon}{\sqrt{2\pi}\sigma_2}e^{-x^2/2\sigma_2^2}$$

I've tried the trivial lower bound obtained by replacing $\log f_X(x)$ with $f_X(x)-1$, but it's not tight enough. Any suggestions?
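For concreteness, here is a minimal numerical sketch of that trivial bound, $h(X) \ge 1 - \int f_X^2(x)\,dx$ (which follows from $-\log f \ge 1 - f$), using assumed example values of $\epsilon$, $\sigma_1$, $\sigma_2$ that are not part of the question; it makes it easy to see how far below $h(X)$ the bound falls for a given choice of parameters:

```python
import numpy as np
from scipy.integrate import quad

# Assumed example parameters (not from the question).
eps, s1, s2 = 0.1, 1.0, 3.0

def f(x):
    # Two-component zero-mean Gaussian mixture density.
    g1 = np.exp(-x**2 / (2 * s1**2)) / (np.sqrt(2 * np.pi) * s1)
    g2 = np.exp(-x**2 / (2 * s2**2)) / (np.sqrt(2 * np.pi) * s2)
    return (1 - eps) * g1 + eps * g2

# Differential entropy h(X) = -∫ f log f dx by numerical quadrature (in nats).
h, _ = quad(lambda x: -f(x) * np.log(f(x)), -50, 50)

# Trivial bound from log f <= f - 1:  h(X) >= 1 - ∫ f^2 dx.
f_squared, _ = quad(lambda x: f(x)**2, -50, 50)
print("h(X) ≈", h, "  trivial lower bound:", 1 - f_squared)
```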

1 Answer

The (differential) entropy is a concave function of the probability distribution of the random variable $X$. In other words, if $p_1$ and $p_2$ are probability densities, then $h(\epsilon p_1 + (1-\epsilon)p_2) \geq \epsilon h(p_1) + (1-\epsilon)h(p_2)$.

You can apply this to get a basic lower bound for your mixture. Using $h(\mathcal{N}(0,\sigma^2)) = \tfrac{1}{2}\log(2\pi e \sigma^2)$ for each component, you get $h(X) \geq \frac{1-\epsilon}{2} \log(2\pi e \sigma_1^2) + \frac{\epsilon}{2} \log(2\pi e \sigma_2^2)$.

Unfortunately, this bound is also not very tight (except in certain regimes), and its usefulness will depend on where you are applying it.
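As a quick numerical sanity check (a sketch with assumed parameter values, not from the original answer), the concavity bound can be compared against $h(X)$ computed by quadrature:

```python
import numpy as np
from scipy.integrate import quad

# Assumed example parameters (not from the answer).
eps, s1, s2 = 0.1, 1.0, 3.0

def f(x):
    # Two-component zero-mean Gaussian mixture density.
    g1 = np.exp(-x**2 / (2 * s1**2)) / (np.sqrt(2 * np.pi) * s1)
    g2 = np.exp(-x**2 / (2 * s2**2)) / (np.sqrt(2 * np.pi) * s2)
    return (1 - eps) * g1 + eps * g2

# Differential entropy h(X) by numerical quadrature (in nats).
h, _ = quad(lambda x: -f(x) * np.log(f(x)), -50, 50)

# Concavity bound: mixture of the Gaussian entropies 0.5*log(2*pi*e*sigma^2).
bound = (1 - eps) * 0.5 * np.log(2 * np.pi * np.e * s1**2) \
        + eps * 0.5 * np.log(2 * np.pi * np.e * s2**2)
print("h(X) ≈", h, "  concavity lower bound:", bound)
```

With these values the bound sits below the quadrature result, as it must, and the gap closes as $\sigma_1 \to \sigma_2$, since the mixture then degenerates to a single Gaussian and both sides equal $\tfrac{1}{2}\log(2\pi e \sigma_1^2)$.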

  • Can't believe I forgot that the entropy is a concave function of the probability distribution... thanks for the reminder! This bound is indeed loose, but it seems to work in my problem. :) (2011-10-18)
  • Is the differential entropy also a concave function of the probability distribution? (2015-05-08)