I am wondering if someone knows of a lower bound on the differential entropy of a mixture of two zero-mean Gaussians:
$h(X)=-\int_{-\infty}^{\infty} f_X(x)\log f_X(x)dx$
where
$f_X(x)=\frac{1-\epsilon}{\sqrt{2\pi}\sigma_1}e^{-x^2/(2\sigma_1^2)}+\frac{\epsilon}{\sqrt{2\pi}\sigma_2}e^{-x^2/(2\sigma_2^2)}$
I've tried the trivial lower bound obtained by replacing $\log f_X(x)$ with $f_X(x)-1$, but it's not tight enough.
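For reference, here is what that substitution gives (taking $\log$ to be the natural logarithm, so the entropy is in nats): since $\log t\le t-1$,
$h(X)\ge -\int_{-\infty}^{\infty} f_X(x)\left(f_X(x)-1\right)dx=1-\int_{-\infty}^{\infty} f_X^2(x)dx,$
and, unless I've slipped somewhere, the quadratic term has the closed form
$\int_{-\infty}^{\infty} f_X^2(x)dx=\frac{(1-\epsilon)^2}{2\sqrt{\pi}\sigma_1}+\frac{2\epsilon(1-\epsilon)}{\sqrt{2\pi(\sigma_1^2+\sigma_2^2)}}+\frac{\epsilon^2}{2\sqrt{\pi}\sigma_2}.$
Any suggestions for a tighter lower bound?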