I'm having trouble understanding a calculation we did in class. The objective was to prove a log-Sobolev inequality.
We did it this way. We know that
$$ \int_\Omega \sqrt{\, |\nabla(\epsilon f^2)|^2 + \big(D\, I(\epsilon f^2)\big)^2 \,}\; d\mu \;\geq\; D\, I\!\left(\int_\Omega \epsilon f^2\, d\mu\right) $$
for some constant $D>0$, where $\Omega$ is a region, $\mu$ is a measure on it, and $I(t) \sim \sqrt{2}\, t\, \sqrt{\log(1/t)}$ as $t \to 0$. The lecturer then said that by taking $\epsilon \to 0$ we get
$$ \frac{D^2}{2} \left( \int_\Omega f^2 \log f^2\, d\mu - \int_\Omega f^2\, d\mu \cdot \log \int_\Omega f^2\, d\mu \right) \;\leq\; \int_\Omega |\nabla f|^2\, d\mu , $$
whose left-hand side is exactly the entropy of $f^2$.
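Here is how far I got on my own, assuming it is legitimate to simply plug the asymptotic form of $I$ into both sides (which I'm not sure is justified). Writing $L = \log(1/\epsilon)$, and using $|\nabla(\epsilon f^2)|^2 = \epsilon^2 |\nabla f^2|^2 = 4\epsilon^2 f^2 |\nabla f|^2$ together with $I(\epsilon f^2)^2 \approx 2\,\epsilon^2 f^4 \left(L - \log f^2\right)$, dividing through by $\epsilon$ turns the left-hand side into
$$ \sqrt{2}\, D \int_\Omega f^2 \sqrt{\, L - \log f^2 + \frac{2\, |\nabla f|^2}{D^2 f^2} \,}\; d\mu , $$
while the right-hand side becomes
$$ \sqrt{2}\, D \int_\Omega f^2\, d\mu \;\sqrt{\, L - \log \int_\Omega f^2\, d\mu \,} . $$
Both sides grow like $\sqrt{L}$ as $\epsilon \to 0$, so my guess is that one should expand $\sqrt{L + x} \approx \sqrt{L} + \frac{x}{2\sqrt{L}}$ on each side and compare the terms of order $1/\sqrt{L}$, which would seem to produce the entropy of $f^2$ on one side and $\int_\Omega |\nabla f|^2\, d\mu$ on the other. But I don't see how to make this rigorous, in particular how to justify exchanging the limit $\epsilon \to 0$ with the integral on the left-hand side.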
Can someone help me understand the calculation?
Thanks!