Suppose $\Omega$ is an open subset of $\mathbb R^n$ and $(a_h)$ is a sequence of $L^\infty(\Omega)$ functions such that there exist two constants $0<\lambda\leq\Lambda$ with $\lambda\leq a_h\leq\Lambda$ a.e. in $\Omega$ for every $h$, and such that $a_h\rightharpoonup a$ and $\frac{1}{a_h}\rightharpoonup \frac{1}{b}$ in $w^*-L^\infty(\Omega)$.
Prove that in general $b\leq a$, and that $b=a$ implies $a_h\to a$ strongly in $L^2(\Omega)$.
I'm pretty sure this should be an easy consequence of Banach–Steinhaus; however, in my computations I always end up with the inequality reversed. Probably I'm messing something up with limits superior and inferior. Can anybody help me?
Edit
Ok, so thanks to the comment below: the map $x\mapsto \frac 1x$ is convex on $(0,\infty)$, so by Jensen's inequality, for every ball $B_r\subset\Omega$ and every $h$, $\left(\frac{1}{|B_r|}\int_{B_r}a_h\,\mathrm d\mu\right)^{-1}\leq \frac{1}{|B_r|}\int_{B_r}\frac{1}{a_h}\,\mathrm d\mu.$ Letting $h\to\infty$, the weak star convergence in $L^\infty(\Omega)$ turns this into the same inequality with $a$ on the left and $\frac 1b$ on the right. By the Lebesgue differentiation theorem, almost every $x\in\Omega$ is a Lebesgue point of both $a$ and $\frac 1b$, so passing to the limit as $r\to 0$ around such a point we eventually get $b\leq a$ a.e. Am I right?
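In case it helps, here is the same chain spelled out at a common Lebesgue point $x$ of $a$ and $\frac 1b$ (using the uniform bounds $\lambda\leq a_h\leq\Lambda$ so that all the averages are bounded away from $0$ and can be inverted):
$$\frac{1}{a(x)}=\lim_{r\to 0}\,\lim_{h\to\infty}\left(\frac{1}{|B_r|}\int_{B_r}a_h\,\mathrm d\mu\right)^{-1}\leq\lim_{r\to 0}\,\lim_{h\to\infty}\frac{1}{|B_r|}\int_{B_r}\frac{1}{a_h}\,\mathrm d\mu=\frac{1}{b(x)}$$
for a.e. $x\in\Omega$, which is exactly $b\leq a$ a.e.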