Consider $-\Delta$ defined on $H^2(\Omega)\cap H_0^1(\Omega)$, where $\Omega$ is a smooth bounded domain in $\mathbb{R}^n$.
Let $g\in L^{\infty}(\Omega)$ with $a\leq g(x)\leq b$ for a.e. $x\in\Omega$. Show that if $\lambda_1<\lambda_2\leq \dots \leq \lambda_i\leq \lambda_{i+1} \leq \dots$ are the eigenvalues of $-\Delta$ and $\alpha_1 < \alpha_2 \leq \dots \leq \alpha_i \leq \alpha_{i+1} \leq \dots$ are the eigenvalues of $-\Delta - g$, where $(-\Delta -g)u=-\Delta u -gu$, then for every $i$, $\lambda_i - b\leq \alpha_i \leq \lambda_i - a$.
Note that $-\Delta -g$ is elliptic and symmetric.
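For reference, I believe the relevant variational characterization (assuming the Courant–Fischer min-max principle applies to both operators, which I have not verified for $-\Delta-g$) is
$$
\lambda_i=\min_{\substack{V\subset H_0^1(\Omega)\\ \dim V=i}}\ \max_{u\in V\setminus\{0\}} \frac{\int_\Omega |\nabla u|^2\,dx}{\int_\Omega u^2\,dx},
\qquad
\alpha_i=\min_{\substack{V\subset H_0^1(\Omega)\\ \dim V=i}}\ \max_{u\in V\setminus\{0\}} \frac{\int_\Omega \bigl(|\nabla u|^2-g\,u^2\bigr)\,dx}{\int_\Omega u^2\,dx},
$$
so presumably one should compare the two Rayleigh quotients using $a\leq g\leq b$, but I don't see how to carry this out for every $i$.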
Is it possible to prove this?
This problem can be rephrased in terms of Perturbation Theory, but I'm not familiar with it.
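As a numerical sanity check (just a 1D finite-difference experiment, not a proof; the grid size and the particular $g$ below are arbitrary choices of mine), the inequality does seem to hold:

```python
import numpy as np

# Rough 1D sanity check (not a proof): discretize -u'' on (0, pi) with
# Dirichlet conditions by finite differences, pick a bounded potential g,
# and compare the sorted eigenvalues of -Delta and -Delta - g.
n = 400
h = np.pi / (n + 1)
x = np.linspace(h, np.pi - h, n)

# Tridiagonal matrix approximating -u'' with u(0) = u(pi) = 0.
L = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2

g = 0.5 + np.sin(3.0 * x)           # a bounded potential (arbitrary choice)
a, b = g.min(), g.max()             # a <= g <= b on the grid

lam = np.linalg.eigvalsh(L)                 # eigenvalues of -Delta
alpha = np.linalg.eigvalsh(L - np.diag(g))  # eigenvalues of -Delta - g

# Check lambda_i - b <= alpha_i <= lambda_i - a for every i (up to rounding).
tol = 1e-8
print(np.all(lam - b <= alpha + tol) and np.all(alpha <= lam - a + tol))
```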
Thanks for your help!