I'm working on my master's thesis, where I need some spectral theory of the Laplace operator on compact Riemannian manifolds, and on the sphere in particular. While investigating essential self-adjointness I stumbled on the following problem.
*Problem*$\quad$ Let $M$ be a compact Riemannian manifold, let $\Delta=\operatorname{div}\operatorname{grad}$, and let $f\in L^2(M)$ be such that $(f, u-\Delta u)=0$ for every $u \in C^{\infty}(M)$. Prove that $f=0$.
I believe the claim is true: the condition $(f, u-\Delta u)=0$ for all smooth $u$ says exactly that $f$ is a distributional solution of the elliptic equation $-\Delta f + f=0$, so by elliptic regularity I expect $f$ to be an $H^2_{\text{loc}}$ function (see Theorem 2.1 in the book by Berezin and Shubin). Since $M$ is compact this should give $f\in H^1(M)$, and then, integrating by parts and using the density of $C^{\infty}(M)$ in $H^1(M)$, we get $\lVert f \rVert_{H^1}^2=(f, f)+(\operatorname{grad}f, \operatorname{grad}f)=0$.
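To spell out the last step (under the assumption, which I believe holds on a closed manifold, that $C^{\infty}(M)$ is dense in $H^1(M)$): for every $u \in C^{\infty}(M)$,
$$0=(f, u-\Delta u)=(f,u)+(\operatorname{grad} f, \operatorname{grad} u),$$
where the second equality is Green's formula, with no boundary terms since $\partial M=\emptyset$. Letting $u \to f$ in $H^1(M)$ then yields $(f,f)+(\operatorname{grad} f, \operatorname{grad} f)=0$, hence $f=0$.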
Unfortunately, Theorem 2.1 above is stated for an open subset of Euclidean space, and I don't know whether it applies verbatim on a Riemannian manifold. Can you point me to a reference for this?
Thank you.