
I have a PDE in the bounded domain $\Omega$: $-\Delta u+ a(x)u=0$ with $u=0$ on $\partial \Omega$, and $a(x)>0$.

How do I show that $u\equiv 0$ in $\Omega$?

I think I should use maximum principle somewhere but I do not know how to apply it.

1 Answer


Multiplying the equation $-\Delta u+ a(x)u=0$ by $u$ and integrating over $\Omega$, we obtain $$\int_\Omega a(x)u^2=\int_\Omega u\,\Delta u.$$ Integrating by parts, the right-hand side equals $$\int_\Omega u\,\Delta u=-\int_\Omega |\nabla u|^2+\int_{\partial\Omega}u\frac{\partial u}{\partial n}.$$ Since $u=0$ on $\partial\Omega$, the boundary term vanishes. Combining the two identities, we get $$\int_\Omega a(x)u^2=-\int_\Omega |\nabla u|^2.$$ The left-hand side is nonnegative because $a(x)>0$, while the right-hand side is nonpositive, so both integrals must vanish. In particular $\int_\Omega a(x)u^2=0$ with $a(x)>0$ forces $u\equiv 0$ in $\Omega$.
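The energy identity has a discrete counterpart that can be checked numerically. Here is a minimal sketch (not part of the proof): discretizing $-u''+a(x)u=0$ on $(0,1)$ with homogeneous Dirichlet conditions by standard second-order finite differences yields a symmetric positive definite matrix, so the homogeneous linear system has only the zero solution. The grid size `n` and the coefficient choice $a(x)=1+x^2>0$ are illustrative assumptions.

```python
import numpy as np

# Illustrative 1D discretization of -u'' + a(x) u = 0 on (0, 1)
# with u(0) = u(1) = 0, using second-order central differences.
n = 200                          # number of interior grid points (assumed)
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)   # interior nodes
a = 1.0 + x**2                   # any a(x) > 0 works; this choice is arbitrary

# System matrix: A = (1/h^2) * tridiag(-1, 2, -1) + diag(a(x_i)).
A = (np.diag(2.0 * np.ones(n))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2 + np.diag(a)

# A is symmetric positive definite -- the discrete analogue of
# int a u^2 + int |grad u|^2 > 0 for u != 0 -- so A u = 0 implies u = 0.
u = np.linalg.solve(A, np.zeros(n))
print(np.linalg.norm(u))               # 0.0: the only solution is zero
print(np.linalg.eigvalsh(A).min() > 0) # True: A is positive definite
```

Positive definiteness of `A` is exactly the discrete version of the identity above: the quadratic form $v^\top A v$ collects a discrete $\int|\nabla u|^2$ term plus a discrete $\int a\,u^2$ term, both strictly positive for nonzero $v$.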