I am trying to prove that the solution $u(x,t)$ of the heat equation $u_t=u_{xx}$ on an interval $(a,b)$, with homogeneous Dirichlet boundary conditions $u(a,t)=u(b,t)=0$ and initial condition $u(x,0)=u_0(x)$, satisfies the inequality
$\int_a^b u^2(x,t)dx\leq e^{-\frac{2\pi^2}{(b-a)^2}t} \int_a^b u_0^2(x)dx$
So far, for any function $u\in \mathcal{C}^1[a,b]$ such that $u(a)=u(b)=0$, I know that the following holds
$\int_a^b u^2(x)dx\leq (b-a)^2\int_a^b (u')^2(x)dx$
using the fundamental theorem of calculus and Schwarz's inequality, but I am unsure how to proceed from here.
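For reference, the way I obtained this bound was roughly the following (using $u(a)=0$, the fundamental theorem of calculus, and Schwarz's inequality):
$u^2(x)=\left(\int_a^x u'(s)ds\right)^2\leq (x-a)\int_a^x (u'(s))^2ds\leq (b-a)\int_a^b (u'(s))^2ds$
and integrating in $x$ over $[a,b]$ gives the constant $(b-a)^2$.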
The Fourier series expansion for $u(x,t)$ is given by
$u(x,t)=\sum_{n\geq 1} b_n e^{-\lambda_n^2 t} \sin\left(\frac{n\pi(x-a)}{b-a}\right)$
where
$b_n=\frac{2}{b-a}\int_a^b u_0(x) \sin\left(\frac{n\pi(x-a)}{b-a}\right) dx, \qquad \lambda_n=\frac{n\pi}{b-a}$
so differentiating the solution and squaring does not seem to give the desired inequality, and there is also the issue of the time dependence.
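For what it is worth, squaring the series and integrating over $[a,b]$ term by term, using the orthogonality of the sines (assuming the interchange of sum and integral is justified), seems to give
$\int_a^b u^2(x,t)dx = \frac{b-a}{2}\sum_{n\geq 1} b_n^2 e^{-2\lambda_n^2 t}$
but I do not see how to pass from this to the stated exponential bound.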