I have to consider the following Cauchy problem $\begin{cases}u''(t)+u(t)=a(t)u(t)\\ u(0)=1\\ u'(0)=0,\end{cases}$ where $a\in C([0,+\infty))$ and $\int_0^{+\infty}|a(t)|\,\mathrm dt<\infty$. I am asked to show that *the* solution to the Cauchy problem above is bounded on $[0,+\infty)$.
In my attempt I tried to set aside any specific information about the right-hand side, so I called it $f(t)=a(t)u(t)$ and used the method of variation of parameters. What I got is that the solution to the Cauchy problem $\begin{cases}u''(t)+u(t)=f(t)\\ u(0)=1\\ u'(0)=0,\end{cases}$ is given by $u(t)=-\cos(t)\int_0^t f(\xi)\sin(\xi)\,\mathrm d\xi+\sin(t)\int_0^tf(\xi)\cos(\xi)\,\mathrm d\xi+\cos(t).$
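As a sanity check, this is just the Duhamel/convolution form of the same formula, expanded with the angle-subtraction identity $\sin(t-\xi)=\sin(t)\cos(\xi)-\cos(t)\sin(\xi)$:
$$u(t)=\cos(t)+\int_0^t\sin(t-\xi)f(\xi)\,\mathrm d\xi=\cos(t)-\cos(t)\int_0^t f(\xi)\sin(\xi)\,\mathrm d\xi+\sin(t)\int_0^t f(\xi)\cos(\xi)\,\mathrm d\xi,$$
and one can verify directly that $u(0)=1$, $u'(0)=0$, and $u''+u=f$.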
Then I substituted back $f(\xi)=a(\xi)u(\xi)$ in the formula above and tried to deduce some consequences, but unfortunately without success. Am I going in the right direction? Can you give me a hint, please? A full answer is of course also accepted.
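To be concrete about where I am stuck: after the substitution, bounding $|\sin(t-\xi)|\le 1$ in the convolution form yields the integral inequality
$$|u(t)|\le 1+\int_0^t|a(\xi)|\,|u(\xi)|\,\mathrm d\xi\qquad(t\ge 0),$$
which seems to be exactly where the integrability of $a$ should enter, but I do not see how to turn this into a bound on $u$ itself.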
Regards,
-Guido-