
Suppose that $b(t)$ is continuously differentiable, $\lim_{t \to \infty} b(t)=0$, and $\int_{0}^{\infty} |b'(t)|\,dt < \infty$. Prove that every solution of the differential equation $y''+(1+b(t))y=0$ is bounded on $[0,\infty)$.

So I started this way: converting the equation into the first-order system

$x'=Ax, \qquad A=\begin{bmatrix} 0 & 1 \\ -(1+b(t)) & 0 \end{bmatrix},$

where $x=(x_1,x_2)=(y,y')$, so that $x_1'=x_2$ and $x_2'=-(1+b(t))x_1$. I then tried to apply some theorems on bounding the solutions of first-order linear systems, but I have not made any progress yet.
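One possible line of attack (a sketch of my own, not a full proof) is an energy-function argument instead of the matrix formulation. Define

$$E(t) = y'(t)^2 + (1+b(t))\,y(t)^2.$$

Differentiating and substituting $y'' = -(1+b(t))y$ from the equation gives

$$E'(t) = 2y'y'' + 2(1+b(t))yy' + b'(t)y^2 = b'(t)\,y(t)^2.$$

Since $b(t)\to 0$, there is a $T$ with $1+b(t)\ge \tfrac12$ for $t\ge T$, hence $y(t)^2 \le 2E(t)$ there, and integrating $E' = b'y^2$ yields

$$E(t) \le E(T) + 2\int_T^t |b'(s)|\,E(s)\,ds.$$

Grönwall's inequality together with $\int_0^\infty |b'(t)|\,dt < \infty$ would then bound $E$, and therefore $y$, on $[0,\infty)$.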

  • 0
    These kinds of problems are usually solved using [Barbalat's lemma](https://en.wikipedia.org/wiki/Lyapunov_stability#Barbalat.27s_lemma_and_stability_of_time-varying_systems)2017-01-08
  • 0
    @polfosol How does Barbalat's lemma help with this problem? Would you mind telling me more about your idea?2017-01-08
  • 0
    Posting an answer is really hard when I'm using the mobile app! See the example at the bottom of the Wikipedia article. Hopefully you'll get the idea2017-01-08
  • 0
    @polfosol Yeah, it's OK. But I'm waiting for your idea. Please post it later if you can.2017-01-08
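As a numerical sanity check (my own sketch, not part of the question): with the hypothetical choice $b(t)=1/(1+t)^2$, which satisfies both hypotheses, a classical RK4 integration of the system $x_1'=x_2$, $x_2'=-(1+b(t))x_1$ keeps the solution bounded, as the statement predicts.

```python
def b(t):
    # Sample coefficient: b(t) -> 0 as t -> infinity, and
    # int_0^inf |b'(t)| dt = int_0^inf 2/(1+t)^3 dt = 1 < infinity.
    return 1.0 / (1.0 + t) ** 2

def rhs(t, y, v):
    # The system from the question: y' = v,  v' = -(1 + b(t)) y.
    return v, -(1.0 + b(t)) * y

def integrate(t_end=100.0, h=0.01, y=1.0, v=0.0):
    """Classical RK4; returns the largest |y(t)| seen on [0, t_end]."""
    t, max_abs_y = 0.0, abs(y)
    while t < t_end:
        k1y, k1v = rhs(t, y, v)
        k2y, k2v = rhs(t + h / 2, y + h / 2 * k1y, v + h / 2 * k1v)
        k3y, k3v = rhs(t + h / 2, y + h / 2 * k2y, v + h / 2 * k2v)
        k4y, k4v = rhs(t + h, y + h * k3y, v + h * k3v)
        y += h / 6 * (k1y + 2 * k2y + 2 * k3y + k4y)
        v += h / 6 * (k1v + 2 * k2v + 2 * k3v + k4v)
        t += h
        max_abs_y = max(max_abs_y, abs(y))
    return max_abs_y

# For these data, the energy E = v^2 + (1+b)y^2 is non-increasing
# (since b' < 0), so |y| should stay below sqrt(E(0)) = sqrt(2).
print(integrate())
```

This is only evidence for one admissible $b$, of course, not a proof, but it is a quick way to check that a candidate energy bound is at least plausible.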

0 Answers