
Given a first order differential equation: $ y' = y^2 +x $
Initial condition: $ y(0)= 0 $
Prove that the differential equation above has no solution on $[0,3]$.

I don't really understand why such an equation has no solution on $[0,3]$.
Both $f(x,y) = y^2+x$ and $\frac{\partial f}{\partial y}$ are continuous at $(0,0)$, so doesn't the existence and uniqueness theorem say it has a unique solution?

I even used Matlab to graph the solution:

[plot of the numerically computed solution]

The red curve is the solution that satisfies $y(0)=0$.

1 Answer

Picard's theorem only gives you local existence and uniqueness. The problem is that you cannot extend the solution far enough to be defined on the entirety of $[0,3]$.

One possible proof is the following:

Proof:

On the interval $[0,1]$, you have $y^2 \geq 0$, so $y' \geq x$. Together with the initial condition $y(0)=0$, this implies $y(x) \geq \frac12 x^2$; in particular, $y(1) \geq \frac12$, and $y(x) > 0$ for all $x > 0$.

On the interval $[1,3]$, $x \geq 1 > 0$, so the solution satisfies $y' > y^2$, or equivalently $(y^{-1})' < -1$ (here we use $y > 0$ from the previous step). Integrating from $1$ to $x$ gives $\frac{1}{y(x)} < \frac{1}{y(1)} + 1 - x$. From the previous step, $y(1) \geq \frac12$, so $\frac{1}{y(1)} \leq 2$, and we conclude $\frac{1}{y(x)} < 3 - x$, i.e. $y(x) > \frac{1}{3-x}$. The right-hand side diverges as $x \to 3^-$, so the solution must blow up at or before $x = 3$. Hence there cannot be a solution that exists on the whole interval $[0,3]$.
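As a numerical sanity check (not part of the proof), one can integrate the equation and watch the solver fail before reaching $x = 3$; this is a minimal sketch using SciPy's `solve_ivp`, assuming the standard RK45 integrator aborts once the step size collapses near the blow-up point:

```python
# Integrate y' = y^2 + x with y(0) = 0 and observe finite-time blow-up.
from scipy.integrate import solve_ivp

sol = solve_ivp(lambda x, y: y**2 + x,   # right-hand side f(x, y)
                (0.0, 3.0),              # attempt to integrate over [0, 3]
                [0.0],                   # initial condition y(0) = 0
                rtol=1e-10, atol=1e-12)

# The integrator cannot reach x = 3: the solution grows without bound,
# the step size shrinks below machine spacing, and integration stops.
# sol.t[-1] is then a rough estimate of the blow-up location.
print(sol.status, sol.t[-1])
```

The last accepted time `sol.t[-1]` lands strictly inside $(0,3)$, consistent with the lower bound $y(x) > \frac{1}{3-x}$ derived above.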

  • Hmm, I see. I misinterpreted the question: I thought it meant to prove that the differential equation has no solution if $x \in [0,3]$. Thank you. (2012-03-29)