
Try to solve the equation \[ c_1 \sqrt{f(x)} + c_2 f'(x) = c_3 \sqrt{f(x)} f''(x) \] for all $x \ge 0$. There might be an additional condition: $f(0) = 0$.

It comes from a high school physics exam problem about distance $s$, velocity $v$, and acceleration $a$. The given answer hypothesizes that the motion is uniformly accelerated, checks that hypothesis, and declares the problem solved. That amounts to checking only the case $f(x) = (\alpha x + \beta)^2$ with $\alpha, \beta \ge 0$: the equation then becomes \[ c_1 (\alpha x + \beta) + 2c_2 \alpha (\alpha x + \beta) = 2c_3 \alpha^2 (\alpha x + \beta), \] and finding a solution with $\alpha, \beta \ge 0$ is taken as a proof. I don't think that is a rigorous proof, since other forms of $f$ are never ruled out.
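For reference, here is a quick symbolic check of that reduction, a sketch assuming SymPy is available, with `a`, `b` standing in for $\alpha$, $\beta$:

```python
# Check (SymPy assumed) that f(x) = (a*x + b)**2 reduces the ODE as claimed;
# a, b are stand-ins for alpha, beta.
import sympy as sp

x, a, b, c1, c2, c3 = sp.symbols('x a b c1 c2 c3', positive=True)
f = (a*x + b)**2

# residual of  c1*sqrt(f) + c2*f' - c3*sqrt(f)*f''
residual = c1*sp.sqrt(f) + c2*sp.diff(f, x) - c3*sp.sqrt(f)*sp.diff(f, x, 2)
print(sp.factor(residual))
# expected: (a*x + b)*(c1 + 2*a*c2 - 2*a**2*c3),
# i.e. this f solves the ODE iff  c1 + 2*c2*a = 2*c3*a**2
```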

I wonder whether the equation can be solved rigorously.

Thanks for any help.

  • 0
    WolframAlpha suggests that $f$ is found implicitly. 2012-05-26

2 Answers


The suggested solution is the right answer. This can be seen in the following way. Dividing the given equation \[ c_1 \sqrt{f(x)} + c_2 f'(x) = c_3 \sqrt{f(x)} f''(x) \] by $\sqrt{f(x)}$ and noting that $\frac{d}{dx}\sqrt{f(x)} = \frac{f'(x)}{2\sqrt{f(x)}}$, it can be written as \[ c_1 + 2c_2\frac{d}{dx}\sqrt{f(x)} = c_3 f''(x) \] and so \[ c_1 =\frac{d}{dx}\left[ c_3 f'(x) - 2c_2\sqrt{f(x)}\right], \] which can be immediately integrated to give \[ c_1 x + c_0 = c_3 f'(x) - 2c_2\sqrt{f(x)}. \] It is a straightforward matter to verify that the solution to this equation has the form $f(x)=(\alpha x+\beta)^2$, as stated.
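As a quick symbolic confirmation of the regrouping step, here is a sketch assuming SymPy is available: eliminate $f''$ using the original equation and check that $c_3 f'(x) - 2c_2\sqrt{f(x)}$ really has constant derivative $c_1$.

```python
# Sketch (SymPy assumed): on any solution of the ODE, the derivative of
# c3*f' - 2*c2*sqrt(f) collapses to the constant c1.
import sympy as sp

x, c1, c2, c3 = sp.symbols('x c1 c2 c3')
f = sp.Function('f', positive=True)(x)

ode = sp.Eq(c1*sp.sqrt(f) + c2*f.diff(x), c3*sp.sqrt(f)*f.diff(x, 2))
g = c3*f.diff(x) - 2*c2*sp.sqrt(f)        # candidate antiderivative of c1

fpp = sp.solve(ode, f.diff(x, 2))[0]      # f'' in terms of f and f'
print(sp.simplify(g.diff(x).subs(f.diff(x, 2), fpp) - c1))   # expected: 0
```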

  • 0
    Maybe Gronwall's result is helpful: consider the differential system $y'=\phi(x,y)$ where $\left\lvert\phi(x,y_1)-\phi(x,y_2)\right\rvert\le A\left\lvert y_1-y_2\right\rvert$, i.e. $\phi$ is uniformly Lipschitz in $y$; if an initial value is given, then the solution is unique. 2013-02-04

I'm no expert on proving uniqueness of solutions to nonlinear ODEs, but consider the following:

Your equation can be trivially integrated if either $c_2=0$ or $c_3=0$, so we will assume that neither is the case. Then, defining \[ h(x)=\frac{c_3}{c_2}\sqrt{f(x)}, \] we may rewrite the equation as \[ \frac{d^2}{dx^2}\left[h(x)\right]^2-2\frac{d}{dx}h(x)-k=0 \qquad\qquad(*) \] where \[ k=\frac{c_1c_3}{c_2^2}. \] If we now assume that $h(x)$ is analytic at $x=0$, so that we may write \[ h(x)=h_0+h_1x+h_2x^2+h_3x^3+\ldots, \] then we can prove inductively, by repeatedly differentiating (*) and evaluating at $x=0$, that

  1. if $h_0=0$ then $h_2=h_3=h_4=\ldots=0$ and $\displaystyle h_1=\frac{1\pm\sqrt{1+2k}}{2}$,
  2. if $h_0\ne0$ and $h_2=0$ then $h_3=h_4=\ldots=0$ and, again, $\displaystyle h_1=\frac{1\pm\sqrt{1+2k}}{2}$.

Therefore in either of these situations $h(x)$ is a polynomial of degree 1 and $f(x)$ is the square of this polynomial.
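Here is a quick sanity check of point 1, a sketch assuming SymPy is available: with $h_0=0$, the constant term of (*) fixes $h_1$, and the next coefficient forces $h_2=0$.

```python
# Sanity check of point 1 (SymPy assumed): set h0 = 0 in a truncated
# series for h; order x^0 of (*) fixes h1, order x^1 then kills h2.
import sympy as sp

x, k, h1, h2, h3 = sp.symbols('x k h1 h2 h3')
h = h1*x + h2*x**2 + h3*x**3                  # series with h0 = 0

lhs = sp.expand(sp.diff(h**2, x, 2) - 2*sp.diff(h, x) - k)   # LHS of (*)
print(sp.solve(lhs.coeff(x, 0), h1))  # expected: [1/2 - sqrt(2*k+1)/2, 1/2 + sqrt(2*k+1)/2]
print(sp.factor(lhs.coeff(x, 1)))     # expected: 4*h2*(3*h1 - 1), so h2 = 0 when h1 != 1/3
```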

If $h_0\ne0$ then we may compute $h_2$, $h_3$, $h_4,\ldots$ iteratively in terms of $h_0$ and $h_1$: \[ \begin{aligned} h_2&=-\frac{2h_1^2-2h_1-k}{4h_0}\\ h_3&=\frac{(2h_1^2-2h_1-k)(3h_1-1)}{12h_0^2}=-h_2\cdot\frac{3h_1-1}{3h_0}\\ h_4&=-\frac{(2h_1^2-2h_1-k)(30h_1^2-20h_1+2-3k)}{96h_0^3}=h_2\cdot\frac{30h_1^2-20h_1+2-3k}{24h_0^2}\\ &\vdots \end{aligned} \] These computations are consistent with point (2) above: each displayed coefficient carries the factor $2h_1^2-2h_1-k$, which vanishes exactly when $h_2=0$.
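The same order-by-order computation can be automated; the following sketch (again assuming SymPy) solves (*) for $h_2$, $h_3$, $h_4$ in terms of $h_0$, $h_1$, and $k$ and should reproduce the formulas above.

```python
# Sketch (SymPy assumed): solve (*) order by order for h2, h3, h4
# in terms of h0, h1 and k.
import sympy as sp

x, k, h0, h1, h2, h3, h4 = sp.symbols('x k h0 h1 h2 h3 h4')
h = h0 + h1*x + h2*x**2 + h3*x**3 + h4*x**4

lhs = sp.expand(sp.diff(h**2, x, 2) - 2*sp.diff(h, x) - k)   # LHS of (*)
eqs = [lhs.coeff(x, n) for n in range(3)]     # coefficients of x^0, x^1, x^2

sol = sp.solve(eqs, [h2, h3, h4], dict=True)[0]
for c in (h2, h3, h4):
    print(c, '=', sp.factor(sol[c]))
# expected: h2 = -(2*h1**2 - 2*h1 - k)/(4*h0), and h3, h4 as displayed above
```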

  • 0
    I wouldn't call the series a "formal power series", since it was meant to be the series expansion of the solution. That the solution has such an expansion is an assumption that needs to be justified; I believe it can be, but I will have to think about that some more. I agree that the coefficients are messy. My point in writing them down is to show that when $h_0\ne0$ there's a one-parameter family of (i.e., infinitely many) solutions. In contrast, when $h_0=0$, there is a finite number of solutions: either zero, one, or two. 2012-05-29