
I am currently thinking about how to a priori justify a power series ansatz to solve a nonlinear ODE of first order.

Consider, for example, $y'(x)=y(x)^2-x^2$ with $y(0)=1$. The right-hand side of the ODE is smooth, so any solution is smooth and, by the Picard–Lindelöf theorem, locally unique. Yet it is not necessarily real analytic.

Suppose we make a power series ansatz, say $y(x)=\sum_{n=0}^{\infty} a_n x^n$ for $x \in (-1,1)$. We compute the Cauchy product to get $y^2$ and obtain a recursive formula for the coefficients $a_n$ by comparing coefficients.
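For concreteness, the recursion obtained by comparing coefficients can be carried out mechanically. A minimal sketch in Python for $y'=y^2-x^2$, $y(0)=1$ (exact rational arithmetic via `fractions`; the function name `series_coeffs` is mine):

```python
from fractions import Fraction

def series_coeffs(N):
    """First N+1 coefficients a_0, ..., a_N of y = sum a_n x^n with
    y' = y^2 - x^2, y(0) = 1.  Comparing coefficients gives
    (n+1) a_{n+1} = sum_{k=0}^{n} a_k a_{n-k} - [n == 2]."""
    a = [Fraction(1)]                                        # a_0 = y(0) = 1
    for n in range(N):
        cauchy = sum(a[k] * a[n - k] for k in range(n + 1))  # n-th coefficient of y^2
        rhs = cauchy - (1 if n == 2 else 0)                  # subtract the x^2 term
        a.append(rhs / (n + 1))
    return a

a = series_coeffs(5)  # a == [1, 1, 1, 2/3, 5/6, 4/5] as Fractions
```

Note that the recursion itself never needs the convergence of the series: it is a purely formal computation, which is exactly the gap the question is about.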

The problem I have with this is that we must assume $\sum_{n=0}^{\infty} a_n x^n$ to be absolutely convergent in $(-1,1)$ merely to write down the Cauchy product. Obviously the convergence of this series depends on its coefficients; but we use the Cauchy product to calculate the coefficients...

Suppose we have calculated them and afterwards prove that the series is absolutely convergent in $(-1,1)$ and that it solves our ODE. Then everything is fine...
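Before attempting such a proof, one can at least check the convergence empirically. A sketch (floating point this time; it assumes the coefficients are eventually positive, which holds here since the recurrence only adds positive terms after $n=2$, so a ratio-test estimate is meaningful):

```python
def coeffs(N):
    # Same recurrence as from comparing coefficients:
    # (n+1) a_{n+1} = sum_{k=0}^{n} a_k a_{n-k} - [n == 2],  a_0 = 1.
    a = [1.0]
    for n in range(N):
        s = sum(a[k] * a[n - k] for k in range(n + 1))
        a.append((s - (1.0 if n == 2 else 0.0)) / (n + 1))
    return a

a = coeffs(200)

# Ratio test: if the nearest singularity is a simple pole at x = R > 0
# (a movable pole, as for y' = y^2), then a_n / a_{n+1} -> R.
# Average the tail of the ratios to smooth out fluctuations.
radius = sum(a[n] / a[n + 1] for n in range(150, 200)) / 50
```

Numerically the estimated radius comes out somewhat above $1$, consistent with comparing against $y'=y^2$, $y(0)=1$, which blows up at $x=1$. Of course this is evidence, not a proof.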

But my question is: Is there a cleaner way to justify the power series ansatz? For example, can one show that solutions of ODEs of the above type are real analytic?

  • Part of the point of an "ansatz" is that you justify it a posteriori, not a priori. You make a guess, solve, then afterward see that it worked. (Or see that it didn't work. These failures are mostly not included in your textbook, however.) – 2013-01-12

1 Answer


Consider an initial value problem
$$y'=f(x,y), \qquad y(0)=y_0\ ,$$
where $f$ is analytic in a neighborhood of $(0,0)$. This means, e.g., that there is a $\rho>0$ such that
$$f(x,y)=\sum_{j,k\geq0} c_{jk}\, x^j y^k \qquad \bigl(|x|^2+|y|^2<\rho^2\bigr)\ .$$
It is a basic fact of the theory of differential equations that this analytic initial value problem has a unique solution
$$y(x)=\phi(x) \qquad \bigl(|x|<\rho'\bigr)\ ,$$
valid in some neighborhood of $x=0$, and that this solution is itself analytic in a neighborhood of $x=0$:
$$\phi(x)=\sum_{k\geq 0} a_k x^k \qquad \bigl(|x|<\rho'\bigr)$$
(see, e.g., Coddington/Levinson, *Theory of Ordinary Differential Equations*, Theorem 8.1).

The only special thing about the "real case" is that all the given data (the $c_{jk}$ and $y_0$) are real, and consequently the $a_k$ appearing in the solution $\phi$ are real as well.