9

I am currently thinking about how to justify, a priori, a power series ansatz for solving a first-order nonlinear ODE.

Consider, for example, $y'(x)=y^2-x^2$ with $y(0)=1$. The right-hand side of the ODE is smooth, so any solution is smooth and, by the Picard–Lindelöf theorem, locally unique. Yet it is not necessarily real analytic.

Suppose we make a power series ansatz, say $y(x)=\sum_{n=0}^{\infty} a_n x^n ~~ $ for $ x \in (-1,1)$. We compute the Cauchy product to get $y^2$ and obtain a recursive formula for the coefficients $a_n$ by comparing coefficients.
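For concreteness, here is a minimal sketch of that recursion (my addition, not part of the question; the truncation order `N` and the use of exact fractions are arbitrary choices). Comparing the coefficient of $x^n$ on both sides of $y'=y^2-x^2$ gives $(n+1)\,a_{n+1}=\sum_{k=0}^{n} a_k a_{n-k}-\delta_{n,2}$ with $a_0=1$:

```python
from fractions import Fraction

# Ansatz y(x) = sum_{n>=0} a_n x^n for y' = y^2 - x^2, y(0) = 1.
# Comparing coefficients of x^n on both sides gives
#   (n+1) * a_{n+1} = sum_{k=0}^{n} a_k * a_{n-k} - (1 if n == 2 else 0),
# where the Cauchy product supplies the y^2 term and -x^2 only hits n = 2.
N = 10                 # truncation order (arbitrary choice)
a = [Fraction(1)]      # a_0 = y(0) = 1
for n in range(N):
    cauchy = sum(a[k] * a[n - k] for k in range(n + 1))  # Cauchy product for y^2
    a.append((cauchy - (1 if n == 2 else 0)) / (n + 1))

print(a[:6])  # a_0..a_5 = 1, 1, 1, 2/3, 5/6, 4/5
```

Note that this computation is purely formal: it only manipulates coefficients and never asks whether the series converges, which is exactly the issue raised next.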

The problem I have with this is that we must assume $\sum_{n=0}^{\infty} a_n x^n$ to be absolutely convergent in $(-1,1)$ just to write down the Cauchy product. Obviously the convergence of the series depends on its coefficients, but we use the Cauchy product to calculate those very coefficients...

Suppose we have calculated them and afterwards prove that the series is absolutely convergent in $(-1,1)$ and that it is indeed a solution of our ODE. Then everything is fine...
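As a numerical sanity check on that a posteriori step (again my addition, not part of the question; the interval, tolerances, and SciPy usage are my choices), one can compare the truncated series with a high-accuracy numerical solution on a small interval:

```python
from scipy.integrate import solve_ivp

# First few coefficients from the recursion above
a = [1, 1, 1, 2/3, 5/6, 4/5]

def partial_sum(x):
    """Truncated power series sum_n a_n x^n."""
    return sum(c * x**n for n, c in enumerate(a))

# High-accuracy numerical reference solution of y' = y^2 - x^2, y(0) = 1
sol = solve_ivp(lambda x, y: y**2 - x**2, (0.0, 0.2), [1.0],
                dense_output=True, rtol=1e-10, atol=1e-12)

x = 0.2
print(partial_sum(x), sol.sol(x)[0])  # should agree to roughly 1e-4
```

Of course this only checks plausibility at a point; it does not replace the convergence proof.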

But my question is: Is there a cleaner way to justify the power series ansatz? For example, can one show that solutions of ODEs of the above type are real analytic?

  • 2
    I would guess that if an ODE involves only _complex_-analytic functions, then its solutions must be _complex_-analytic. I don't know if this is true or not. In any case, for such an ODE we can talk about its formal solutions over a ring of formal power series, where one doesn't need any notion of convergence to talk about the Cauchy product; these solutions are a type of "weak solution", and one hopes to upgrade them to real solutions one way or another. (2011-05-11)
  • 5
    If the RHS is analytic, then the solutions are analytic, even for partial differential equations (with analytic initial conditions); see http://planetmath.org/encyclopedia/CauchyKovalevskayaTheorem.html. Usually this is proved by estimating the power series (method of majorants). (2011-05-11)
  • 3
    Part of the point of an "ansatz" is that you justify it a posteriori, not a priori. You make a guess, solve, then afterward see that it worked. (Or see that it didn't work. These failures are mostly not included in your textbook, however.) (2013-01-12)

1 Answer

3

Consider an initial value problem $$y'=f(x,y), \quad y(0)=y_0\ ,$$ where $f$ is analytic in a neighborhood of $(0,y_0)$. This means, e.g., that there is a $\rho>0$ such that $$f(x,y)=\sum_{j,k\geq0} c_{jk}\, x^j (y-y_0)^k\qquad \bigl(|x|^2+|y-y_0|^2<\rho^2\bigr)\ .$$ It is a basic fact of the theory of differential equations that this analytic initial value problem has a unique solution $$y(x)=\phi(x)\qquad \bigl(|x|<\rho'\bigr)\ ,$$ valid in some neighborhood of $x=0$, and that this solution is itself analytic in a neighborhood of $x=0$: $$\phi(x)=\sum_{k\geq 0} a_k x^k\qquad\bigl(|x|<\rho'\bigr)$$ (see, e.g., Coddington/Levinson, Theory of Ordinary Differential Equations, Theorem 8.1).

The only special thing about the "real case" is that all the given data (the $c_{jk}$ and $y_0$) are real, and consequently the coefficients $a_k$ of the solution $\phi$ are real as well.
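For illustration (my addition, not part of the original answer), the recursion behind this theorem can be made explicit by differentiating the ODE: since $a_k=\phi^{(k)}(0)/k!$ and $y''=f_x(x,y)+f_y(x,y)\,y'$, one reads off
$$a_0=y_0,\qquad a_1=f(0,y_0),\qquad a_2=\tfrac12\bigl(f_x(0,y_0)+f_y(0,y_0)\,a_1\bigr),\qquad\dots$$
For the question's $f(x,y)=y^2-x^2$ with $y_0=1$ this gives $a_1=1$ and $a_2=\tfrac12(0+2\cdot 1\cdot 1)=1$, in agreement with the Cauchy-product recursion.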