I am currently thinking about how to justify, a priori, a power series ansatz for solving a first-order nonlinear ODE.
Consider, for example, $y'(x)=y(x)^2-x^2$ with $y(0)=1$. The right-hand side of the ODE is smooth, so any solution is smooth and, by the Picard–Lindelöf theorem, locally unique. Yet it is not necessarily real analytic.
Suppose we make a power series ansatz, say $y(x)=\sum_{n=0}^{\infty} a_n x^n$ for $x \in (-1,1)$. We compute the Cauchy product to obtain $y^2$ and derive a recursive formula for the coefficients $a_n$ by comparing coefficients.
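For concreteness, if we proceed formally (taking all manipulations as justified for the moment), inserting the ansatz into $y'=y^2-x^2$ and matching the coefficient of $x^n$ on both sides yields

$$(n+1)\,a_{n+1} \;=\; \sum_{k=0}^{n} a_k\,a_{n-k} \;-\; \delta_{n,2}, \qquad a_0 = 1,$$

where $\delta_{n,2}$ accounts for the $-x^2$ term. This gives $a_1 = 1$, $a_2 = 1$, $a_3 = \tfrac{2}{3}$, and so on.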
The problem I have with this is that we must already assume $\sum_{n=0}^{\infty} a_n x^n$ to be absolutely convergent in $(-1,1)$ merely to write down the Cauchy product. Obviously the convergence of the series depends on its coefficients; but we use the Cauchy product to calculate those coefficients...
Suppose we have calculated them and afterwards prove that the series converges absolutely in $(-1,1)$ and that it solves our ODE. Then everything is fine...
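As a sanity check (not a proof of convergence), one can actually generate the coefficients from the formal recurrence and inspect their growth numerically. Here is a minimal sketch in Python; the function name `series_coeffs` is my own choice, and the recurrence encoded in it is the one obtained by formally matching coefficients in $y'=y^2-x^2$, $y(0)=1$:

```python
from fractions import Fraction

def series_coeffs(N):
    """Coefficients a_0, ..., a_N of the formal power series solution of
    y' = y^2 - x^2, y(0) = 1, from the coefficient-matching recurrence
        (n+1) a_{n+1} = sum_{k=0}^{n} a_k a_{n-k} - [n == 2].
    Exact rational arithmetic via Fraction avoids rounding issues."""
    a = [Fraction(1)]  # a_0 = y(0) = 1
    for n in range(N):
        s = sum(a[k] * a[n - k] for k in range(n + 1))  # Cauchy product term
        if n == 2:
            s -= 1  # the -x^2 term contributes only to the coefficient of x^2
        a.append(s / (n + 1))
    return a

coeffs = series_coeffs(10)
print(coeffs[:5])  # first few coefficients: 1, 1, 1, 2/3, 5/6
```

Examining ratios like `coeffs[n] / coeffs[n + 1]` then gives a rough empirical hint about the radius of convergence, though of course only an actual estimate (e.g. a majorant argument) settles the question.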
But my question is: Is there a cleaner way to justify the power series ansatz? For example, can one show that solutions of ODEs of the above type are real analytic?