Suppose we have $z=f(x)$, where $f$ is given by an infinite series, and we want to find $x = f^{-1}(z)$. Newton proposed the following method (as described in Dunham):
First, we write $x=z+r$. Substituting gives $z=f(z+r)$; we drop all terms quadratic or higher in $r$ and solve for $r = g(z)$. Then we drop all terms quadratic or higher in $z$, leaving $r \approx a + bz$. We repeat the process, writing $x=z+(a+bz)+r'$ and so forth, obtaining $x=z+r+r'+r''+\dots$.
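To make the iteration concrete, here is a small SymPy sketch of this style of successive substitution. It is only an illustration under my own choices: the test series $z = x - x^2/2 + x^3/3 - \cdots$ (the log series, whose exact inverse is $x = e^z - 1$), a working order `N`, and one liberty with the truncation, namely keeping each correction as a full series in $z$ up to the working order rather than cutting it down to $a + bz$.

```python
import sympy as sp

z, r = sp.symbols('z r')
N = 6  # working order for the illustration (my choice)

# Test series (my choice): z = x - x^2/2 + x^3/3 - ... = log(1 + x),
# truncated at degree N; its exact inverse is x = exp(z) - 1.
def f(x):
    return sum(sp.Rational((-1)**(k + 1), k) * x**k for k in range(1, N + 1))

x_approx = z  # first guess: x is roughly z
for _ in range(4):
    # substitute x = (current approximation) + r into z = f(x)
    eq = sp.expand(f(x_approx + r) - z)
    # drop all terms quadratic or higher in the correction r, then solve for r
    r_sol = -eq.subs(r, 0) / eq.coeff(r, 1)
    # keep the correction as a series in z up to the working order
    # (rather than truncating it to a + b*z)
    r_trunc = sp.series(r_sol, z, 0, N + 1).removeO()
    x_approx = sp.expand(x_approx + r_trunc)

print(x_approx)                                # computed inverse series
print(sp.series(sp.exp(z) - 1, z, 0, N + 1))   # exact inverse, for comparison
```

Run as written, the accumulated corrections should reproduce $z + z^2/2 + z^3/6 + \dots$, the expansion of $e^z - 1$, through degree `N`.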
I greatly enjoy this method, as it saves one the work of, well, finding the actual inverse. But I wonder: will this method always work? Intuitively, it seems like there must be "poorly behaved" series for which $z+r+\dots$ does not converge to $x$.