Suppose we have $z=f(x)$ with $f$ an infinite series. We want to find $f^{-1}(z)=x$. Newton proposed the following method (as described in Dunham):

First, we write $x=z+r$. Substituting gives $z=f(z+r)$; we drop all terms quadratic or higher in $r$ and solve for $r = g(z)$. Then we drop any quadratic or higher terms in $z$ to get $r \approx a + bz$. We repeat, writing $x=z+(a+bz)+r'$ and so forth, building up $x=z+r+r'+r''+\dots$.
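For concreteness, here is the first round of the procedure for the illustrative choice $f(x) = x + x^2$ (my own example, not one from Dunham). Writing $x = z + r$ and substituting,
$$z = (z+r) + (z+r)^2 = z + r + z^2 + 2zr + r^2.$$
Dropping the $r^2$ term and solving,
$$r = -\frac{z^2}{1+2z} = -z^2 + 2z^3 - \cdots \approx -z^2$$
(here $f'(0)=1$, so the leading correction starts at order $z^2$ rather than being linear). Repeating with $x = z - z^2 + r'$ gives $r' \approx 2z^3$, and the process builds up $x = z - z^2 + 2z^3 - 5z^4 + \cdots$, which one can check satisfies $x + x^2 = z$ to the computed order.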

I greatly enjoy this method, as it saves one the work of, well, finding the actual inverse. But I wonder: will this method always work? Intuitively, it seems like there must be "poorly behaved" series for which $z+r+\dots$ does not converge to $x$.


It works if, for simplicity, we assume $f(x)=0+x+O(x^2)$, i.e. $f$ has no constant term and its linear coefficient is $1$.
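(The normalization matters: for instance, if $f(x) = x^2$, then no power series $g$ with $g(0)=0$ satisfies $f(g(z)) = z$, since $g(z)^2$ has no linear term. More generally, $f(x) = a_1 x + O(x^2)$ with $a_1 \neq 0$ reduces to this case after rescaling.)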

In fact, you are constructing a sequence of power series $(g_k)_k$, starting with $g_1(x)=x$, so that $f(g_1(x))-x=O(x^2)$. Then you find $g_{k+1}$ from $g_k$ such that $g_{k+1}-g_k=O(x^{k+1})$ and $f(g_{k+1}(x))-x=O(x^{k+2})$; inductively, $f(g_k(x))-x=O(x^{k+1})$ for every $k$. Therefore, the sequence $(g_k)_k$ converges to some $g$ in the sense of convergence we have for formal power series (coefficientwise: each coefficient eventually stops changing), and $f(g(x))=x$ in the sense of formal power series.

Note that I explicitly say formal power series. However, if $f$ is analytic (has positive radius of convergence), then the analytic inverse is guaranteed to exist near $0$, since $f'(0)=1\neq 0$, and the power series we find is precisely the Taylor series of this analytic inverse.
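Here is a minimal sketch of this iteration using Python with sympy (my own illustration; `invert_series` is just a name I chose, and $f$ is assumed to be given as an expression in $x$ with $f(x)=x+O(x^2)$):

```python
import sympy as sp

x, z = sp.symbols('x z')

def invert_series(f, order=6):
    """Build g(z) with f(g(z)) = z + O(z^order), assuming f(x) = x + O(x^2)."""
    g = z  # g_1(z) = z, so f(g_1(z)) - z = O(z^2)
    for k in range(2, order):
        # residual f(g(z)) - z, truncated at the working order
        res = sp.series(f.subs(x, g), z, 0, order).removeO() - z
        # the lowest-order error sits at z^k; cancel it by correcting g there
        g -= sp.expand(res).coeff(z, k) * z**k
    return g

print(invert_series(x + x**2))  # z - z**2 + 2*z**3 - 5*z**4 + 14*z**5
```

Each pass of the loop is one step $g_k \mapsto g_{k+1}$ from above: it cancels the lowest-order error coefficient, leaving all lower coefficients untouched. For $f(x) = x + x^2$ this reproduces $z - z^2 + 2z^3 - 5z^4 + 14z^5$, the beginning of the series of the true inverse $(\sqrt{1+4z}-1)/2$, so the formal computation agrees with the analytic inverse as claimed.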