When does Newton-Raphson Converge/Diverge?

Is there an analytical way to know an interval where all points when used in Newton-Raphson will converge/diverge?

I am aware that Newton-Raphson is a special case of fixed point iteration, where:

$$ g(x) = x - \frac{f(x)}{f'(x)} $$

Also, I've read that if $|f(x)\cdot f''(x)|/|f'(x)|^2 \lt 1$, then convergence is assured. I am just not sure how to use this fact. Could anyone give me some examples? Thanks.


Consider the solution of \begin{equation} f(x) = 0, \end{equation} where $f : \mathbb{R} \rightarrow \mathbb{R}$ is at least two times differentiable with continuous derivatives and has a single root $x=r$ of multiplicity $1$. This last assumption ensures \begin{equation} f'(r) \not = 0, \end{equation} which will be needed later. Let $x_n$ denote an approximation of $r$ obtained by any means necessary. Then a Taylor expansion at $x=x_n$ gives \begin{equation} 0 = f(r) = f(x_n) + f'(x_n)(r-x_n) + \frac{f''(\xi_n)}{2}(r-x_n)^2, \end{equation} or equivalently \begin{equation} - f(x_n) = f'(x_n)(r-x_n) + \frac{f''(\xi_n)}{2}(r-x_n)^2, \end{equation} for at least one $\xi_n$ between $r$ and $x_n$. This allows us to express Newton's iteration as \begin{equation} x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)} = x_n + \frac{f'(x_n)(r-x_n) + \frac{f''(\xi_n)}{2}(r-x_n)^2}{f'(x_n)}. \end{equation} Though it looks convoluted, this representation allows us to conclude immediately that \begin{equation} r - x_{n+1} = -\frac{f''(\xi_n)}{2 f'(x_n)}(r-x_n)^2. \end{equation} This is the equation you can use to show convergence of Newton's method.

Define the error at the $n$th step as \begin{equation} e_n = r - x_n; \end{equation} then we can write \begin{equation} e_{n+1} = - \frac{f''(\xi_n)}{2 f'(x_n)} e_n^2. \end{equation} Now, since $f'(r) \not = 0$, we can find an interval $I = [r-\delta,r+\delta]$ surrounding the root and determine a constant $M > 0$ such that \begin{equation} \left| \frac{f''(x)}{2 f'(y)} \right| \leq M \quad \forall \, x, y \in I. \end{equation} Here the continuity of $f'$ and $f''$ is critical. Then we can write \begin{equation} |e_{n+1}| \leq M |e_n|^2 = (M|e_n|)\,|e_n|. \end{equation} It follows that if $x_0 \in I$ is picked such that \begin{equation} M|e_0| \leq \rho < 1, \end{equation} then not only will the error decrease, but (and this is critical) $x_1$ will belong to $I$, allowing the argument to be repeated, leading to the (pessimistic) estimate \begin{equation} |e_n| \leq \rho^n |e_0|, \end{equation} which nevertheless establishes (local) convergence of Newton's method.
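The contraction bound $|e_{n+1}| \leq M |e_n|^2$ is easy to check numerically. Here is a minimal sketch in Python, using the concrete example $f(x) = x^2 - 2$ (my choice for illustration, not part of the argument above): on $I = [1,2]$ we have $|f''(x)| = 2$ and $|f'(y)| \geq 2$, so $M = \tfrac{1}{2}$ works.

```python
import math

def newton(f, fprime, x0, steps):
    """Run `steps` Newton iterations, returning every iterate."""
    xs = [x0]
    for _ in range(steps):
        x = xs[-1]
        xs.append(x - f(x) / fprime(x))
    return xs

# f(x) = x^2 - 2 has root r = sqrt(2); on I = [1, 2] we have
# |f''(x)| = 2 and |f'(y)| >= 2, so M = max|f''| / (2 min|f'|) = 0.5.
f = lambda x: x * x - 2
fp = lambda x: 2 * x
r = math.sqrt(2)
M = 0.5

xs = newton(f, fp, x0=2.0, steps=4)
errors = [r - x for x in xs]

# The bound |e_{n+1}| <= M |e_n|^2 from the derivation above holds
# at every step (a tiny slack absorbs floating-point rounding).
for en, en1 in zip(errors, errors[1:]):
    assert abs(en1) <= M * en * en + 1e-15
print([abs(e) for e in errors])
```

Note that $M|e_0| \approx 0.29 < 1$ here, so the starting point $x_0 = 2$ satisfies the hypothesis of the argument and the errors shrink at every step.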

As the iteration converges, it will sooner rather than later do so quadratically, as \begin{equation} \frac{e_{n+1}}{e_n^2} = - \frac{f''(\xi_n)}{2 f'(x_n)} \rightarrow -\frac{f''(r)}{2 f'(r)}, \quad n\rightarrow \infty. \end{equation} Here it is critical that Taylor's theorem ensures that $\xi_n$ is between $x_n$ and $r$. Since $x_n \rightarrow r$, the squeeze lemma ensures that $\xi_n \rightarrow r$ as $n \rightarrow \infty$.
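This limit can also be observed directly. A short sketch, again using the illustrative example $f(x) = x^2 - 2$ (an assumption of mine, not from the answer itself), where the limiting ratio is $-f''(r)/(2f'(r)) = -1/(2\sqrt{2})$:

```python
import math

# Quadratic convergence check for f(x) = x^2 - 2: the ratio
# e_{n+1} / e_n^2 should approach -f''(r) / (2 f'(r)) = -1/(2*sqrt(2)).
f = lambda x: x * x - 2
fp = lambda x: 2 * x
r = math.sqrt(2)

x, errors = 2.0, [r - 2.0]
for _ in range(4):
    x = x - f(x) / fp(x)
    errors.append(r - x)

ratios = [e1 / (e0 * e0) for e0, e1 in zip(errors, errors[1:])]
limit = -1 / (2 * math.sqrt(2))        # = -f''(r) / (2 f'(r))
print(ratios, limit)
```

The successive ratios settle down to roughly $-0.3536$ after only a few steps, matching the predicted limit.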


A theoretically nice but practically nearly useless answer is provided by the Newton-Kantorovich theorem: let $L$ be an upper bound for the magnitude of the second derivative over some interval $I$, let $x_0\in I$, and let $s_0=-\frac{f(x_0)}{f'(x_0)}$ be the first Newton step. If the "ball" $B(x_0+s_0,|s_0|)=(x_0+s_0-|s_0|,\,x_0+s_0+|s_0|)$ is contained in $I$ and $$ L\cdot|f'(x_0)^{-1}|^2\cdot|f(x_0)|\le\frac12, $$ then there is a unique root inside that ball and Newton's method started at $x_0$ converges towards it.
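To see how the hypotheses are verified in practice, here is a sketch that checks both conditions for the concrete instance $f(x) = x^2 - 2$ on $I = [1,2]$ with $x_0 = 1.5$ (this example is my own illustration, not part of the theorem statement):

```python
# Checking the Newton-Kantorovich hypotheses for f(x) = x^2 - 2
# on I = [1, 2], starting from x0 = 1.5.
f = lambda x: x * x - 2
fp = lambda x: 2 * x

a, b = 1.0, 2.0          # the interval I
L = 2.0                  # upper bound for |f''| on I (f'' is constant here)
x0 = 1.5

s0 = -f(x0) / fp(x0)                 # first Newton step
center, radius = x0 + s0, abs(s0)
ball_in_I = a <= center - radius and center + radius <= b
h = L * (1 / fp(x0)) ** 2 * abs(f(x0))

# Both hypotheses hold: the ball lies inside I and h <= 1/2, so the
# theorem guarantees a unique root in the ball (indeed sqrt(2) is in it)
# and convergence of Newton's method from x0.
print(ball_in_I, h, h <= 0.5)
```

Here $h = 2 \cdot (1/3)^2 \cdot 1/4 = 1/18 \le 1/2$ and the ball is $(4/3,\, 3/2) \subset [1,2]$, which indeed contains $\sqrt{2}$, so both hypotheses are comfortably satisfied.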