Need specific example where Newton's method doesn't converge [duplicate]

Example for Case (A): $$f(x) = \frac{1}{1+x^2} - \frac{1}{2},$$ which has roots at $x \in \{-1,1\}$. Starting from $x_0 = 2$, the iteration converges to the negative root $x = -1$, even though $x = 1$ is the nearer root.
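A quick numerical check of this claim (a sketch in plain Python, assuming ordinary floating-point arithmetic):

```python
# Sketch: Newton's method on f(x) = 1/(1+x**2) - 1/2 from x0 = 2.
# The derivative is f'(x) = -2x / (1+x**2)**2.

def f(x):
    return 1.0 / (1.0 + x * x) - 0.5

def df(x):
    return -2.0 * x / (1.0 + x * x) ** 2

x = 2.0
for _ in range(50):
    x = x - f(x) / df(x)

print(x)  # ends up at the negative root x = -1, not the nearby root x = +1
```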

Example for Case (B): $$f(x) = \begin{cases}\sqrt{x}, & x \ge 0 \\ -\sqrt{-x}, & x < 0 \end{cases}$$ has the peculiar property that for any initial guess $x_0 \ne 0$, the orbit is trapped in a cycle of period $2$, with $x_k = -x_{k-1}$. The proof is a one-line computation: for $x > 0$ we have $f'(x) = \frac{1}{2\sqrt{x}}$, so $x_k = x_{k-1} - \sqrt{x_{k-1}} \cdot 2\sqrt{x_{k-1}} = -x_{k-1}$, and the case $x < 0$ follows by odd symmetry.
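The cycle is easy to observe numerically (a sketch; `math.copysign` is used here to implement the sign convention of the piecewise definition):

```python
import math

# Sketch: Newton's method on f(x) = sign(x) * sqrt(|x|).
# Analytically x_{k+1} = -x_k, so every nonzero start cycles with period 2.

def f(x):
    return math.copysign(math.sqrt(abs(x)), x)

def df(x):
    # derivative of sign(x)*sqrt(|x|) for x != 0 is 1 / (2*sqrt(|x|))
    return 1.0 / (2.0 * math.sqrt(abs(x)))

x = 0.7  # arbitrary nonzero starting guess
orbit = [x]
for _ in range(6):
    x = x - f(x) / df(x)
    orbit.append(x)

print(orbit)  # alternates 0.7, -0.7, 0.7, ... (up to rounding)
```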

Example for Case (C): $$f(x) = x^{1/3}.$$ Since $f'(x) = \tfrac{1}{3}x^{-2/3}$, the Newton recursion reduces to $x_{n+1} = x_n - 3x_n = -2x_n$: the iterates double in magnitude at every step and diverge for every $x_0 \ne 0$. The only fixed point of the recursion is the root $x = 0$ itself.
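The doubling shows up immediately in floating point (a sketch; the `cbrt` helper is an illustrative name, needed because `x**(1/3)` is not real-valued for negative `x` in Python):

```python
import math

# Sketch: Newton's method on f(x) = x**(1/3) (real cube root).
# f'(x) = (1/3) x**(-2/3), so the update collapses to x_{n+1} = -2 x_n.

def cbrt(x):
    # real cube root, valid for negative inputs too
    return math.copysign(abs(x) ** (1.0 / 3.0), x)

def f(x): return cbrt(x)
def df(x): return 1.0 / (3.0 * cbrt(x) ** 2)

x = 0.01  # even a start very close to the root blows up
orbit = [x]
for _ in range(10):
    x = x - f(x) / df(x)
    orbit.append(x)

print(orbit)  # 0.01, -0.02, 0.04, -0.08, ... doubling while alternating in sign
```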


For $A$, take a function with multiple zeros, such as $f(x) = \sin x$, whose roots are $x = n\pi$. Depending on your starting point, the iteration can converge to any of these roots. Even if you want it to converge to a particular root, say $x = \pi$, a start such as $x_0 = 7\pi/4$ will carry you to a different one.
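A short check of that particular start (a sketch; for $f(x) = \sin x$ the Newton step is $x - \tan x$):

```python
import math

# Sketch: Newton's method on f(x) = sin(x), started at x0 = 7*pi/4.
# One might hope for the root pi, but the iteration drifts to 2*pi instead.

x = 7.0 * math.pi / 4.0
for _ in range(20):
    x = x - math.sin(x) / math.cos(x)   # Newton step: x - tan(x)

print(x / math.pi)  # ~2.0: converged to the root 2*pi, not pi
```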

For $B$, imagine a function whose tangent lines overshoot: the tangent cuts the axis at a point beyond the root, and repeating the step from the new point brings you back to the original one, so the iterates cycle instead of converging. This happens, for instance, with $f(x) = \arctan x$: there is a critical starting value $x_0 \approx 1.3917$, the solution of $2x = (1+x^2)\arctan x$, at which the iterates oscillate forever between $x_0$ and $-x_0$ around the root $x = 0$.
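For $f(x) = \arctan x$ the back-and-forth behaviour happens exactly at a critical starting value, the root of $(1+x^2)\arctan x - 2x$ on $[1,2]$. A sketch that locates it by bisection (the bracket $[1,2]$ is an assumption based on the sign change there) and then takes one Newton step:

```python
import math

# Sketch: the period-2 cycle of Newton's method on f(x) = arctan(x).
# The cycle condition x1 = -x0 reads 2*x = (1 + x**2)*atan(x); we solve
# g(x) = (1 + x**2)*atan(x) - 2*x = 0 by bisection on [1, 2].

def g(x):
    return (1.0 + x * x) * math.atan(x) - 2.0 * x

lo, hi = 1.0, 2.0  # g(1) < 0 < g(2)
for _ in range(80):
    mid = 0.5 * (lo + hi)
    if g(mid) > 0:
        hi = mid
    else:
        lo = mid
x_star = 0.5 * (lo + hi)

x1 = x_star - (1.0 + x_star ** 2) * math.atan(x_star)  # one Newton step
print(x_star, x1)  # x1 equals -x_star: the iterates oscillate forever
```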

For $C$, if your starting point is too close to a stationary point, then $f'(x_0) \approx 0$: the tangent is nearly horizontal, and the next iterate is thrown very far from the root, so convergence can be lost entirely or badly delayed.


(A) $f(x) = x^2-1$ and start near $x=-1$ when you 'want' $x=1$.

(B) $f(x) = \sqrt{1+x^2}$ and start at $x=1$.
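A detail worth noting for (B): $\sqrt{1+x^2}$ has no real root at all, so Newton's method cannot converge to anything. In fact the step simplifies to $x - \frac{1+x^2}{x} = -\frac{1}{x}$, so $x_0 = 1$ produces a 2-cycle. A sketch:

```python
import math

# Sketch for (B): f(x) = sqrt(1 + x**2) has no real root, so Newton cannot
# converge. The step simplifies to -1/x, giving the 2-cycle 1, -1, 1, ...

def f(x): return math.sqrt(1.0 + x * x)
def df(x): return x / math.sqrt(1.0 + x * x)

x = 1.0
orbit = [x]
for _ in range(5):
    x = x - f(x) / df(x)
    orbit.append(x)

print(orbit)  # alternates between 1 and -1 (up to rounding)
```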

(C) $f(x) = \arctan x$ and $x_0>2$ (needs a little work to show that the Newton iteration diverges).
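The divergence in (C) is easy to see numerically (a sketch; the step for $\arctan$ is $x - (1+x^2)\arctan x$, and beyond the critical value $\approx 1.3917$ each iterate overshoots farther):

```python
import math

# Sketch for (C): Newton's method on f(x) = arctan(x) with x0 = 2.
# The step is x - (1 + x**2)*atan(x); each iterate overshoots farther,
# so |x_n| grows without bound.

x = 2.0
magnitudes = []
for _ in range(6):
    x = x - (1.0 + x * x) * math.atan(x)
    magnitudes.append(abs(x))

print(magnitudes)  # strictly increasing: the iteration diverges
```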

(D) $f(x) = x^3-1$ with $x_0 = 0$.
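For (D) the failure is immediate: $f'(0) = 0$, so the tangent at the starting point is horizontal and the very first Newton step divides by zero. A sketch:

```python
# Sketch for (D): f(x) = x**3 - 1 with x0 = 0. Here f'(0) = 0, so the
# tangent at the starting point is horizontal and the step divides by zero.

def f(x): return x ** 3 - 1.0
def df(x): return 3.0 * x * x

x0 = 0.0
try:
    x1 = x0 - f(x0) / df(x0)
except ZeroDivisionError:
    x1 = None  # the iteration cannot even take its first step

print(x1)  # None
```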