Rate of convergence of Secant Method

If the Secant Method converges to $r$ with $f'(r)\neq0$ and $f''(r)\neq0$, then we have the approximate error relationship

$$e_{i+1}\approx\left|\frac{f''(r)}{2f'(r)}\right| e_i e_{i-1}.$$

From this it also follows that

$$e_{i+1}\approx\left|\frac{f''(r)}{2f'(r)}\right|^{\alpha-1} e_i^\alpha.$$

I would like to show that Newton's method is generally faster than the secant method, so I thought I could compare the rates of convergence. But the two rates have different forms: for Newton's method the relevant ratio is $e_{i+1}/e_i^2$, while for the secant method it is $e_{i+1}/e_i^\alpha$. How do we compare them?
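As a concrete sanity check on the second relation, here is a short Python sketch (my own illustration, using the toy problem $f(x)=x^2-2$ with known root $r=\sqrt2$) that runs the secant iteration and estimates $\alpha$ from consecutive errors via $\alpha\approx\log(e_{i+1}/e_i)/\log(e_i/e_{i-1})$:

```python
import math

# Toy test problem with a known root, so the true error is available.
f = lambda x: x * x - 2.0
r = math.sqrt(2.0)

# Secant iteration from two starting points bracketing the root.
x0, x1 = 1.0, 2.0
errors = [abs(x0 - r), abs(x1 - r)]
for _ in range(5):
    x2 = x1 - f(x1) * (x1 - x0) / (f(x1) - f(x0))
    x0, x1 = x1, x2
    errors.append(abs(x2 - r))

# If e_{i+1} ~ C e_i^alpha, then alpha ~ log(e_{i+1}/e_i) / log(e_i/e_{i-1}).
alphas = [math.log(errors[i + 1] / errors[i]) / math.log(errors[i] / errors[i - 1])
          for i in range(2, len(errors) - 1) if errors[i + 1] > 0]
print(alphas)
```

The estimates drift toward $\phi\approx1.618$ as the iteration enters its asymptotic regime; the early estimates are unreliable because the constant in the error relation still dominates.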


Solution 1:

Newton's method takes, in the best case, two function evaluations per step, $f(x_n)$ and $f'(x_n)$, to reduce the error by an exponent of $2$, that is, $e_{n+1}\sim Ce_n^2$. Distributed over the two evaluations this gives an exponent of $\sqrt2\approx1.41$ per function evaluation, as in $e_{n+\frac12}=ce_n^{\sqrt2}$, $e_{n+1}=ce_{n+\frac12}^{\sqrt2}=c^{1+\sqrt2}e_n^2$. This assumes that the derivative evaluation is about as expensive as the function evaluation, as in the polynomial case. The per-evaluation exponent is even smaller if the derivative is more expensive to evaluate, which is typical in the non-scalar case.
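The quadratic step behaviour is easy to observe numerically. The sketch below (my own, again on the toy problem $f(x)=x^2-2$) tracks the ratio $e_{n+1}/e_n^2$, which should settle near $\left|\frac{f''(r)}{2f'(r)}\right|=\frac1{2\sqrt2}\approx0.354$:

```python
import math

# Toy problem: each Newton step costs one f and one f' evaluation.
f = lambda x: x * x - 2.0
df = lambda x: 2.0 * x
r = math.sqrt(2.0)

x = 2.0
errors = [abs(x - r)]
for _ in range(4):
    x = x - f(x) / df(x)   # one f and one f' evaluation per step
    errors.append(abs(x - r))

# e_{n+1} / e_n^2 should approach |f''(r) / (2 f'(r))| = 1/(2*sqrt(2)).
ratios = [errors[i + 1] / errors[i] ** 2
          for i in range(len(errors) - 1) if errors[i + 1] > 0]
print(ratios)
```

Only a handful of steps are taken, since the error hits machine precision almost immediately; that is the quadratic convergence in action.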

The secant method, in the case that it converges at all, takes one new function evaluation per step (the previous value is reused) and reduces the error by an exponent of $\phi=\alpha=\frac{\sqrt5+1}2\approx1.62$ per evaluation.

Measured per function evaluation, the secant method therefore converges faster, since $1.62>1.41$. Newton's method might be a little more robust in achieving convergence.
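The per-evaluation comparison can be made concrete with a head-to-head run (my own sketch, again on $f(x)=x^2-2$) that gives both methods the same budget of eight function evaluations:

```python
import math

# Toy problem; f and f' are assumed to cost one evaluation each.
f = lambda x: x * x - 2.0
df = lambda x: 2.0 * x
r = math.sqrt(2.0)

# Newton: 2 evaluations (f and f') per step -> 4 steps for 8 evaluations.
x, newton = 2.0, []
for _ in range(4):
    x -= f(x) / df(x)
    newton.append(abs(x - r))

# Secant: 2 startup evaluations, then 1 new evaluation per step -> 6 steps.
x0, x1 = 1.0, 2.0
f0, f1 = f(x0), f(x1)
secant = []
for _ in range(6):
    x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
    x0, f0 = x1, f1
    x1, f1 = x2, f(x2)   # the single new evaluation of this step
    secant.append(abs(x2 - r))

print("Newton error after 8 evaluations:", newton[-1])
print("Secant error after 8 evaluations:", secant[-1])
```

With the evaluation budget held fixed, the secant iterate ends up closer to the root, matching the $1.62$ versus $1.41$ argument above.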

In the scalar situation, bracketing methods such as variants of Regula Falsi or Dekker's method sacrifice some of the speed of the secant method in order to keep an interval with a sign change, and they guarantee its reduction by inserting an occasional bisection step or similar.
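To make the safeguarding idea concrete, here is a minimal sketch (my own simplification, not Dekker's actual algorithm): a secant step taken from the bracket endpoints, with a bisection step forced whenever the bracket has failed to halve, so progress is guaranteed:

```python
def safeguarded_secant(f, a, b, tol=1e-12, max_iter=200):
    """Simplified bracketing root finder: secant steps with a
    bisection fallback. Not Dekker's method, just the same idea."""
    fa, fb = f(a), f(b)
    assert fa * fb < 0, "need a sign change on [a, b]"
    prev_width = abs(b - a)
    for _ in range(max_iter):
        # Secant (regula falsi) step from the bracket endpoints;
        # with opposite signs at a and b it always lands inside [a, b].
        x = b - fb * (b - a) / (fb - fa)
        # Safeguard: if the bracket has not at least halved since the
        # previous iteration, force a bisection step instead.
        if abs(b - a) > 0.5 * prev_width:
            x = 0.5 * (a + b)
        prev_width = abs(b - a)
        fx = f(x)
        if fx == 0.0:
            return x
        # Keep the half of the bracket that still contains the sign change.
        if fa * fx < 0:
            b, fb = x, fx
        else:
            a, fa = x, fx
        if abs(b - a) < tol:
            return 0.5 * (a + b)
    return 0.5 * (a + b)

# Classic test equation x^3 - 2x - 5 = 0 with a sign change on [2, 3].
root = safeguarded_secant(lambda x: x**3 - 2*x - 5, 2.0, 3.0)
print(root)
```

The bracket at least halves every two iterations, so convergence is guaranteed, while the secant steps still provide superlinear speed whenever they are accepted.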