Relationship between rate of convergence and order of convergence

What is the difference between rate of convergence and order of convergence? Do they have any relationship to each other? For example, could I have two sequences with the same rate of convergence but different orders of convergence, and vice versa?


The order of convergence is one of the primary ways to estimate the actual rate of convergence, i.e. the speed at which the errors go to zero. The order measures the asymptotic behavior of the convergence, often only up to constants. For example, Newton's method is said to converge quadratically, so it has order 2. However, the true rate of convergence depends on the problem, the initial value taken, and so on, and is typically impossible to quantify exactly; the order simply estimates this rate in terms of its leading power-law behavior.
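Here is a minimal sketch of what "order 2" means in practice, using Newton's method on $f(x) = x^2 - 2$ with starting point $x_0 = 1.5$ (both the function and the starting point are just illustrative choices). The ratio $e_{n+1}/e_n^2$ settles near a constant, which is what order 2 promises; the value of that constant, and hence the actual speed, depends on the function and the initial guess.

```python
import math

def f(x):
    return x * x - 2.0

def df(x):
    return 2.0 * x

root = math.sqrt(2.0)   # exact root, used only to measure the error

x = 1.5                 # illustrative starting point
prev_err = abs(x - root)
for n in range(3):
    x -= f(x) / df(x)   # one Newton step
    err = abs(x - root)
    # For quadratic convergence err / prev_err**2 approaches a constant
    # (here roughly 0.35); the errors themselves shrink extremely fast.
    print(f"step {n + 1}: error = {err:.3e}, error / prev_error^2 = {err / prev_err**2:.3f}")
    prev_err = err
```

After a few steps the error reaches machine precision, at which point the ratio is no longer meaningful.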

The order of convergence doesn't tell you everything. A numerical integration scheme with step size $h$ could have cubic order of convergence, meaning the error goes as $O(h^3)$, but the true error could be $100000h^3 + \ldots$, so for many practical step sizes the actual rate of convergence is quite slow.
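A small sketch of this point, using two hypothetical error models (the constants are made up for illustration): both behave as $h^3$, so halving $h$ cuts both errors by a factor of 8 and the observed order $\log_2\!\big(e(h)/e(h/2)\big)$ is 3 in either case, but the absolute errors differ by a factor of $100000$.

```python
import math

def err_small(h):
    # assumed error model: 1 * h^3
    return h ** 3

def err_large(h):
    # assumed error model: 100000 * h^3, same order, much larger constant
    return 1e5 * h ** 3

for h in [0.1, 0.05, 0.025]:
    # observed order from halving the step size; identical for both models
    p = math.log2(err_small(h) / err_small(h / 2))
    print(f"h = {h:5.3f}: errors {err_small(h):.1e} vs {err_large(h):.1e}, "
          f"observed order {p:.1f}")
```

Under these assumed models, reaching an error of $10^{-6}$ requires $h \approx 10^{-2}$ for the first scheme but $h \approx 2 \times 10^{-4}$ for the second, even though both have the same order.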