Necessary or sufficient conditions for a recursively defined sequence to be convergent?

The monotone convergence theorem gives a sufficient condition for convergence: every bounded monotone sequence of real numbers converges. In most examples/exercises/exam problems on this theorem, students are given a recursively defined sequence, say of real numbers, and are asked to show that it is bounded and monotone. Suppose the recursion is $x_{n+1}=f(x_n, n)$ with $x_0=a,$ where $f$ is a polynomial, a rational function, or at most an algebraic function. The recipe for most exercises of this type is: find the candidate limit by solving the equation $l=\lim_{n\to\infty} f(l, n)$ for $l,$ and then show that $x_n$ is bounded and monotone. Most of the time showing boundedness is easy; it involves standard inequalities (such as the AM-GM inequality or the Cauchy–Schwarz inequality) or mathematical induction. Many examples of this process can be found on this site.
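To make the recipe concrete, here is a standard illustrative instance (my own choice, not taken from any particular exercise). Take $x_0=0$ and $x_{n+1}=\sqrt{2+x_n}$. Solving $l=\sqrt{2+l}$ gives $l^2-l-2=0$, so $l=2$ after discarding the negative root. Induction gives $0\le x_n<2$ for all $n$ (if $x_n<2$ then $x_{n+1}=\sqrt{2+x_n}<\sqrt{4}=2$), and
$$x_{n+1}-x_n=\sqrt{2+x_n}-x_n=\frac{(2-x_n)(1+x_n)}{\sqrt{2+x_n}+x_n}>0,$$
so the sequence is increasing and bounded above, hence convergent, and the limit is $2$.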

Here my question is: what are necessary or sufficient conditions on $f$ for the recursion to define a convergent sequence?


The case where $\,f(x,n)=g(x)\,$ does not depend on $\,n\,$ reduces to a fixed-point problem and has been discussed on the site before; see for example 1, 2, 3, 4.

The following are a few sufficient conditions for monotonicity for a general $\,f(x,n)\,$; boundedness still has to be checked separately.

  • $x_{n+1}-x_n = f(x_n,n) - x_n,$ so a sufficient condition for monotonicity is that $\,f(x,n)-x\,$ keeps the same sign over the domain of interest. If $\,f(x,n)-x \ge 0\,$ then $\,x_{n+1} \ge x_n,$ so $\,x_n\,$ is an increasing sequence; if instead $\,f(x,n) - x \le 0\,$ then $\,x_n\,$ is decreasing (example). A numerical sketch of this check is given after the list.

    If $\,f\,$ is continuous, then a sufficient condition for $\,f(x,n)-x\,$ to keep the same sign is that it never vanishes, i.e. that the equation $\,f(x,n) = x\,$ has no solutions in the domain of interest.

  • For positive sequences $\,x_n\,$ an entirely similar argument can be made by comparing $\,\frac{f(x,n)}{x}\,$ with $\,1\,$. This is in fact the same as the previous argument, applied to the sequence $\,\ln x_n\,$.

  • $\,x_{n+1}-x_n = f(x_n,n)-f(x_{n-1},n-1)\,$ so a sufficient condition for monotonicity is that $\,f(x,n)-f(y,n-1)\,$ has the same sign as $\,x-y\,$. In that case the differences $\,x_{n+1}-x_n\,$ between consecutive terms all have the same sign, so the direction of monotonicity is ultimately determined by $\,x_1-x_0\,$ (example).

    A sufficient condition for this to happen is for $\,f\,$ to be increasing in each of the variables. If $\,f\,$ is differentiable, the condition translates to $\,\frac{\partial f}{\partial x}(x,n) \ge 0\,$ and $\,\frac{\partial f}{\partial n}(x,n) \ge 0\,$.

    If $\,f(x,n)=g(x)\,$ does not depend on $\,n\,$ then the condition reduces to $\,g\,$ being increasing. If $\,g\,$ is differentiable, then the condition translates to $\,g'(x) \ge 0\,$.

  • Same as above, but if $\,f(x,n)-f(y,n-1)\,$ and $\,x-y\,$ have opposite signs then the differences $\,x_{n+1}-x_n\,$ change sign at each step, so the subsequences of even and odd indices are each monotonic, in opposite directions (example); a numerical sketch of this case is also given below. In the special cases above, the condition becomes $\,\frac{\partial f}{\partial x}(x,n) \le 0\,$ and $\,\frac{\partial f}{\partial n}(x,n) \le 0\,$, or $\,g(x)\,$ decreasing, or $\,g'(x) \le 0\,$.
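To see the first criterion in action numerically, here is a minimal Python sketch (the particular choice $\,f(x,n)=\sqrt{2+x}\,$, which happens not to depend on $\,n\,$, and all names in the code are mine, chosen only for illustration). It iterates the recursion and checks that $\,f(x_n,n)-x_n\,$ keeps one sign, which is exactly what forces monotonicity.

```python
import math

def f(x, n):
    # illustrative choice; here f does not actually depend on n
    return math.sqrt(2.0 + x)

x = 0.0                       # x_0 = 0
diffs = []
for n in range(20):
    nxt = f(x, n)
    diffs.append(nxt - x)     # x_{n+1} - x_n = f(x_n, n) - x_n
    x = nxt

print("all differences >= 0:", all(d >= 0 for d in diffs))  # increasing sequence
print("x_20 ≈", round(x, 10))                               # close to the fixed point 2
```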
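And a similar sketch for the last bullet, with an illustrative decreasing $\,g\,$ of my own choosing, $\,g(x)=1/(1+x)\,$: consecutive differences alternate in sign, and the even- and odd-indexed subsequences are monotone in opposite directions, both approaching the fixed point $\,(\sqrt5-1)/2\,$.

```python
def g(x):
    # decreasing on x > 0, so consecutive differences alternate in sign
    return 1.0 / (1.0 + x)

xs = [1.0]                                   # x_0 = 1
for _ in range(25):
    xs.append(g(xs[-1]))

diffs = [b - a for a, b in zip(xs, xs[1:])]
print("differences alternate in sign:",
      all(d1 * d2 < 0 for d1, d2 in zip(diffs, diffs[1:])))
even, odd = xs[0::2], xs[1::2]
print("even subsequence decreasing:", all(a >= b for a, b in zip(even, even[1:])))
print("odd subsequence increasing:",  all(a <= b for a, b in zip(odd, odd[1:])))
print("x_25 ≈", round(xs[-1], 6))            # ≈ (sqrt(5) - 1) / 2 ≈ 0.618034
```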