How fast does the sequence $y_t$ defined by $y_{t+1}=y_t(1-y_t)$ decay to zero?

Consider the recurrence $\displaystyle x_{n+1} = x_n(1-x_n)$ (apologies for the change of notation).

If $\displaystyle y_n = \frac{1}{x_n}$ then we see that

$$y_{n+1} - y_n = \frac{1}{x_n(1-x_n)} - \frac{1}{x_n} = \frac{1}{1-x_n}$$

and thus

$$ y_n = y_0 + \sum_{k=0}^{n-1} \frac{1}{1-x_k} $$

Since $\displaystyle 1 - x_n \to 1$ we have that $\displaystyle \frac{1}{1-x_n} \to 1$, and thus, since the Cesàro means of a convergent sequence converge to the same limit, $\displaystyle \frac{1}{n} \sum_{k=0}^{n-1} \frac{1}{1-x_k} \to 1$

And thus $\displaystyle \frac{y_n}{n} \to 1$ i.e. $\displaystyle n x_n \to 1$.
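As a quick sanity check (not part of the proof), one can iterate the map numerically and watch $n x_n$ approach $1$; the helper name below is my own:

```python
def iterate(x0, n):
    """Return x_n after n steps of x -> x * (1 - x), starting from x0."""
    x = x0
    for _ in range(n):
        x = x * (1 - x)
    return x

# n * x_n should approach 1 as n grows (slowly, since y_n = n + log n + O(1)).
for n in (100, 10_000, 1_000_000):
    print(n, n * iterate(0.5, n))
```

The convergence is slow because of the $\log n$ correction discussed below, so large $n$ is needed before $n x_n$ gets close to $1$.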

A similar technique was used in the excellent answer by David Speyer here: Convergence of $\sqrt{n}x_{n}$ where $x_{n+1} = \sin(x_{n})$

Note that we get your sequence by considering the sequence $\displaystyle x_{n+1} = f(x_n)$ with $\displaystyle f(x) = x - x^2$ and applying the technique from that answer here.

See also: Limit of the sequence $nx_{n}$ where $x_{n+1} = \log (1 +x_{n})$

PS: One can get more accurate estimates by using the above ($\displaystyle x_n \sim \frac{1}{n}$) and passing it again through

$\displaystyle y_n = C + \sum_{k=0}^{n-1} \frac{1}{1-x_k} = C + n + \sum_{k=0}^{n-1} x_k + \sum_{k=0}^{n-1} \mathcal{O}(x_k^2),$

using $\displaystyle \frac{1}{1-x} = 1 + x + \mathcal{O}(x^2)$.

I believe by doing this we get the estimate

$$y_n = n + \log n + \mathcal{O}(1)$$
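A quick numerical experiment (my own sketch, not part of the argument) supports this estimate: the quantity $y_n - n - \log n$ appears to stay bounded, indeed to converge, as $n$ grows.

```python
import math

def y_remainder(x0, n):
    """Compute y_n - n - log(n), where y_n = 1/x_n and x_{k+1} = x_k * (1 - x_k)."""
    x = x0
    for _ in range(n):
        x = x * (1 - x)
    return 1 / x - n - math.log(n)

# If y_n = n + log n + O(1), these values should stay bounded.
for n in (100, 10_000, 1_000_000):
    print(n, y_remainder(0.5, n))
```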


This is nothing but a repetition of the former two answers, but it may summarize some ideas that are used throughout.

One of the critical observations we need is the Stolz-Cesàro theorem, which is stated as follows:

Theorem (Stolz-Cesàro) If $(a_n)$ and $(b_n)$ are sequences of real numbers such that $b_n \nearrow +\infty$ and $$ \lim_{n\to\infty} \frac{\Delta a_n}{\Delta b_n} = \ell,$$ where $\Delta a_n := a_{n+1} - a_n$ denotes the forward difference, then $$\lim_{n\to\infty} \frac{a_n}{b_n} = \ell .$$

Now, suppose there are $a > 0$ and $r > 1$ such that $f(x) = x - (a + o(1))x^{r}$ for $0 \leq x \ll 1$. Then for a suitably small positive initial condition $x_0$, the sequence $(x_n)$ defined by the recursive formula $x_{n+1} = f(x_n)$ is positive and decreasing. It therefore converges to some nonnegative limit $x$, which must satisfy $x = x - a x^{r}$ and hence equals zero.

Then a simple estimation (much simpler for $r = 2$, as in our case) shows that $$\begin{eqnarray*}\frac{1}{x_{n+1}^{r-1}} - \frac{1}{x_{n}^{r-1}} & = & \frac{1 - \left( x_{n+1}/x_{n} \right)^{r-1}}{x_{n+1}^{r-1}} \\ & = & \frac{1 - \left( 1 - (a + o(1)) x_{n}^{r-1} \right)^{r-1}}{x_{n}^{r-1} \left( 1 - (a + o(1)) x_{n}^{r-1} \right)^{r-1}} \\ & = & \frac{(r-1)(a + o(1)) x_{n}^{r-1} + O(x_{n}^{2(r-1)})}{x_{n}^{r-1}(1 + o(1))} \\ & = & (r-1) a + o(1). \end{eqnarray*}$$ Thus by the Stolz-Cesàro theorem, we have $$ \lim_{n\to\infty} n x_{n}^{r-1} = \frac{1}{(r-1)a}.$$ In our case $f(x) = x - x^2$, so $r = 2$ and $a = 1$, giving $n x_n \to 1$.
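To illustrate the general statement in a case other than $r = 2$, here is a small numerical sketch (function names are my own) using $f(x) = \sin x = x - \frac{x^3}{6} + \cdots$, i.e. $r = 3$ and $a = \frac{1}{6}$, so the theorem predicts $n x_n^2 \to \frac{1}{(r-1)a} = 3$, matching the David Speyer answer linked above.

```python
import math

def iterate_sin(x0, n):
    """Return x_n after n steps of x -> sin(x), starting from x0."""
    x = x0
    for _ in range(n):
        x = math.sin(x)
    return x

# Prediction from the theorem: n * x_n**2 -> 1 / ((r - 1) * a) = 3.
for n in (100, 10_000, 1_000_000):
    print(n, n * iterate_sin(1.0, n) ** 2)
```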


For every $y_0$ in $(0,1)$, $ty_t\to1$.

Here is an elementary proof. First, consider instead the sequences $(y^{(a)}_{t})$ defined by $$ y^{(a)}_{t+1}=\frac{y^{(a)}_{t}}{1+ay^{(a)}_{t}} $$ for a given positive $a$. Then $$ y^{(a)}_{t}=\frac{y^{(a)}_{0}}{1+tay^{(a)}_{0}}, $$ hence $y^{(a)}_t\to0$ and $\lim ty^{(a)}_t=1/a$.

Now, $y(1-y)\le y/(1+y)$ for every nonnegative $y$ hence, if $y^{(1)}_0=y_0$ and $y_0$ is in $(0,1)$, then $y^{(1)}_{t}\ge y_t$ for every $t$. This proves that $y_t\le y_{0}/(1+ty_{0})$, hence $\limsup ty_t\le1$.

In particular, $y_t\to0$ hence, for every positive $x<1$, one can choose an index $t_x$ such that $y_{t_x}\le x$. Let $z_x=y_{t_x}$ and $a_x=1/(1-x)$. Then $y(1-y)\ge y/(1+a_xy)$ for every $y$ in $(0,x)$, hence the iterations $y_t\to y_{t+1}$ for $t\ge t_x$ are such that $$ y_t\ge \frac{z_x}{1+(t-t_x)a_xz_x}. $$ This implies that $\liminf ty_t\ge1/a_x=1-x$ for every positive $x<1$, hence $\liminf ty_t\ge1$ and the result stated above holds for every $y_0$ in $(0,1)$.
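For what it's worth, the two bounds in this argument are easy to check numerically on a sample orbit. The script below (my own sketch, with hypothetical helper names) verifies the upper bound $y_t\le y_0/(1+ty_0)$ for every $t$, and the lower bound for $t\ge t_x$ with the sample choice $x=0.1$:

```python
def orbit(y0, n):
    """Return [y_0, ..., y_n] for y_{t+1} = y_t * (1 - y_t)."""
    ys = [y0]
    for _ in range(n):
        ys.append(ys[-1] * (1 - ys[-1]))
    return ys

y0 = 0.3
ys = orbit(y0, 1000)

# Upper bound: y_t <= y_0 / (1 + t * y_0) for every t (tiny float slack).
assert all(ys[t] <= y0 / (1 + t * y0) + 1e-15 for t in range(len(ys)))

# Lower bound: pick x in (0, 1), find t_x with y_{t_x} <= x, set a_x = 1/(1-x).
x = 0.1
t_x = next(t for t, y in enumerate(ys) if y <= x)
z_x, a_x = ys[t_x], 1 / (1 - x)
assert all(ys[t] >= z_x / (1 + (t - t_x) * a_x * z_x) - 1e-15
           for t in range(t_x, len(ys)))
print("both bounds hold up to t = 1000")
```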


I think it is clear that the slowest way this sequence can decay is by taking $y_0 = \frac{1}{2}$. This is because $x - x^2$ takes its maximum value $\frac{1}{4}$ on $(0,1)$ at $x = \frac{1}{2}$, and $x - x^2$ is increasing on $(0,\frac{1}{2})$. Hence, whatever value of $y_0$ you choose, you have $y_1 \leq \frac{1}{4}$, and then $y_{2} \leq \frac{3}{16}$, $y_{3} \leq \frac{39}{256}$, etc.; by choosing $y_0 = \frac{1}{2}$, you get the unique slowest-decreasing sequence.

It is worth noting that $(y_t)$ is the sequence obtained by applying Newton's method to solve $f(y) = 0$, where $f(y) = e^{-1/y}$ (to extend $f$ to a continuous function on $[0,1]$, we have to define $f(0) = 0$, and there is clearly no other value of $y \in [0,1]$ for which $f(y) = 0$).
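This Newton's method claim can be verified directly: since $f'(y) = e^{-1/y}/y^2$, the Newton step $y - f(y)/f'(y)$ simplifies to $y - y^2$. A quick numerical check (my own sketch):

```python
import math

def newton_step(y):
    """One Newton step for f(y) = exp(-1/y): y - f(y) / f'(y)."""
    f = math.exp(-1.0 / y)
    fprime = math.exp(-1.0 / y) / y ** 2   # f'(y) = exp(-1/y) / y^2
    return y - f / fprime

# The Newton step should agree with y -> y - y^2 up to rounding.
for y in (0.5, 0.25, 0.1):
    print(y, newton_step(y), y - y ** 2)
```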