When does $\sqrt{x+\sqrt{x+1+\sqrt{x+2+...}}}=0$?
To address how I performed the computation: it was done in Mathematica using the following code:
F[x_, n_] := Fold[Sqrt[x + #1 + #2] &, 0, Reverse[Range[n + 1] - 1]]
FindRoot[F[x, 500] == 0, {x, -1.2}, WorkingPrecision -> 50]
The first command defines $f_n(x)$; the second chooses $n = 500$, which, thanks to the extremely rapid convergence of $\{f_n\}_{n \ge 1}$, is more than sufficient to reach the desired precision in a short computation time. You can check that the result is accurate by rerunning with $n = 100$ and seeing that the result is unchanged to $50$ digits of precision; indeed, even to $100$ digits of precision. I would bound the error of using $f_n$ in place of $f$ by $10^{-n}$.
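For readers without Mathematica, the same computation can be sketched in Python (a hedged translation, not the original code; the truncation depth $n=60$ and the bracket $[-1.3,-1.1]$ are assumptions that happen to work). Since $f(x)^2 = x + f(x+1)$, the root of $f$ is also the root of the increasing function $h(x) = x + f(x+1)$, which remains real-valued slightly below the root, where $f$ itself is undefined, so plain bisection on $h$ is safe:

```python
import math

def f(x, n):
    """Truncated radical f_n(x) = sqrt(x + sqrt(x+1 + ... + sqrt(x+n))),
    evaluated from the innermost term outward."""
    acc = 0.0
    for k in range(n, -1, -1):
        acc = math.sqrt(x + k + acc)
    return acc

def find_root(n=60, lo=-1.3, hi=-1.1, iters=100):
    """Bisection on h(x) = x + f(x+1, n), which is increasing in x and
    shares its root with f_n."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mid + f(mid + 1, n) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(find_root())  # roughly -1.21103728351247...
```

At double precision the result agrees with the high-precision Mathematica value in all printed digits.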
Not really a solution, but here's a different perspective that was too long for a comment:
We have $$ f(x) = \sqrt{x+f(x+1)} $$ or equivalently $$ f(x)^2 =x+f(x+1) $$ which gives us a pseudo-recursion for $f'(x)$: $$ f'(x) = \frac{1+f'(x+1)}{2f(x)} $$ Just formally, we get:\begin{eqnarray} f'(x) = \frac{1}{2f(x)} + \frac1{4f(x)f(x+1)} +\frac1{8f(x)f(x+1)f(x+2)}+\cdots = \sum_{n=0}^\infty \frac1{2^{n+1}}\prod_{k=0}^n\frac1{f(x+k)} \end{eqnarray} Does this series converge? It seems certain that it ought to, and indeed, since $f(x)\ge\sqrt{x}$, each summand is bounded above by $$ \frac1{2^{n+1}}\prod_{k=0}^n\frac1{\sqrt{x+k}} $$ which is clearly summable (and hence we also have uniform convergence on closed intervals when $x>0$, validating that this is indeed the derivative of $f$, not merely a formal identity).
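As a numerical sanity check (a rough sketch; the truncation depth of $f$ and the number of series terms are arbitrary assumptions), the partial sums of this series can be compared against a central finite difference:

```python
import math

def f(x, depth=60):
    """Truncated nested radical, evaluated innermost-first."""
    acc = 0.0
    for k in range(depth, -1, -1):
        acc = math.sqrt(x + k + acc)
    return acc

def fprime_series(x, terms=40):
    """Partial sum of f'(x) = sum_n 1 / (2^(n+1) * f(x) f(x+1) ... f(x+n))."""
    total, prod = 0.0, 1.0
    for n in range(terms):
        prod *= f(x + n)
        total += 1.0 / (2.0 ** (n + 1) * prod)
    return total

x, h = 2.0, 1e-5
print(fprime_series(x), (f(x + h) - f(x - h)) / (2 * h))
```

The two printed values agree to many digits, as the uniform-convergence argument predicts.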
Furthermore, for the $m$th derivative when $m>1$, the Leibniz rule gives $$ \sum_{j=0}^m\binom{m}{j} f^{(j)}(x)f^{(m-j)}(x) = f^{(m)}(x+1) $$ which can be solved for $f^{(m)}$ by isolating the $j=0$ and $j=m$ terms: $$ f^{(m)}(x) =\frac{f^{(m)}(x+1) - \sum_{j=1}^{m-1}\binom{m}{j} f^{(j)}(x) f^{(m-j)}(x)}{2f(x)} $$ and unrolling this similarly gives us (at least formally!)$$ f^{(m)}(x) = -\sum_{n=0}^\infty \frac{\sum_{j=1}^{m-1}\binom{m}{j} f^{(j)}(x+n) f^{(m-j)}(x+n)}{2^{n+1}\prod_{k=0}^n f(x+k)} $$ One should similarly be able to show uniform convergence of these series.
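For $m=2$ the inner sum reduces to $2f'(x)^2$, and the resulting series for $f''$ can be checked against a second central difference (again a rough sketch; truncation depths are arbitrary assumptions):

```python
import math

def f(x, depth=60):
    """Truncated nested radical, evaluated innermost-first."""
    acc = 0.0
    for k in range(depth, -1, -1):
        acc = math.sqrt(x + k + acc)
    return acc

def fp(x, terms=40):
    """f'(x) via the series above."""
    total, prod = 0.0, 1.0
    for n in range(terms):
        prod *= f(x + n)
        total += 1.0 / (2.0 ** (n + 1) * prod)
    return total

def fpp(x, terms=40):
    """f''(x) = -sum_n 2 f'(x+n)^2 / (2^(n+1) * prod_{k<=n} f(x+k))."""
    total, prod = 0.0, 1.0
    for n in range(terms):
        prod *= f(x + n)
        total -= 2.0 * fp(x + n) ** 2 / (2.0 ** (n + 1) * prod)
    return total

x, h = 2.0, 1e-4
print(fpp(x), (f(x + h) - 2 * f(x) + f(x - h)) / h**2)
```

Note that the series is negative, matching the visible concavity of $f$.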
Now, observe that the formula $$ f(x+1) = f(x)^2 - x $$ allows us to explicitly compute $f(x+n)$ in terms of $f(x)$ for all $n\in \mathbb{N}$. Say $f(x+n) = g_n(f(x),x)$. Then $f(x+n) = f(x+n-1)^2 - (x+n-1) = g_{n-1}(f(x),x)^2 - (x+n-1)$, so we have the following explicit recursion for $g_n$:$$ g_n(y,x) = \begin{cases}y & n=0\\ g_{n-1}(y,x)^2 - (x+n-1) & n>0 \end{cases} $$ Thus our derivative formula actually becomes a differential equation:$$ f'(x) = \sum_{n=0}^\infty \frac1{2^{n+1}}\prod_{k=0}^n\frac1{g_k(f(x),x)} $$ This is not extremely helpful, but it is mildly interesting! It will also imply that $f$ is the unique differentiable function satisfying $f(x)^2 = x+f(x+1)$ with its initial value $f(0)=\sqrt{1+\sqrt{2+\sqrt{3+\cdots}}}$.
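The recursion $f(x+n) = f(x+n-1)^2 - (x+n-1)$ is easy to check numerically; a small hedged Python sketch (the truncation depth of $f$ and the test point are arbitrary choices):

```python
import math

def f(x, depth=60):
    """Truncated nested radical f(x) ~ sqrt(x + sqrt(x+1 + ...))."""
    acc = 0.0
    for k in range(depth, -1, -1):
        acc = math.sqrt(x + k + acc)
    return acc

def g(n, y, x):
    """g_0(y,x) = y;  g_n(y,x) = g_{n-1}(y,x)^2 - (x+n-1) for n > 0."""
    for j in range(1, n + 1):
        y = y * y - (x + j - 1)
    return y

x = 1.5
for n in range(4):
    print(n, g(n, f(x), x), f(x + n))  # the two columns should agree
```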
Update: A sort-of solution. The root is $$ r = - \sqrt{1 + \sqrt{2 + \sqrt{3 + \sqrt{4 + \cdots} - \sqrt{ 1 + \cdots}} - \sqrt{1 + \sqrt{2 + \cdots} - \sqrt{1+\cdots}}} - \sqrt{1 + \sqrt{2 + \sqrt{3 + \cdots} - \sqrt{1 + \cdots}} - \sqrt{1 + \sqrt{2 + \cdots} - \sqrt{1+\cdots}}}} $$ In case it's not clear what the pattern is, any time you see $n+\cdots$ read it recursively as $$n+\cdots=n+\sqrt{n+1 + \cdots} - \sqrt{1+\cdots}$$
To see why this is a solution, note from the recursive reading: \begin{eqnarray} \sqrt{1+r + \sqrt{2+r + \cdots}} = \sqrt{1 - \sqrt{1 + \cdots} + \sqrt{2 +\cdots}} = -r \end{eqnarray} and hence $$ f(r) = \sqrt{r + \sqrt{1+r + \sqrt{2+r+\cdots}}} = \sqrt{r - r} = 0 $$ A recursive way to find this root is to observe that, using $f(x)^2 = x + f(x+1)$, one has $f(r+1) + r = 0$, or equivalently, $r+1 = 1-f(r+1)$. Thus $r+1$ is the unique fixed point of $x\to 1 - f(x)$. Hence, formally, $$ r+1 = 1-f(1-f(1-\cdots)) $$ In fact the fixed point is attracting; one can show inductively that $f_n'(1) < 1$ for all $n$, and hence $f'(1)<1$. Using the fact that $r<-1$ and that $f'$ is decreasing, one has $$ f'(r+1) = \frac{1+f'(r+2)}{2f(r+1)} = \frac{1+f'(r+2)}{-2r} \le \frac{1+f'(1)}{-2r} < \frac{1}{-r} < 1 $$ Thus the derivative of $1-f(x)$ at $x=r+1$ is less than $1$ in absolute value, so iterating it does converge to its fixed point. Defining a sequence $c_n$ recursively by $$ c_{n} = 1 - f_n(c_{n-1}) $$ with $c_0 = 1$, one obtains a sequence that converges to $r+1$. The first few terms:\begin{eqnarray} c_0 &=& 1\\ c_1 &=& 0 \\ c_2 &=& 1-\sqrt[4]{1+\sqrt2}\\ c_3 &=& 1-\sqrt{1-\sqrt[4]{1+\sqrt2}+\sqrt{2-\sqrt[4]{1+\sqrt2}+\sqrt{3-\sqrt[4]{1+\sqrt2}+\sqrt{4-\sqrt[4]{1+\sqrt2}}}}} \end{eqnarray} Clearly this gets unwieldy pretty fast. I programmed this recursion into Haskell and it seems to converge pretty quickly: after $c_{65}$, my computer shows the values oscillating between $-0.21103728351247164$ and $-0.21103728351247142$, so I guess the root is $$ r\approx -1.211037283512471\dots $$ which agrees with the numerical calculations from other users.
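The fixed-point iteration $x \mapsto 1 - f(x)$ is easy to reproduce; here is a hedged Python sketch (not the Haskell program, which was not posted; $f$ is truncated at an assumed fixed depth rather than a depth growing with the step):

```python
import math

def f(x, depth=60):
    """Truncated radical sqrt(x + sqrt(x+1 + ... + sqrt(x+depth)))."""
    acc = 0.0
    for k in range(depth, -1, -1):
        acc = math.sqrt(x + k + acc)
    return acc

# Iterate x -> 1 - f(x); the attracting fixed point is r + 1.
c = 1.0
for _ in range(100):
    c = 1.0 - f(c)
print(c - 1.0)  # r, roughly -1.21103728351247...
```

Since the contraction factor at the fixed point is bounded below $1$, a hundred iterations are more than enough to reach double-precision noise, matching the oscillation seen after $c_{65}$.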
This problem is equivalent to proving that there exists some real number $r$ such that \begin{equation} \lim_{n \to \infty} f_n(r) = 0 \end{equation} where $f_n(x) = g_n^0(x)$ and $g^k_n$ is defined by the recurrence \begin{equation} g_n^k(x) = \begin{cases} \sqrt {x+k} & \text{if }n=1 \\ \sqrt{x + k + g^{k+1}_{n-1}(x)} & \text{o.w.} \\ \end{cases}. \end{equation} Therefore, by definition, we have \begin{equation} f_n(x) = \begin{cases} \sqrt {x} & \text{if }n=1 \\ \sqrt{x + g^{1}_{n-1}(x)} & \text{o.w.} \\ \end{cases}. \end{equation} Proving the shift identity \begin{equation} g_n^{k+1}(x) = g_n^{k}(x+1) \end{equation} by a simple induction (the case $k=0$ gives $g_n^1(x) = g_n^0(x+1)$) then gives us that \begin{equation} f_n(x) = \begin{cases} \sqrt {x} & \text{if }n=1 \\ \sqrt{x + f_{n-1}(x+1)} & \text{o.w.} \\ \end{cases}. \end{equation}
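The recurrence, the shift identity, and the resulting recursion for $f_n$ can all be checked numerically; a small hedged Python sketch (the depth and test point are arbitrary choices):

```python
import math

def g(n, k, x):
    """g_n^k(x) = sqrt(x+k) for n = 1, else sqrt(x + k + g_{n-1}^{k+1}(x))."""
    if n == 1:
        return math.sqrt(x + k)
    return math.sqrt(x + k + g(n - 1, k + 1, x))

def f(n, x):
    """f_n(x) = g_n^0(x)."""
    return g(n, 0, x)

x, n = 0.7, 30
print(abs(g(n, 1, x) - g(n, 0, x + 1)))               # shift identity g_n^1(x) = g_n^0(x+1)
print(abs(f(n, x) - math.sqrt(x + f(n - 1, x + 1))))  # f_n(x) = sqrt(x + f_{n-1}(x+1))
```

Both printed differences vanish up to floating-point rounding.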
Let us suppose that $r_n + f_{n-1}(r_n+1) = \epsilon_n$ with $\epsilon_n \ge 0$ and that $\lim_{n \to \infty} \epsilon_n = 0 $; then, since $f_n(r_n)^2 = \epsilon_n$, we have that \begin{equation} \lim_{n \to \infty} f_{n}(r_n) = 0. \end{equation}
One possible route to find this is to consider the growth of the sequence \begin{equation} \delta_n(x) = f_n(x+1) - f_n(x). \end{equation}
Therefore, given a good approximation to $r_n$, an educated guess is that $r_{n+1} $ should be some \begin{equation} r_{n+1} \in [r_{n} - 1, r_{n}]. \end{equation} This is because \begin{equation} f_{n+1}^2(r_{n} - \epsilon) = r_{n} - \epsilon + f_n(r_{n} - \epsilon) +\delta_n(r_{n} - \epsilon) \end{equation} and \begin{equation} \delta_n(r_{n} - \epsilon) = f_n(r_{n} +1- \epsilon) - f_n(r_{n} - \epsilon) = f_n(r_{n} +(1-\epsilon)) - f_n(r_{n} - \epsilon) \end{equation} show that we should be looking for some $\epsilon$ with \begin{equation} -\delta_n(r_{n} - \epsilon) \approx r_{n} - \epsilon + f_n(r_{n} - \epsilon), \end{equation} which is equivalent to saying \begin{equation} -\delta_n(r_{n} - \epsilon) - f_n(r_{n} - \epsilon) \approx r_{n} - \epsilon, \end{equation} or \begin{equation} -f_n(r_{n} + (1- \epsilon)) \approx r_{n} - \epsilon, \end{equation} or \begin{equation} f_n(r_{n} + (1- \epsilon)) \approx \epsilon - r_n , \end{equation}
which is a "sort-of" fixed point equation. Now we are getting somewhere!
We now have a method to recursively approximate solutions from previous solutions; i.e. \begin{equation} f_n(r_{n} + (1- \epsilon)) = \epsilon -r_{n} \implies f_{n+1}(r_{n} - \epsilon) = 0 . \end{equation}
Let us compute $r_n$ for small values to verify the solution.
- $f_1(x) = \sqrt x $ is the base case and it has the obvious solution $r_1 = 0$.
- Since $f_1(0) = 0$ and $\sqrt{1- \frac{\sqrt5 -1}{2}} = \frac{\sqrt5 -1}{2} $ we have an exact solution in the form of $r_1 = 0 $, $\epsilon = \frac{\sqrt5 -1}{2} $, and $r_2 = r_1 - \epsilon $; i.e. $f_2 (r_1 - \epsilon) = f_2\left(\frac{ 1-\sqrt5 }{2}\right) = 0$.
- Let $\epsilon = \frac{1}{2} (3 - \sqrt{5})$; then one can easily check that both $f_2(r_2 +1 - \epsilon) = \epsilon - r_2 $ and $f_3(r_2 - \epsilon) =f_3(-1) = 0 $, so $r_3 = -1$.
- Let $\epsilon = 0.153761 $; then one can check numerically that both $f_3(r_3 + 1 - \epsilon) \approx \epsilon - r_3 $ and $f_4(r_3 - \epsilon) = f_4(-1.153761) \approx 0 . $
In conclusion, we have the following recursion relation for the roots
\begin{equation} r_{n} = \begin{cases} 0 & \text{if }n=1 \\ r_{n-1} - \epsilon & \text{if } f_{n-1}(r_{n-1} +1 - \epsilon) = \epsilon - r_{n-1} \\ \end{cases}; \end{equation}
however, it is possible that a recursive analysis of the two functions $f'_n(x)$ and $\delta_n(x)$ could give a better recursive bound $b_n < b_{n-1}<1 $ on the intervals
\begin{equation} r_{n+1} \in [r_n-b_n, r_n]. \end{equation}
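The recursion above can be carried out numerically; a hedged Python sketch (the bisection bracket $[0,1]$ for $\epsilon$ and the iteration counts are assumptions that happen to work here):

```python
import math

def f(n, x):
    """f_1(x) = sqrt(x);  f_n(x) = sqrt(x + f_{n-1}(x+1))."""
    acc = 0.0
    for k in range(n - 1, -1, -1):
        acc = math.sqrt(x + k + acc)
    return acc

def next_root(n, r):
    """Solve f_n(r + 1 - eps) = eps - r for eps by bisection
    (the left side decreases in eps, the right side increases),
    then return r_{n+1} = r - eps."""
    lo, hi = 0.0, 1.0
    for _ in range(80):
        eps = 0.5 * (lo + hi)
        if f(n, r + 1 - eps) > eps - r:
            lo = eps
        else:
            hi = eps
    return r - 0.5 * (lo + hi)

r = 0.0  # r_1
for n in range(1, 40):
    r = next_root(n, r)
print(r)  # approaches the common root, roughly -1.2110372835...
```

The first steps reproduce the exact values above: $\epsilon = \frac{\sqrt5-1}{2}$ at $n=1$, $\epsilon = \frac{3-\sqrt5}{2}$ and $r_3 = -1$ at $n=2$, and $\epsilon \approx 0.153761$ at $n=3$, before settling at the root found by the other answers.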