Closed form for the sequence defined by $a_0=1$ and $a_{n+1} = a_n + a_n^{-1}$
Today we had a math class where we had to show that $a_{100} > 14$ for
$$a_0 = 1;\qquad a_{n+1} = a_n + a_n^{-1}$$
Apart from this task, I asked myself: is there a closed form for this sequence? Since I couldn't find an answer on my own, can somebody tell me whether such a closed form exists, and if so, what it is?
I doubt there is a closed form, but the asymptotics are easy: since $$ a_{n+1}^2=a_n^2+2+\frac1{a_n^2}, $$ we get, for every $n\ge1$, $$ a_n^2=2n+1+\sum_{k=0}^{n-1}\frac1{a_k^2}.\qquad\qquad\qquad\qquad (*) $$
This shows that $a_n^2\ge2n+2$ for every $n\ge1$; for example, $a_{100}\ge\sqrt{202}>10\sqrt{2}>14$. In particular, $a_n\to+\infty$. Plugging this into $(*)$ yields $a_n^2=2n+1+o(n)$, hence $$ \sqrt{2n}\le a_n\le\sqrt{2n}+o(\sqrt{n}). $$
At this point, we know that $a_n^2\ge2n+2$ for every $n\ge1$. Using $(*)$ again, one sees that, for every $n\ge1$, $$ a_n^2\le2n+2+\sum_{k=1}^{n-1}\frac1{2+2k}\le2n+2+\frac12\log(n), $$ which already shows that $$14.2<a_{100}<14.3.$$
Plugging this upper bound of $a_n^2$ into $(*)$ would yield a refined lower bound of $a_n^2$, and one could then plug this refined lower bound into $(*)$ again to get a refined upper bound, and so on, back and forth between better and better upper bounds and better and better lower bounds. (No more asymptotics here.)
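(As a quick numerical sanity check of these bounds, not part of the original answer: a minimal Python sketch that iterates the recurrence and compares $a_{100}$ with the lower bound $\sqrt{202}$ and the upper bound $\sqrt{202+\tfrac12\log 100}$ derived above.)

```python
import math

# Iterate a_{n+1} = a_n + 1/a_n starting from a_0 = 1.
a = 1.0
for _ in range(100):
    a += 1.0 / a  # after the loop, a equals a_100

lower = math.sqrt(2 * 100 + 2)                        # sqrt(202)  ~ 14.2127
upper = math.sqrt(2 * 100 + 2 + 0.5 * math.log(100))  # ~ 14.2935
print(lower, a, upper)
assert 14 < lower <= a <= upper
```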
I agree, a closed form is very unlikely. As for more precise asymptotics, I think $$a_n = \sqrt{2n} + \frac{\sqrt{2}\,\ln n}{8\sqrt{n}} - \frac{\sqrt{2}\,\left(\ln n - 2\right)^2 + o(1)}{128\, n^{3/2}}.$$
To elaborate on how I got my answer: I started with @Did's $a(n) \approx \sqrt{2n}$ and looked for a next term. $a(n) = \sqrt{2n}$ would make $ a(n+1) - (a(n) + a(n)^{-1}) = \sqrt{2n+2}-\sqrt{2n}-\frac{\sqrt{2}}{2\sqrt{n}} = - \frac{\sqrt{2}}{8} n^{-3/2} + O(n^{-5/2})$. With $a(n) = \sqrt{2n} + c\, n^{-1/2}$ I don't get a change in the $n^{-3/2}$ term, so I tried $a(n) = \sqrt{2n} + c \ln(n)\, n^{-1/2}$ and got $a(n+1) - (a(n) + a(n)^{-1}) = \left(c - \frac{\sqrt{2}}{8}\right) n^{-3/2} + \ldots$. So to get rid of the $n^{-3/2}$ term I want $c = \frac{\sqrt{2}}{8}$. Then look at the leading term for $a(n) = \sqrt{2n} + \frac{\sqrt{2}}{8} \ln(n)\, n^{-1/2}$ and continue in that vein...
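(As a hedged numerical cross-check, my addition rather than part of the answer: for the ansatz $f(n) = \sqrt{2n} + c\ln(n)\, n^{-1/2}$, the residual $f(n+1) - f(n) - 1/f(n)$, multiplied by $n^{3/2}$, should approach $c - \frac{\sqrt{2}}{8}$, which is exactly what the choice $c = \frac{\sqrt{2}}{8}$ kills.)

```python
import math

def f(n, c):
    """Ansatz f(n) = sqrt(2n) + c*log(n)/sqrt(n)."""
    return math.sqrt(2 * n) + c * math.log(n) / math.sqrt(n)

n = 10**5
for c in (0.0, math.sqrt(2) / 8):
    residual = f(n + 1, c) - f(n, c) - 1.0 / f(n, c)
    # n^{3/2} * residual should be close to c - sqrt(2)/8 for large n.
    print(c, n**1.5 * residual, c - math.sqrt(2) / 8)
```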
From my answer here: Given $a_{1}=1, \ a_{n+1}=a_{n}+\frac{1}{a_{n}}$, find $\lim \limits_{n\to\infty}\frac{a_{n}}{n}$
Reposting here, as it is kind of lost in that thread and this thread is more suitable for it.
Note: I have no clue if a closed form exists, but here is an asymptotic estimate...
I think we can show that $$\displaystyle a_{n}^2 \sim 2n + \dfrac{\log n}{2} - C$$ for some constant $\displaystyle C$
By $\displaystyle x_n \sim y_n$ I mean $\displaystyle \lim_{n \to \infty} (x_n - y_n) = 0$
Consider $b_n = a_{n}^2 - 2n$
Then we have that $\displaystyle b_{n+1} = b_n + \dfrac{1}{b_n + 2n}$
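(Spelling out the algebra, using $a_{n+1}^2 = a_n^2 + 2 + \dfrac{1}{a_n^2}$ from the answer above: $\displaystyle b_{n+1} = a_{n+1}^2 - 2(n+1) = \left(a_n^2 + 2 + \dfrac{1}{a_n^2}\right) - 2n - 2 = b_n + \dfrac{1}{a_n^2} = b_n + \dfrac{1}{b_n + 2n}$.)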
Notice that $b_0 \gt 0$ and thus $\displaystyle b_n \gt 0$.
(Note that the other thread linked above starts with $a_1 = 1$ and not $a_0 = 1$.)
We can easily show that $b_n \lt 2 + \log n$, as
$b_{n+1} - b_n = \dfrac{1}{b_n + 2n} \lt \dfrac{1}{2n}$
Adding up gives us the easy upper bound (spelled out below). Note that, even though we can give tighter bounds, this is sufficient for our purposes.
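(In more detail, a spelled-out version of the summation, valid for $n \ge 2$: since $b_1 = b_0 + \dfrac{1}{b_0 + 0} = 2$, we get $\displaystyle b_n = b_1 + \sum_{k=1}^{n-1} \left(b_{k+1} - b_k\right) \lt 2 + \sum_{k=1}^{n-1} \dfrac{1}{2k} \le 2 + \dfrac{1 + \log(n-1)}{2} \le 2 + \log n$.)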
Now, for sufficiently large $m \ge n$, we have that
$\displaystyle b_{m+1} - b_n = \sum_{k=n}^{m} \dfrac{1}{b_k + 2k}$
and hence
$\displaystyle \sum_{k=n}^{m} \dfrac{1}{2k} \gt b_{m+1} - b_n \gt \sum_{k=n}^{m} \dfrac{1}{2k}\left(1- \dfrac{b_k}{2k}\right)$
(here we used $\displaystyle \dfrac{1}{1+x} \gt 1-x$ for $0 \lt x \lt 1$).
Now, since $b_k \lt 2 + \log k$, we have that
$\displaystyle \sum_{k=n}^{m} \dfrac{1}{2k} \gt b_{m+1} - b_n \gt \sum_{k=n}^{m} \dfrac{1}{2k} - \sum_{k=n}^{m} \dfrac{2 + \log k }{4k^2}$
Using the fact that $\displaystyle H_m - H_n = \log\left(\dfrac{m+1}{n}\right) + O\left(\dfrac{1}{n}\right) + O\left(\dfrac{1}{n} - \dfrac{1}{m}\right)$, where $\displaystyle H_n = \sum_{k=1}^{n} \dfrac{1}{k}$ is the $\displaystyle n^{th}$ harmonic number, we see that if $c_n = b_n - \dfrac{\log n}{2}$, then
$\displaystyle O(\dfrac{1}{n} -\dfrac{1}{m}) + O(\dfrac{1}{n}) \gt c_{m+1} - c_n \gt O(\dfrac{1}{n} -\dfrac{1}{m}) + O(\dfrac{1}{n}) -\sum_{k=n}^{m} \dfrac{2 + \log k }{4k^2}$
Now $\displaystyle \sum_{k=1}^{\infty} \dfrac{2 + \log k}{k^2}$ is convergent, so the tail $\displaystyle \sum_{k=n}^{m} \dfrac{2 + \log k}{4k^2}$ tends to $0$ as $n \to \infty$ (uniformly in $m$), as do the $O$-terms above; by the Cauchy convergence criterion, we then have that $\displaystyle c_n$ is convergent.
Thus the sequence $\displaystyle a_{n}^2 - 2n - \dfrac{\log n}{2}$ converges and hence, for some $\displaystyle C$ we have that
$$\displaystyle a_{n}^2 \sim 2n + \dfrac{\log n}{2} - C$$
or in other words
$$\displaystyle a_{n} \sim \sqrt{2n + \dfrac{\log n}{2} - C}$$
A quick (possibly incorrect) computer simulation seems to show a very slow convergence to $\displaystyle C = 1.47812676429749\dots$
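(For anyone who wants to reproduce such an estimate, here is a minimal Python sketch, my addition rather than the original simulation. It tracks $c_n = a_n^2 - 2n - \frac{\log n}{2}$ from the proof above, which is $-C$ in the notation of this answer; the convergence is indeed very slow.)

```python
import math

# Track a_n^2 via a_{n+1}^2 = a_n^2 + 2 + 1/a_n^2, starting from a_0^2 = 1,
# and print c_n = a_n^2 - 2n - log(n)/2 at a few checkpoints.
a2 = 1.0
n = 0
for target in (10**3, 10**4, 10**5, 10**6):
    while n < target:
        a2 += 2.0 + 1.0 / a2
        n += 1
    c_n = a2 - 2 * n - 0.5 * math.log(n)
    print(n, c_n)  # rough estimate only: float rounding accumulates over many steps
```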
Note: Didier suggested an alternate proof in the comments below, which might be simpler.
Let us consider the functional formulation. Given $y(0)=1$, $y' = \frac{1}{y}$ yields $ y(x) = \sqrt{2x+1}$.
I am not saying $a(n)=y(n)$. Yet there is a link between the two approaches (finite difference).
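(To illustrate the link numerically, a small sketch of my own, not part of the comment above: the ODE solution $\sqrt{2x+1}$ tracks $a_n$ quite closely, the gap being governed by the $\frac{\log n}{2}$ correction discussed in the other answers.)

```python
import math

# Compare the recurrence a_{n+1} = a_n + 1/a_n (a_0 = 1) with the ODE solution
# y(x) = sqrt(2x + 1), which satisfies y(0) = 1 and y' = 1/y.
a = 1.0
for n in range(1, 101):
    a += 1.0 / a  # a now equals a_n
    if n in (10, 100):
        print(n, a, math.sqrt(2 * n + 1), a - math.sqrt(2 * n + 1))
```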