Solution 1:

This is not an answer.

I am skeptical about a possible closed form for this summation, which converges extremely fast. Looking at the ratio of successive terms for $4\leq n \leq 200$, a quick and dirty nonlinear regression (with $R^2 >0.999999$) gives $$\log \left(\frac{a_{n+1}}{a_n}\right) \sim \alpha - \beta\,n^\gamma$$ $$\begin{array}{cccc} & \text{Estimate} & \text{Standard Error} & \text{Confidence Interval} \\ \alpha & 12.6408 & 0.1425 & \{12.3598,\,12.9218\} \\ \beta & 14.3159 & 0.1265 & \{14.0663,\,14.5655\} \\ \gamma & 0.14844 & 0.0008 & \{0.14682,\,0.15007\} \\ \end{array}$$
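For reference, a minimal sketch of that fitting step with `scipy.optimize.curve_fit` is shown below. The synthetic log-ratios only stand in for $\log(a_{n+1}/a_n)$ computed from the actual terms (which come from the question and are not reproduced here); since $a_{100}\approx 7.87\times 10^{-522}$ already underflows double precision, the real log-ratios should be evaluated with arbitrary precision (e.g. mpmath) before fitting.

```python
import numpy as np
from scipy.optimize import curve_fit

# Model for the log-ratio of successive terms: alpha - beta * n**gamma
def model(n, alpha, beta, gamma):
    return alpha - beta * n**gamma

ns = np.arange(4, 201, dtype=float)

# Synthetic log-ratios built from the reported estimates; replace this with
# log(a_{n+1}/a_n) evaluated from the actual series terms.
rng = np.random.default_rng(0)
y = model(ns, 12.6408, 14.3159, 0.14844) + rng.normal(0.0, 1e-3, ns.size)

popt, pcov = curve_fit(model, ns, y, p0=(12.0, 14.0, 0.15))
perr = np.sqrt(np.diag(pcov))        # standard errors from the covariance matrix
print("alpha, beta, gamma =", popt)
print("standard errors    =", perr)
```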

Notice that $a_{10}=1.61\times 10^{-21}$, $a_{100}=7.87\times 10^{-522}$ and $a_{1000}=3.14\times 10^{-9505}$, so we do not need many terms for a more than decent approximation. Adding the first hundred terms gives $$R=1.08171353471975721634480208063188816421005944622618865601394019148\cdots$$ which is not recognized by inverse symbolic calculators. However, with an error of $1.35\times 10^{-14}$, $R$ is close to the positive root of the quadratic $7678 x^2-992 x-7911=0$.
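As a quick sanity check (a minimal sketch assuming only mpmath is available), one can compare the quoted value of $R$ with the positive root of that quadratic:

```python
from mpmath import mp, mpf, sqrt

mp.dps = 80  # carry ~80 significant digits

# R truncated to the digits quoted above (sum of the first hundred terms)
R = mpf("1.08171353471975721634480208063188816421005944622618865601394019148")

# Positive root of 7678 x^2 - 992 x - 7911 = 0 via the quadratic formula
a, b, c = mpf(7678), mpf(-992), mpf(-7911)
root = (-b + sqrt(b*b - 4*a*c)) / (2*a)

print(root)
print(abs(R - root))  # should agree with the quoted error of about 1.35e-14
```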