How to derive a function to approximate $\sqrt{3}$?
Solution 1:
The iteration $x\leftarrow \tfrac{1}{2}(x+3/x)$ converges so rapidly to $\sqrt{3}$ because it is derived from Newton's method. Newton's method says that to find a root of a function $f(x)$, we pick a starting point and repeat the iteration
$$x \leftarrow x - \frac{f(x)}{f'(x)}$$
until we have achieved our desired accuracy. As long as certain conditions on $f$ are satisfied (its derivative doesn't vanish, for example) and our initial guess is close enough to the root, this will always converge to the correct answer, and moreover it exhibits quadratic convergence, which means that you approximately double the number of decimal places of accuracy after each step of the iteration.
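As a concrete illustration (a minimal Python sketch, not any particular library's implementation), the generic Newton iteration looks like this:

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Repeat x <- x - f(x)/f'(x) until the step size drops below tol."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Root of f(x) = x^2 - 3, i.e. sqrt(3), starting from x0 = 2
root = newton(lambda x: x * x - 3, lambda x: 2 * x, 2.0)
```

The function and starting point here are just the example from this question; any sufficiently nice $f$ and close-enough $x_0$ will do.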
To approximate the square root of a number $c$, we need a function whose root is the square root of $c$. A simple choice is
$$f(x) = x^2 - c$$
The Newton iteration which finds the root of this equation is
$$x \leftarrow x - \frac{x^2 - c}{2x}$$
which you can rearrange to give
$$x \leftarrow \frac{1}{2} \left( x + \frac{c}{x}\right)$$
which, if you take $c=3$, is exactly the iteration you asked about in your question.
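A quick numerical check (a Python sketch) confirms that the iteration with $c=3$ homes in on $\sqrt 3$ within a handful of steps:

```python
import math

c = 3.0
x = 1.0  # any positive starting guess will do
for _ in range(6):
    x = 0.5 * (x + c / x)  # the iteration from the question

error = abs(x - math.sqrt(3))
```

After six steps the error is already below double-precision resolution.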
Solution 2:
Your suggestion to iterate the function $f$ defined by $f(x)=2x-\sqrt3$ to compute $\sqrt3$ is not used in practice, for at least two reasons. First, it presupposes that one can already compute $\sqrt3$, in which case the approximation procedure is useless. Second, $\sqrt3$ is not an attracting fixed point of $f$. This means that for every $x_0\ne\sqrt3$, the sequence $(x_n)$ defined recursively by $x_{n+1}=f(x_n)$ will not converge to $\sqrt3$; in the present case, $x_n\to+\infty$ or $x_n\to-\infty$ depending on whether $x_0>\sqrt3$ or $x_0<\sqrt3$.
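To see the repulsion concretely, here is a small Python sketch: since $x_{n+1}-\sqrt3=2(x_n-\sqrt3)$, even a starting point within $10^{-10}$ of $\sqrt3$ is driven away, the error doubling at every step:

```python
import math

root3 = math.sqrt(3)
x = root3 + 1e-10        # start a hair above the fixed point
for _ in range(40):
    x = 2 * x - root3    # x_{n+1} - sqrt(3) = 2 * (x_n - sqrt(3))

# after n steps the initial error has grown by a factor of 2**n
final_error = x - root3
```

After 40 steps the error of $10^{-10}$ has grown to roughly $2^{40}\cdot 10^{-10}\approx 110$.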
By contrast, using $f(x)=\frac12\left(x+\frac3x\right)$ and starting, for example, from a positive rational $x_0$ produces a sequence $(x_n)$ of rational numbers whose computation does not require knowing the value of $\sqrt3$ and which converges to $\sqrt3$. Moreover, $(x_n)$ converges to $\sqrt3$ extremely fast: after a while, one step of the algorithm roughly squares the error, so the rate of convergence is quadratic, in the sense that $x_{n+1}-\sqrt3$ is of the order of $(x_n-\sqrt3)^2$.
If you simulate the first 20 terms for $x_0=1$, for example, you will observe this spectacular speed of convergence: roughly speaking, each new iteration doubles the number of significant digits of the approximant.
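Here is that simulation as a Python sketch, using exact rational arithmetic so that no square roots are needed along the way; the first iterates are $1, 2, \tfrac74, \tfrac{97}{56}, \dots$:

```python
from fractions import Fraction

x = Fraction(1)
iterates = [x]
for _ in range(5):
    x = (x + Fraction(3) / x) / 2  # stays rational at every step
    iterates.append(x)

# the number of correct digits roughly doubles at each step
error = abs(float(x) - 3 ** 0.5)
```

After only five steps the rational iterate already agrees with $\sqrt3$ beyond double precision.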
Solution 3:
Is there a general approach to find a function to approximate a constant?
For the square root case, sure. Bahman Kalantari has found a "basic family" of iteration functions for finding $\sqrt c$. They take the form
$$x_{k+1}=x_k-(x_k^2-c)\frac{\Delta_{m-2}(x_k)}{\Delta_{m-1}(x_k)}$$
where
$$\Delta_m(x)=\begin{vmatrix}2x&1&&\\x^2-c&\ddots&\ddots&\\&\ddots&&1\\&&x^2-c&2x\end{vmatrix}$$
is an $m\times m$ tridiagonal Toeplitz determinant, and $\Delta_0(x)=1$. $m$ here is the order of convergence of the iteration formula. $m=2$ for instance is the usual Newton-Raphson iteration (quadratic convergence), and $m=3$ is the Halley iteration (cubic convergence).
(Allow me two asides. First, one can evaluate $\Delta_m$ through a three-term recursion relation. Second, through a special property of Toeplitz determinants, it turns out that $\Delta_m(x)$ admits a (not too practical in this case) closed form: $\Delta_m(x)=(x^2-c)^{m/2}U_m\left(\dfrac{x}{\sqrt{x^2-c}}\right)$, where $U_m(x)$ is the Chebyshev polynomial of the second kind.)
Here are the first few members of the "basic family":
$$\begin{array}{c|l}m&\\\hline 2&x-\frac{x^2-c}{2x}\\3&x-(x^2-c)\frac{2x}{3x^2+c}\\4&x-(x^2-c)\frac{(3x^2+c)}{4x(x^2+c)}\\5&x-(x^2-c)\frac{4x(x^2+c)}{5x^4+10cx^2+c^2}\end{array}$$
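As a sketch of how one might implement this family (in Python, for illustration), one can use the three-term recursion that the tridiagonal structure gives, namely $\Delta_m(x)=2x\,\Delta_{m-1}(x)-(x^2-c)\,\Delta_{m-2}(x)$ with $\Delta_0=1$, $\Delta_1=2x$:

```python
def delta(m, x, c):
    """Evaluate the tridiagonal Toeplitz determinant Delta_m via the
    three-term recursion Delta_m = 2x*Delta_{m-1} - (x^2 - c)*Delta_{m-2}."""
    if m == 0:
        return 1
    d_prev, d = 1, 2 * x
    for _ in range(m - 1):
        d_prev, d = d, 2 * x * d - (x * x - c) * d_prev
    return d

def basic_family_step(m, x, c):
    """One step of the order-m member of the basic family for sqrt(c)."""
    return x - (x * x - c) * delta(m - 2, x, c) / delta(m - 1, x, c)

# Halley's iteration (m = 3) for sqrt(10), starting from x0 = 5
x = 5.0
for _ in range(5):
    x = basic_family_step(3, x, 10.0)
```

For $m=3$ this reproduces the table's entry $x-(x^2-c)\frac{2x}{3x^2+c}$, and five steps from $x_0=5$ already pin down $\sqrt{10}$ to full double precision.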
See Kalantari's paper for the general formula when the function $x^2-c$ is replaced by some other function $f(x)$ whose root is sought.
This all looks rather theoretical and unwieldy, you might say, so I'll put in a few Mathematica demonstrations for the first few members of the "basic family".
Here's the definition for $\Delta_n(x)$ (here called d[n]):
d[0] = 1;
d[1] = 2 x;
d[n_Integer] := Det[SparseArray[{Band[{2, 1}] -> x^2 - c,
    Band[{1, 1}] -> 2 x, Band[{1, 2}] -> 1}, {n, n}]]
Generate the first few members of the "basic family":
Table[x - (x^2 - c) Simplify[d[n]/d[n + 1]], {n, 0, 4}]
{x - (-c + x^2)/(2*x), x - (2*x*(-c + x^2))/(c + 3*x^2),
x - ((-c + x^2)*(c + 3*x^2))/(4*c*x + 4*x^3),
x - (4*x*(-c + x^2)*(c + x^2))/(c^2 + 10*c*x^2 + 5*x^4),
x - ((-c + x^2)*(c^2 + 10*c*x^2 + 5*x^4))/(6*c^2*x + 20*c*x^3 + 6*x^5)}
Here's a demonstration of the Newton-Raphson iteration for approximating $\sqrt{10}$:
With[{c = 10}, FixedPointList[Function[x, x - (-c + x^2)/(2*x)],
N[5, 100]] - Sqrt[c]];
N[#2/#1 & @@@ Partition[Log[%], 2, 1], 20]
{-1.7838671038748854863, 3.7925879490759233441, 2.4492570735611599421,
2.1829174906843319475, 2.0837943631820556414, 2.0402123955504591656,
2.01970990649706825, Indeterminate, Indeterminate}
Note that successive ratios of the logarithms of the error approach the value $2$, demonstrating the quadratic convergence. (The Indeterminate results in the last two slots come up because the error was effectively zero at that point.)
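The same experiment can be reproduced in Python with the standard decimal module (a sketch, not the original Mathematica code): the ratios of successive log-errors again head toward $2$:

```python
from decimal import Decimal, getcontext

getcontext().prec = 200          # work with 200 significant digits
c = Decimal(10)
root = c.sqrt()

x = Decimal(5)
errors = []
for _ in range(8):
    errors.append(x - root)
    x = (x + c / x) / 2          # Newton-Raphson step for sqrt(c)

# ratios of successive log-errors approach 2 (quadratic convergence)
ratios = [float(errors[i + 1].ln() / errors[i].ln())
          for i in range(len(errors) - 1)]
```

The iterates stay above $\sqrt{10}$, so every error is positive and its logarithm is real, just as in the Mathematica run.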
For comparison, here is the iteration with quartic (fourth-order) convergence:
With[{c = 10}, FixedPointList[Function[x, x - ((-c + x^2)*
(c + 3*x^2))/(4*c*x + 4*x^3)], N[5, 1000]] - Sqrt[10]];
N[#2/#1 & @@@ Partition[Log[%], 2, 1], 20]
{-6.7654728809088590549, 5.3465261050589774854, 4.2513830895422052676,
4.0591297194912047384, 4.0145670928443785744, Indeterminate,
Indeterminate}
Here the convergence was a bit too fast, but the approach to the value $4$ can still be seen.
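The quartic run can be checked the same way in Python (again a sketch mirroring the Mathematica demonstration); the log-error ratios head toward $4$:

```python
from decimal import Decimal, getcontext

getcontext().prec = 400          # extra digits: the error shrinks very fast
c = Decimal(10)
root = c.sqrt()

x = Decimal(5)
errors = []
for _ in range(5):
    errors.append(x - root)
    # order-4 member of the basic family
    x = x - (x * x - c) * (3 * x * x + c) / (4 * x * (x * x + c))

ratios = [float(errors[i + 1].ln() / errors[i].ln())
          for i in range(len(errors) - 1)]
```

Only a few steps are possible before the error hits the working-precision floor, but the approach to $4$ is already visible in the last ratio.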
Solution 4:
As others have already said, your first equation is derived from Newton's method. The idea behind it is very simple, and it answers your last question: to approximate the location of a root of a function, we approximate the function locally by its tangent line, find the tangent's root instead, and take that value as a decent approximation for the location of the desired root. It is a good exercise to use this idea to derive the formula for Newton's method.
However, we can view the equation produced for square roots in a more elementary light. If $x>0$ is a decent approximation for $\sqrt{c}$, then $$ \frac{ x + c/x }{2} $$ will be even better. Why? If $x$ is an approximation for $\sqrt{c}$ that is slightly too small (respectively, too big), then $c/x$ is an approximation for $\sqrt{c}$ that is slightly too big (respectively, too small), and averaging them gets us somewhere closer in between. I still remember, from all the calculations I used to do with this, that $19601/13860$ is a good approximation for $\sqrt{2}$.
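The averaging argument is easy to check numerically (a Python sketch; the fraction below is just the approximant remembered in the answer):

```python
c = 2.0
x = 1.4                      # slightly too small, so c/x is slightly too big
better = (x + c / x) / 2     # the average lands much closer to sqrt(2)

# the remembered approximant for sqrt(2)
approx = 19601 / 13860
```

One averaging step shrinks the error by more than two orders of magnitude here, and the remembered fraction is accurate to about nine decimal places.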