What is the fastest growing total computable function you can describe in a few lines?

What is the fastest growing total computable function you can describe in a few lines? Well, not necessarily the fastest - I just would like to know how far an ingenious mathematician can go using only a few lines, and what systematic approaches exist for this purpose.

How much farther can you go if the restriction of computability is lifted?


Most fast-growing functions are constructed by a recursive scheme analogous to the well-known Ackermann function - namely one repeatedly iterates a function and then diagonalizes. Thus iterating addition yields multiplication, which iterated yields exponentiation, which iterated yields tetration, etc. Diagonalizing the resulting sequence of functions yields a faster-growing Ackermann function - which itself may be successively iterated, diagonalized, etc. Paul du Bois-Reymond discovered such diagonalization on growth rates of functions ("orders of infinity") in 1875, long before Cantor's better-known rediscovery (1891), which he employed to show $|2^S| > |S|$.
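To see the scheme in code, here is a minimal Python sketch (the names `iterate`, `level`, and `diagonal` are mine, purely for illustration): each level iterates the one below it, and the diagonal eventually outgrows every fixed level.

```python
# A minimal sketch of the iterate-then-diagonalize scheme described above.

def iterate(f):
    """Send f to the function n -> f applied n times to n."""
    def g(n):
        m = n
        for _ in range(n):
            m = f(m)
        return m
    return g

def level(k):
    """Level 0 is the successor; level k+1 iterates level k."""
    f = lambda n: n + 1
    for _ in range(k):
        f = iterate(f)
    return f

def diagonal(n):
    """Ackermann-style diagonal: outgrows every fixed level."""
    return level(n)(n)

print(level(1)(5))   # 10  (roughly doubling)
print(level(2)(5))   # 160 (roughly exponential: 2^5 * 5)
# diagonal(3) = level(3)(3) already has over a hundred million digits;
# diagonal(n) eventually outgrows level(k) for every fixed k.
```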

Proof-theorists use analogous recursive schemes to construct notation systems for ordinals - which they employ to measure the strength of logical systems. Functions and numbers notated by such means (whether finite or infinite) tremendously dwarf those that occur in almost all other branches of mathematics. Thus I highly recommend that you consult the literature on ordinal notation systems.

Below are some expository references on related topics (from an old sci.math post).

[Ruc] Rucker, Rudy. Infinity and the Mind, Princeton University Press, 1995, 342 pp.

[Spe] Spencer, Joel. Large numbers and unprovable theorems, Amer. Math. Monthly, Dec 1983, 669-675.

[Smo] Smorynski, Craig (all three papers are reprinted in [HFR]):
Some rapidly growing functions, Math. Intelligencer 2, 1980, 149-154.
The Varieties of Arboreal Experience, Math. Intelligencer 4, 1982, 182-188.
"Big" News from Archimedes to Friedman, Notices Amer. Math. Soc. 30, 1983, 251-256.

[Kol] Kolata, Gina. Does Gödel's theorem matter to mathematics? Science 218, 19 Nov 1982, 779-780; reprinted in [HFR].

[Sim] Simpson, Stephen G. Unprovable theorems and fast-growing functions, Contemp. Math. 65, 1987, 359-394.

[HFR] Harrington, L. A. et al. (editors), Harvey Friedman's Research on the Foundations of Mathematics, Elsevier, 1985.


For a nice fast-growing function, consider the following:

Given rooted trees $S$ and $T$ whose vertices are labeled from $\{1,2,\cdots,k\}$, define a gap embedding to be a label-preserving embedding $h$ from $S$ into $T$ with the following property: whenever $v$ is a child of $u$ in $S$, every vertex $w$ of $T$ lying strictly between $h(u)$ and $h(v)$ satisfies $l(w) \ge l(h(v))$, where $l(\cdot)$ denotes the label.

Given this definition, define $ETree(k)$ to be the length of the longest sequence $T_1, T_2, \ldots$ of rooted trees labeled from $\{1,2,\cdots,k\}$ such that $T_i$ has no more than $i$ vertices and for no $i < j$ is there a gap embedding from $T_i$ into $T_j$.

The above construction is due to Harvey Friedman. He showed that no such sequence can be infinite (i.e. that rooted labeled trees are well-quasi-ordered under gap embedding), and it follows from König's Lemma that there is a longest such sequence.
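To make the definition concrete, here is a hedged brute-force checker in Python. It takes "embedding" to mean a label- and inf-preserving injection (the notion in Friedman's theorem; some variants also impose the gap condition above the image of the root, which this sketch omits). Trees are nested `(label, (children...))` tuples, the function names are my own, and it is only feasible for very small trees.

```python
from itertools import permutations

# A rooted labeled tree is a pair (label, (child, child, ...)).

def embed_at(S, T):
    """Gap-embed S into T with root(S) mapped exactly to root(T).
    Distinct children of root(S) must land in distinct child subtrees
    of root(T); this is what inf-preservation forces."""
    (ls, cs), (lt, ct) = S, T
    if ls != lt or len(cs) > len(ct):
        return False
    return any(all(embed_below(c, U, c[0]) for c, U in zip(cs, perm))
               for perm in permutations(ct, len(cs)))

def embed_below(S, T, gap):
    """Embed S somewhere in T; every vertex of T strictly above the
    image of root(S) must carry a label >= gap (the gap condition)."""
    if embed_at(S, T):
        return True
    lt, ct = T
    return lt >= gap and any(embed_below(S, U, gap) for U in ct)

def embeds(S, T):
    """Is there a gap embedding of S into T (root image unrestricted)?"""
    def subtrees(V):
        yield V
        for U in V[1]:
            yield from subtrees(U)
    return any(embed_at(S, U) for U in subtrees(T))

S = (1, ((1, ()),))                        # an edge, both vertices labeled 1
print(embeds(S, (1, ((2, ((1, ()),)),))))  # True: the interposed label 2 >= 1
print(embeds(S, (1, ((0, ((1, ()),)),))))  # False: the interposed 0 breaks the gap condition
```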

The function $ETree(k)$ grows extremely fast. It grows at the rate of $F_{\psi_{\Omega_1} (\Omega_\omega)} (k)$. (Look up the fast-growing hierarchy.)

One can also get extremely fast growing functions using ordinal hierarchies. Let $I(a)$ be the $a$'th weakly inaccessible cardinal. Consider the following (here $a,b,c,d,e$ are ordinals, $\phi$ is the Veblen function, the $C_n(a,b)$ are sets of ordinals, and $\psi$ and $f$ are functions on ordinals):

$C_0(a,b) = b \cup \{0\}$

$C_{n+1}(a,b) = \{c+d,\ \phi(c,d),\ \aleph_c,\ I(c),\ \psi(e,d),\ f(e,c,d) \mid c,d,e \in C_n(a,b),\ e < a\}$

$C(a,b) = \bigcup_n C_n(a,b)$

$f(a,n,b) =$ largest finite ordinal in $C_n(a,b)$

$\psi(a,b) =$ the $b$'th ordinal $c$ such that $c \notin C(a,c)$

Then, for example, $f(I(I(I(I(0)))),x,x)$ is an extremely fast growing function, much faster than $ETree$. To help understand this construction, see the article "Ordinal collapsing function" on Wikipedia. If that is still confusing, look at

http://math.ucr.edu/home/baez/week236.html

for an introduction to ordinals, and

http://groups.google.com/group/sci.math/browse_thread/thread/b4b2769c6359b3e9/5317b10cbf60196

for a description of how ordinals can be used to define large numbers.

EDIT: Perhaps some more explanation is needed. $C(a,b)$ is the smallest set of ordinals containing all ordinals less than $b$ and $0$ and closed under the operations listed in the second line. The definition may appear circular, but it is well-defined by induction on $a$; given $f(c,n,b)$ and $\psi(c,b)$ for $c < a$, we define $C_n(a,b)$, and given $C_n(a,b)$, we define $f(a,n,b)$ and $\psi(a,b)$.

Some analysis of $f(a,n,b)$:

$f(0,0,b)$ is the largest finite ordinal in $C_0(0,b) = b \cup \{0\}$, which is $b-1$ for $b \ge 1$ (and $0$ for $b = 0$).

$f(0,1,b)$ is $2(b-1)$ if $b \ge 2$, otherwise it is $\phi(0,0) = 1$.

$f(0,n,b)$ is $2^n(b-1)$ if $b \ge 2$ (each step can at most double the largest finite ordinal, via $c+d$); otherwise it is $2^{n-1}$.

$f(1,0,b)$ is again $b-1$ if $b \ge 1$, and $0$ if $b = 0$.

$f(1,1,b)$ is $2^{b-1}(b-1)$ if $b \ge 2$; otherwise it is $1$.

$f(1,n,b) > \text{Tower}_n(b-1) > 2\uparrow\uparrow n$ for $b \ge 2$, where $\text{Tower}_n$ denotes an exponential tower of height $n$.

$f(2,n,b) > 2\uparrow\uparrow\uparrow n$ for $b \ge 2$

$f(m,n,b) > 2 \uparrow^{m+1} n$ for $b \ge 2$

$f(\omega,0,b) = b-1$ for $b \ge 2$

$f(\omega,1,b) = f(b-1,b-1,b-1) \ge 2 \uparrow^{b}(b-1)$ for $b \ge 2$

$f(\omega,2,b) = f(f(\omega,1,b),f(\omega,1,b),f(\omega,1,b)) \ge 2 \uparrow^{2 \uparrow^b (b-1)}(2 \uparrow^b (b-1))$

$f(\omega,n,b)$ is the result of applying the function $x \mapsto f(x,x,x)$ $n$ times starting from $b-1$, which is greater than $F_{\omega+1}(n)$, where $F$ is the fast-growing hierarchy.

$f(\omega+1,n,b)$ is the result of applying the function $x \mapsto f(\omega,x,x)$ $n$ times starting from $b-1$, which is greater than $F_{\omega+2}(n)$.

Each time we add $1$ to $a$, we pass to a function that iterates the previous one $n$ times; each time we increase $a$ to a limit ordinal, we diagonalize over the previous functions. So the task becomes defining very large countable ordinals.


I do not know whether it is the fastest, but historically the Ackermann function was, as far as I know, the first example:

$\qquad A(m, n) =\begin{cases}n+1 & \mbox{if } m = 0 \\A(m-1, 1) & \mbox{if } m > 0 \mbox{ and } n = 0 \\A(m-1, A(m, n-1)) & \mbox{if } m > 0 \mbox{ and } n > 0.\end{cases}$

It generalises the idea that $+$, $\cdot$, $\exp$ and so on form a natural series of operators; $A$ exceeds them all by controlling the stacking level through its first parameter.

It has been proven that $A$ (or rather $A(n,n)$) is not primitive recursive by showing that it grows faster than any primitive recursive function, including all exponential functions.
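For the curious, here is a direct Python transcription of the recurrence, with memoization bolted on as a convenience (it is the values, not the call depth, that explode):

```python
import sys
from functools import lru_cache

sys.setrecursionlimit(100_000)  # the recursion gets deep even for small inputs

@lru_cache(maxsize=None)
def A(m, n):
    if m == 0:
        return n + 1
    if n == 0:
        return A(m - 1, 1)
    return A(m - 1, A(m, n - 1))

print([A(m, 3) for m in range(4)])  # [4, 5, 9, 61]
# A(4, 2) = 2^65536 - 3 already has 19729 digits; A(4, 3) is hopeless.
```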


The fast-growing hierarchy (FGH) is a personal favorite of mine because it is simple compared to things involving trees or the Busy Beaver function, while still being able to outgrow, say, the Ackermann function.


First, a simple layman's explanation of what FAIL approximately does:

The lowest level is basic addition:

$Fn\#0=n+1=\text{addition}$

The next level is repeated application of the previous level:

$Fn\#1=\underbrace{F(\cdots F(F(Fn\#0)\#0)\cdots)\#0}_{n\text{ applications of }\#0}=n\underbrace{{}+1+1+\dots+1}_n=n+n=2n=\text{multiplication}$

$Fn\#2=\underbrace{F(\cdots F(F(Fn\#1)\#1)\cdots)\#1}_{n\text{ applications of }\#1}=\underbrace{2\cdot2\cdot\ldots\cdot2}_n\cdot n\approx2^n=\text{exponentiation}$

You can imagine this keeps going, and so I'm going to make a table of stuff:

$$\begin{array}{c|c}k&Fn\#k\approx\\\hline0&n+1\\1&2n\\2&2^n\\3&\underbrace{2^{2^{2^{\dots}}}}_n\\4&\text{Oh God help us}\\5&\text{can't really explain this well anymore}\\\vdots&\vdots\end{array}$$

Each level just applies the previous level $n$ times...

But... there is a function that grows faster than every function in the above list. How? Well, it diagonalizes along the list. This function is given by...

$Fn\#0\#1$

At $n=1$, it is equivalent to the $k=1$ level with $n=1$ inside. At $n=2$, it is equivalent to the $k=2$ level with $n=2$ inside.

Eventually, it will catch up to any function in the above list... and then go beyond it. This is the smallest non-primitive-recursive function in FAIL.

We can then nest the function over itself:

$Fn\#1\#1=\underbrace{F(\cdots F(F(Fn\#0\#1)\#0\#1)\cdots)\#0\#1}_{n\text{ applications}}$

$Fn\#2\#1=\underbrace{F(\cdots F(F(Fn\#1\#1)\#1\#1)\cdots)\#1\#1}_{n\text{ applications}}$

And eventually, we get a list. We then make a function that grows even faster by making it go from one function in the list to the next, so that it will always grow faster than any fixed level:

$Fn\#0\#2=Fn\#n\#1$

And then iterate over this function to get another list, then diagonalize over it, again and again until you come up with this list:

$$\begin{array}{c|c}k&Fn\#0\#k\approx\\\hline0&n+1\\1&Fn\#0\#1\\2&Fn\#0\#2\\3&Fn\#0\#3\\4&Fn\#0\#4\\5&Fn\#0\#5\\\vdots&\vdots\end{array}$$

Note I did not write most of these out explicitly because they are sooo large...

Anyways, we then diagonalize over this to end up with...

$Fn\#0\#0\#1$

And so the story continues. FAIL actually goes beyond this, extending the notation with @ symbols, and it gets pretty hairy, but to get you thinking, consider the following list:

$$\begin{array}{c|c}k&\text{result}\\\hline1&Fn\#1\\2&Fn\#0\#1\\3&Fn\#0\#0\#1\\\vdots&\vdots\end{array}$$

Have fun diagonalizing this, then iterating it over itself and diagonalizing some more...
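If you want to experiment, here is a hedged Python sketch of the # rules. A multi-index $Fn\#a_0\#a_1\#\cdots$ is stored as a tuple `(a0, a1, ...)`; the shift rule for a leading zero is my guess at the intended generalization of $Fn\#0\#2=Fn\#n\#1$, and the @ extension is not modeled.

```python
def F(n, idx):
    """Evaluate F n #a0 #a1 ... with idx = (a0, a1, ...)."""
    if not any(idx):                   # F n #0...#0 = n + 1
        return n + 1
    if idx[0] > 0:                     # iterate the previous level n times
        lower = (idx[0] - 1,) + idx[1:]
        m = n
        for _ in range(n):
            m = F(m, lower)
        return m
    j = next(i for i, a in enumerate(idx) if a)  # first nonzero slot
    new = list(idx)
    new[j] -= 1                        # diagonalize: F n #0..#0 #a_j ...
    new[j - 1] = n                     #            = F n #0..#n #(a_j - 1) ...
    return F(n, tuple(new))

print(F(3, (1,)))    # 6  = 2*3     ("multiplication")
print(F(3, (2,)))    # 24 = 2^3 * 3 ("exponentiation")
print(F(2, (0, 1)))  # 8: the diagonal F n #0 #1 catches the k=2 level at n=2
```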


The following, however, requires a small bit of comfort with ordinals, though you needn't have heard of the term before reading this.

FGH is defined by the simple recursive rules:

$f_0(n)=n+1$

$f_{\alpha+1}(n)=\underbrace{f_\alpha(f_\alpha(\dots f_\alpha(n)\dots))}_{n\text{ iterations of }f_\alpha}$

$f_\alpha(n)=f_{\alpha[n]}(n)$ for limit ordinals $\alpha$, where $\alpha[n]$ denotes the $n$'th member of a fundamental sequence assigned to $\alpha$

Let's start with some basic examples:

$f_0(5)=5+1=6$

Simple enough.

$f_1(5)=f_0(f_0(f_0(f_0(f_0(5)))))=f_0(f_0(f_0(f_0(6))))=f_0(f_0(f_0(7)))=\dots=10$

Still relatively simple...

$f_3(2)=f_2(f_2(2))=f_2(f_1(f_1(2)))=\dots=2^{11}=2048$

Ok, so if you fill in the $\dots$ part of the previous example, you'll start to notice that it takes quite a while to reach $f_0(n)$ to be able to apply rule one.

You will also notice that in terms of Knuth's up-arrow notation,

$f_k(n)\approx n\uparrow^{k-1}n$

Which so far, isn't too much. We then, however, reach the next stage, where rule 3 comes into play:

Let us define $\omega[n]=n$. It thus follows that

$f_\omega(2)=f_{\omega[2]}(2)=f_2(2)$

Relatively small huh? How about if we try $\omega+1$?

$f_{\omega+1}(2)=f_\omega(f_\omega(2))=f_\omega(f_2(2))=f_\omega(8)=f_8(8)$

And now it got significantly bigger, just at $n=2$. We then have

$f_{\omega\cdot2}(2)=f_{\omega+\omega}(2)=f_{\omega+2}(2)$

In general,

$(\omega\cdot(k+1))[n]=\omega\cdot k+n$

And further,

$(\omega^{k+1})[n]=\omega^k\cdot n$

To see how $\omega^3$ is expanded, as an example, see Calculation of $f_{\omega^3}(2)$ in the fast growing hierarchy.
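Here is a hedged Python sketch of the three FGH rules for ordinals below $\omega^\omega$: the tuple `(c0, c1, c2, ...)` encodes $c_0+\omega\cdot c_1+\omega^2\cdot c_2+\cdots$ in Cantor normal form (an encoding of my own choosing), and `fund` implements exactly the fundamental-sequence rules stated above.

```python
# Ordinals below ω^ω: (c0, c1, c2, ...) encodes c0 + ω·c1 + ω²·c2 + ...

def fund(alpha, n):
    """alpha[n] for a limit ordinal: (β + ω^(j+1))[n] = β + ω^j · n."""
    a = list(alpha)
    j = next(i for i, c in enumerate(a) if c)  # least nonzero term; j >= 1 here
    a[j] -= 1
    a[j - 1] = n
    return tuple(a)

def f(alpha, n):
    if not any(alpha):                 # rule 1: f_0(n) = n + 1
        return n + 1
    if alpha[0] > 0:                   # rule 2: f_{β+1}(n) = f_β iterated n times
        beta = (alpha[0] - 1,) + alpha[1:]
        m = n
        for _ in range(n):
            m = f(beta, m)
        return m
    return f(fund(alpha, n), n)        # rule 3: f_α(n) = f_{α[n]}(n)

print(f((0,), 5))    # f_0(5) = 6
print(f((1,), 5))    # f_1(5) = 10
print(f((3,), 2))    # f_3(2) = 2048
print(f((0, 1), 2))  # f_ω(2) = f_2(2) = 8
# f((1, 1), 2) would be f_{ω+1}(2) = f_8(8): far beyond any feasible computation.
```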

What's worse is we can take this to greater heights:

$$\omega^\omega,\omega^{\omega^{\omega^{\dots}}},\text{and beyond}\dots$$

And the beautiful part is that we can define this to go as high as we want. It's all about how far you dare to define the ordinals.

Here is a nice series that not only explains this, but also shows how to expand higher and higher ordinals:

Extremely large numbers

Simply Beautiful Art's guide to large numbers, the fast growing hierarchy, and ordinal collapsing functions.


After having written quite a few programs related to large numbers and recursion over on ppcg.SE, I came up with this really interesting notation, which I now call Simply's Ordinal Array Program (SOAP). I'll explain a tweaked version that's nicer on the eyes than what my actual program does.

Before we begin, we have 2 different types of things we'll deal with.

  • Integers: $\dots,-2,-1,0,1,2,\dots$

  • Ordered sets: $\{1,2,3\}$ or $\{x,y\}$ or $\{-2\}$

Next, we have our ordinal array function:

$$\small f(a,n,b,q)=\begin{cases}q,&a\text{ is a negative integer}\\a-1,&a\text{ is a non-negative integer}\\x,&a=\{x,y,z\},\text{ where $y$ or $z$ are zero}\\\{\{x,y,f(z,n,b,q)\},f(y,n,b,q),x\},&a=\{x,y,z\},\text{ but $y$ and $z$ aren't zero}\\n,&a=\{x,y\},\text{ where $x=0$}\\\{\{f(x,n,x,n),0\},n,n\},&a=\{x,y\},\text{ where $x\ne0$ and $y=0$}\\\{t,n,\{f(y,n,b,t)\}\},&a=\{x,y\},\text{ where $x,y\ne0$ and }t=\{f(x,n,x,n),y\}\\1,&a=\{x\},n=0\\\{f(b,n-1,b,n-1),x\},&a=\{x\}\end{cases}$$

And finally we have our much much simpler ordinal counting function:

$$g(a,n)=\begin{cases}n,&a=0\\g(f(a,n,a,n),n+1),&a\ne0\end{cases}$$

Perhaps a bit of a stretch of what you might call a few lines, but this notation has been well crafted to give you some very very very big numbers, and is strong enough to rival even Deedlit's answer. In fact, $g(\{\{-1,3,1\}\},3n)$ grows approximately as fast as the largest number you can write in Deedlit's $f$ notation using at most $n$ symbols.
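For concreteness, here is a direct Python transcription of the two definitions above, with ordered sets encoded as tuples and integers as Python ints (my encoding choice). It is only safe for tiny, well-formed inputs like the ones shown; anything bigger will not terminate in any reasonable time. The printed values match the analysis below.

```python
def f(a, n, b, q):
    """The ordinal array function, case for case as defined above."""
    if isinstance(a, int):
        return q if a < 0 else a - 1
    if len(a) == 3:
        x, y, z = a
        if y == 0 or z == 0:
            return x
        return ((x, y, f(z, n, b, q)), f(y, n, b, q), x)
    if len(a) == 2:
        x, y = a
        if x == 0:
            return n
        if y == 0:
            return ((f(x, n, x, n), 0), n, n)
        t = (f(x, n, x, n), y)
        return (t, n, (f(y, n, b, t),))
    x, = a                       # len(a) == 1
    if n == 0:
        return 1
    return (f(b, n - 1, b, n - 1), x)

def g(a, n):
    """The ordinal counting function (tail recursion unrolled into a loop)."""
    while a != 0:
        a, n = f(a, n, a, n), n + 1
    return n

print(g(3, 5))               # 8  = n + a
print(g((0, 0), 5))          # 11 = 2n + 1
print(g(((0, 0), 1, 2), 3))  # 17 = 2n + 4k + 3
```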

Some basic analysis:

If $a$ is a non-negative integer, then $g(a,n)=g(a-1,n+1)=\dots=n+a$.

If $a=\{0,0\}$, then $g(a,n)=g(n,n+1)=2n+1$.

If $a=\{\{0,0\},0,k\}$, then $g(a,n)=g(\{0,0\},n+1)=2n+3$.

If $a=\{\{0,0\},1,k\}$, then $g(a,n)=g(\{\{0,0\},1,k-1\},n+2)=\dots=g(\{0,0\},n+2k+1)=2n+4k+3$.

If $a=\{\{0,0\},2,k\}$, then $g(a,n)\approx n\times2^k$.

Okay, not very fast growing so far, but suddenly,

If $a=\{\{0,0\},3,k\}$, then $g(a,n)\approx n\uparrow^kn\approx A(n,n)$, in terms of Knuth's up-arrow notation and the Ackermann function. This is much faster than before.

If $a=\{\{0,0\},3,\{\{0,0\},0,0\}\}$, then $g(a,n)\approx g_n$, with $g_{64}$ being the famous Graham's number.

If $a=\{\{0,0\},4,k\}$, then $g(a,n)\approx f_{\omega^k}(n)$ in the fast growing hierarchy, which is decently fast.

If $a=\{\{0,0\},5,\{0,0\}\}$, then $g(a,n)\approx f_{\varepsilon_0}(2n)$. This is large enough for Peano arithmetic to break down.

If $a=\{\{\{0\},3,\{0,0\}\},2,\{0,0\}\}$, then $g(a,n)\approx TREE(n)$, the famous TREE sequence.

From there, it just keeps going up very rapidly, and

$$g(\{x_n\},n)\approx f_{\psi(\chi(\Omega_{M+1}^{\Omega_{M+1}}))}(n)$$

where

$$x_n=\begin{cases}1,&n=0\\\{x_{n-1},x_{n-1},x_{n-1}\},&n>0\end{cases}$$

which is about as fast as my notation can grow.