The goal of the four fours puzzle is to represent each natural number using four copies of the digit $4$ and common mathematical symbols.

For example, $165=\left(\sqrt{4} + \sqrt{\sqrt{{\sqrt{4^{4!}}}}}\right) \div .4$.
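As a quick sanity check, the example can be verified with exact rational arithmetic; here is a minimal Python sketch, with the simplified intermediate values noted in the comments.

```python
from fractions import Fraction

# sqrt(4) = 2;  sqrt(sqrt(sqrt(4**4!))) = 4**(24/8) = 4**3 = 64;  .4 = 2/5
sqrt_four = Fraction(2)
triple_root = Fraction(4) ** 3
point_four = Fraction(2, 5)

print((sqrt_four + triple_root) / point_four)   # 165
```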

If we remove the restriction on the number of fours, let $f(N)$ be the number of fours required to represent every positive integer no greater than $N$. What is the asymptotic behaviour of $f(N)$? Can it be shown that $f(N) \sim r \log N$ for some $r$?

To be specific, let’s restrict the operations to the following:

  • addition: $x+y$
  • subtraction: $x-y$
  • multiplication: $x\times y$
  • division: $x\div y$
  • exponentiation: $y^x$
  • roots: $\sqrt[x]{y}$
  • square root: $\sqrt{x}$
  • factorial: $x!$
  • decimal point: $.4$
  • recurring decimal: $. \overline 4$

It is easy to see that $f(N)$ is $O(\log N)$. For example, with four fours, every number up to $102$ can be represented (see here for a tool for generating solutions). So, since $96 = 4\times4!$, we can use $6k-2$ fours (four for each digit $a_i$ and two for each $96$) in the form $(\dots((a_1\times 96+a_2)\times 96+a_3)\dots)\times96+a_k$ to represent every number up to $96^k$.
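In code, the construction looks roughly like the sketch below; the `FOUR_FOURS` table of four-fours expressions for the base-$96$ digits is hypothetical (only a few entries are shown) and is the sort of thing the solution-generating tool produces.

```python
# Sketch of the 6k-2 construction: write n in base 96 and nest the digits.
# FOUR_FOURS maps each digit 0..95 to a four-fours expression; only a few
# illustrative entries are filled in here.
FOUR_FOURS = {
    0: "(44 - 44)",
    1: "(44 / 44)",
    2: "(4/4 + 4/4)",
    # ... remaining digits up to 95 ...
}

def represent(n: int) -> str:
    """Return a fours expression for n in the nested base-96 form."""
    digits = []
    while n:
        n, d = divmod(n, 96)
        digits.append(d)
    digits = list(reversed(digits)) or [0]   # most significant digit first
    expr = FOUR_FOURS[digits[0]]
    for d in digits[1:]:
        # 96 = 4 * 4!; cost: 4 fours per digit plus 2 per 96, i.e. 6k-2 in total
        expr = f"(({expr}) * (4 * 4!) + {FOUR_FOURS[d]})"
    return expr
```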

On the other hand, we can try to count the number of distinct expressions that can be made with $k$ fours. For example, if we (arbitrarily) permit factorial only to be applied to the digit $4$, and allow no more than two successive applications of the square root operation, we get $\frac{216^k}{18}C_{k-1}$ distinct expressions where $C_k$ is the $k$th Catalan number. (Of course, many of these expressions won’t represent a positive integer, many different expressions will represent the same number, and the positive integers generated won’t consist of a contiguous range from $1$ to some $N$.)

Using Stirling’s formula, for large $k$, this is approximately $\frac{864^k}{72k\sqrt{\pi k}}$. Since at most this many values can be reached with $k$ fours, covering every number up to $N$ under these restrictions requires roughly $k \ge \log N/\log 864$ fours. So for $f(N)$ to grow more slowly than logarithmically, we’d need to remove the restrictions on the use of unary operations. (It is well known that the use of logs enables any number to be represented with only four fours.)
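To make the counting argument concrete, here is a small sketch that evaluates the exact count $\tfrac{216^k}{18}C_{k-1}$, compares it with the Stirling estimate, and reads off the implied lower bound on $k$ (all under the stated restrictions on the unary operations).

```python
import math

def catalan(n: int) -> int:
    return math.comb(2 * n, n) // (n + 1)

def expression_count(k: int) -> int:
    # One way to read off 216**k / 18 * C_{k-1}: 12 choices per leaf
    # (4, 4!, .4, .4 recurring, each under 0-2 square roots), 18 choices per
    # internal node (6 binary operations, then 0-2 square roots), and C_{k-1}
    # binary-tree shapes on k leaves: 12**k * 18**(k-1) * C_{k-1}.
    return 12 ** k * 18 ** (k - 1) * catalan(k - 1)

def stirling_estimate(k: int) -> float:
    return 864 ** k / (72 * k * math.sqrt(math.pi * k))

for k in (4, 8, 16):
    print(k, expression_count(k), f"{stirling_estimate(k):.3e}")

# At most expression_count(k) values are reachable with k fours, so covering
# 1..N needs roughly 864**k >= N, i.e. k >= log(N) / log(864).
print(math.log(10 ** 12) / math.log(864))   # rough fours needed for N = 10**12
```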

Can this approach be extended to show that $f(N)$ is $\Omega(\log N)$? Or does unrestricted use of factorial and square roots mean that $f(N)$ is actually $o(\log N)$? Is the answer different if the use of $x\%$ (percentages) is also permitted?


I'm one of the authors of the paper referenced by David Bevan in his comment. The four fours puzzle was one inspiration for that problem, although others have thought about it as well. The specific version of the problem there looks at the minimum number of $1$s needed to represent $n$ when one is allowed only addition and multiplication but any number of parentheses. Call this $g(n)$. For example, $g(6) \le 5$, since $6=(1+1)(1+1+1)$, and it isn't hard to show that $g(6)=5$. Even in this limited version of the problem, it is generally difficult even to obtain asymptotics.

In some sense, most natural questions about asymptotic growth are contained in this question, since one can write any given $k$ as $1+1+\cdots+1$ ($k$ times), and $1=k/k$. Thus, starting with some $k$ other than $1$ (such as $k=4$), the asymptotics stay within a constant factor, assuming that addition and division are allowed.

However, actually calculating this sort of thing for any set of operations is generally difficult. In the case of integer complexity there is a straightforward way of doing so, since if one calculates $g(i)$ for all $i < n$, calculating $g(n)$ is then doable. This doesn't apply in general when one has other operations, with division and subtraction already making an algorithm difficult. In this case one can still construct such an algorithm, but exactly how to do so is more subtle. In fact, as long as one is restricted to binary operations this is doable (proof sketch: do what you did to look at all distinct expressions).
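For the addition-and-multiplication case, the straightforward algorithm is a simple dynamic programme over $1,\dots,n$; the sketch below is my own illustration (not code from the paper), using the standard fact that an optimal expression splits at the top into a sum or a product of smaller values.

```python
def integer_complexity(limit: int) -> list[int]:
    """g[n] = minimum number of 1s needed to write n using only + and *."""
    g = [0] * (limit + 1)
    g[1] = 1
    for n in range(2, limit + 1):
        # best top-level split as a sum a + (n - a) ...
        best = min(g[a] + g[n - a] for a in range(1, n // 2 + 1))
        # ... or as a product d * (n / d)
        for d in range(2, int(n ** 0.5) + 1):
            if n % d == 0:
                best = min(best, g[d] + g[n // d])
        g[n] = best
    return g

g = integer_complexity(100)
print(g[6])   # 5, matching 6 = (1+1)(1+1+1)
```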

Adding in non-binary operations makes everything even tougher. Adding in square roots won't make things that much harder, nor will adding factorial by itself; the pair of them together, however, makes calculating specific values much more difficult. My guess would be that even with factorial, square root and the four binary operations there are numbers which require arbitrarily large numbers of $1$s, but I also suspect that this would be extremely difficult to prove. Note that this is already substantially weaker than what you are asking, namely whether the order of growth is $\log n$. Here, though, square roots probably don't alter things at all; for them to matter one would need a lot of numbers of the form $n^{2^k}$ with surprisingly low complexity. This seems unlikely.


You can get $103$ with five $4$s as $$\frac {\sqrt{\sqrt{\sqrt{4^{4!}}}}+4+\sqrt{.\overline4}}{\sqrt{.\overline4}}=103$$

For four $4$s, we have $\dfrac {44}{.\overline 4}+4=103$.
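Both expressions check out with exact rational arithmetic; a quick sketch, using $\sqrt{.\overline4}=\tfrac23$ and $\sqrt{\sqrt{\sqrt{4^{4!}}}}=4^3=64$:

```python
from fractions import Fraction

rec4 = Fraction(4, 9)              # .4 recurring
root_rec4 = Fraction(2, 3)         # sqrt(.4 recurring)
triple_root = Fraction(4) ** 3     # sqrt(sqrt(sqrt(4**4!))) = 4**3 = 64

print((triple_root + 4 + root_rec4) / root_rec4)   # 103 with five 4s
print(Fraction(44) / rec4 + 4)                     # 103 with four 4s
```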

In fact, $113$ is the first number I can't get with four $4$s.
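This kind of search can be partially mechanised. The sketch below is heavily simplified (the four binary operations only, with factorial and square root folded into a fixed list of single-four leaf values), so it reaches only a subset of what four $4$s can genuinely produce and cannot by itself confirm the claim about $113$; it merely lists candidates for a fuller search.

```python
from fractions import Fraction
from functools import lru_cache

# Values obtainable from a single 4: 4, 4!, .4, .4 recurring, sqrt(4),
# sqrt(.4 recurring). Square roots are applied only where the result stays
# rational, and unary operations on larger subexpressions are ignored,
# so this badly under-counts what is really reachable.
LEAVES = (Fraction(4), Fraction(24), Fraction(2, 5), Fraction(4, 9),
          Fraction(2), Fraction(2, 3))

@lru_cache(maxsize=None)
def values(k: int) -> frozenset:
    """Rationals reachable with exactly k fours using +, -, *, / only."""
    if k == 1:
        return frozenset(LEAVES)
    result = set()
    for i in range(1, k):
        for a in values(i):
            for b in values(k - i):
                result.update((a + b, a - b, a * b))
                if b != 0:
                    result.add(a / b)
    return frozenset(result)

reachable = set().union(*(values(k) for k in range(1, 5)))
# Integers up to 120 that this restricted search does not find
print([n for n in range(1, 121) if n not in reachable][:10])
```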


Go here to look at a YouTube video discussing the matter. It proves that for all $n\in\mathbb{Z}^+$,

$$\log_{\frac 12}(\log_4\sqrt [n]{4}) = n.$$

But notice that $\dfrac 12 = \dfrac 24 = \dfrac{\sqrt{4}}{4}$ so...

Edit: by $n^\text{th}$ root I mean $n$ successive square roots. So, for example, $\sqrt [2] {4} = \sqrt{\sqrt{4}}$ (because it is simpler to write it this way).
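With that convention ($\sqrt[n]{4}$ meaning $n$ nested square roots, i.e. $4^{1/2^n}$), the identity is easy to check numerically; a small sketch:

```python
import math

def four_fours_log_trick(n: int) -> float:
    """log_{1/2}( log_4( 4**(1/2**n) ) ), which should come out to n."""
    nested_roots = 4 ** (1 / 2 ** n)        # n nested square roots of 4
    return math.log(math.log(nested_roots, 4), 1 / 2)

for n in range(1, 8):
    print(n, round(four_fours_log_trick(n), 9))
```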