How hard is it to do arithmetic?

People in computing are often observed saying that a computation takes $\operatorname{O}(n^3\log n)$ steps, or that it's NP-hard, or that it's not computable, or that it's primitive recursive, etc. I took a course that covered some of the material on the boundary between what's computable and what's not, along the lines of Hartley Rogers' book, but I know next to nothing about computational complexity for things well within the boundary of what is computable (a topic neglected by Rogers' book). But I'm curious about the following question about arithmetic.

An example: Say I'm confronted with the problem of finding the prime factorization of $667$. Unless it's prime itself, it's divisible by a prime number no bigger than its own square root, and the largest such is $23$. After ruling out, by various means, divisibility by all primes less than $23$, I consider that one. Now $3$ and $7$ are complementary relative to $10$, so if I add $23$ to $667$ I reduce the problem to whether a two-digit number is divisible by $23$. $667+23=690$, so that two-digit number is $69$, and that's obviously divisible by $23$. Therefore $667=23\times\text{something}$. Since $69=23\times3$, we've got $690=23\times30$, and this is $23$ less than that, so $667=23\times29$, and $29$ is prime, so we're done. BUT one checks the result by doing the multiplication by other methods, and so I notice that halfway between $23$ and $29$ is $26$, so $23\times29=(26+3)(26-3)$. Factoring a difference of two squares is something everybody learned in 7th grade (except those who didn't....) and so it's $26^2-3^2$. I know that $26^2 = 676$, and $3^2 = 9$, and guess what $676 - 9$ is? So it works. And of course I can also do it via the distributive law, the way they taught you in third grade.
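For comparison, here is a minimal Python sketch of the "plodding" baseline that all of these shortcuts are competing against: plain trial division by every candidate up to $\sqrt{n}$, followed by the difference-of-squares check from the example. This is only an illustration of the baseline, not the mental procedure described above.

```python
# A minimal sketch of the "plodding" baseline: trial division by every
# candidate d with d*d <= n.  No shortcuts, no cleverness.

def trial_division(n):
    """Return the prime factorization of n as a list of primes."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)   # whatever is left over is prime
    return factors

print(trial_division(667))   # [23, 29]

# The difference-of-squares check from the example above:
# 667 = 26^2 - 3^2 = (26 - 3)(26 + 3) = 23 * 29.
assert 26**2 - 3**2 == 667 == 23 * 29
```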

The question is this: if one figured out, in each instance, what the most efficient thing to do is among all the arithmetic shortcuts and techniques that might cross one's mind, would the ten or fifteen or so seconds that it takes to do all of the above suddenly expand into more time than it would take to just plod through the prescribed algorithms learned in childhood? Often, within seconds after doing something like this, I think of two or three other shortcuts that would have been quicker. But if I stopped to evaluate all the possibilities, it would take me more time than just doing what I described above and not worrying about the alternatives that hadn't occurred to me. So the question is whether this informal observation can be made precise and proved.

Warning: One of the comments raises in my mind the question of whether this will be widely misunderstood. I am NOT asking whether using shortcuts that come to mind on the fly is quicker than using the algorithms you've been taught at your mother's knee. I already know that it is. Otherwise I couldn't have written what I did above. What I'm asking is this: Would it take more time to figure out which way is quickest than to use what comes to mind?

Another clarification: I said "ten or fifteen or so seconds". I suspect that doing all that I described above explicitly probably took me less than ten seconds, but actually eliminating the primes less than 23 before that probably took me more than twice that long.


Factorization and arithmetic are two wildly different subjects.

The complexity of arithmetic is reasonably well understood. You might think that arithmetic (say, addition, subtraction, multiplication, division, and raising to a power) is trivial, but multiplying large numbers is not. The algorithm used in practice to multiply really large numbers (Schönhage-Strassen) was only invented in 1971, and more recently (2007) an asymptotically faster algorithm was proposed by Fürer, though it isn't faster in practice even for huge numbers.
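To give a concrete sense of what a non-trivial multiplication algorithm looks like, here is a sketch of Karatsuba's method (1960), a much simpler forerunner of Schönhage-Strassen that already beats the schoolbook $O(n^2)$ algorithm. It is an illustration only, not the algorithm mentioned above.

```python
# Illustration only: Karatsuba multiplication (1960).  It replaces one
# n-digit multiplication with three multiplications of roughly n/2 digits,
# giving about O(n^1.585) digit operations instead of the schoolbook O(n^2).

def karatsuba(x, y):
    if x < 10 or y < 10:                  # base case: a single-digit operand
        return x * y
    m = max(len(str(x)), len(str(y))) // 2
    base = 10 ** m
    x_hi, x_lo = divmod(x, base)
    y_hi, y_lo = divmod(y, base)
    a = karatsuba(x_hi, y_hi)                          # high parts
    c = karatsuba(x_lo, y_lo)                          # low parts
    b = karatsuba(x_hi + x_lo, y_hi + y_lo) - a - c    # cross terms, one multiplication saved
    return a * base**2 + b * base + c

assert karatsuba(23, 29) == 667
assert karatsuba(123456789, 987654321) == 123456789 * 987654321
```

Schönhage-Strassen pushes the same divide-and-conquer idea much further by using the fast Fourier transform.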

It gets even worse for factorization. There are various "tricks" used to factor numbers, and they rely on non-obvious algebraic number theory. The two most widely used algorithms today are Lenstra's elliptic curve factorization and the number field sieve (the number fields in question are $\mathbb{Q}$ adjoined with a root of some high-degree polynomial).
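Neither of those algorithms fits in a few lines, but an older and much simpler trick, Pollard's rho method (1975), already shows the flavor of how factoring avoids plain trial division. The sketch below is an illustration of that simpler method, not of the elliptic curve method or the number field sieve.

```python
# Illustration: Pollard's rho method (1975), a far simpler factoring "trick"
# than ECM or the number field sieve.  It finds a non-trivial factor of a
# composite n by detecting a cycle in the sequence x -> x^2 + c (mod n) and
# taking a gcd; it never tries divisors one by one.
from math import gcd

def pollard_rho(n, c=1):
    """Return a non-trivial factor of the composite n, or None (retry with another c)."""
    x = y = 2
    d = 1
    while d == 1:
        x = (x * x + c) % n       # "tortoise": one step
        y = (y * y + c) % n       # "hare": two steps
        y = (y * y + c) % n
        d = gcd(abs(x - y), n)
    return d if d != n else None

print(pollard_rho(667))    # 23  (so 667 = 23 * 29)
print(pollard_rho(8051))   # 97  (so 8051 = 83 * 97)
```

The elliptic curve method and the number field sieve are far more sophisticated, but they share the goal of extracting a factor without testing divisors one by one.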

Wikipedia has a nice summary of the best known results. There are also books on the subject; check Wikipedia's page on computational number theory or, for a different selection of topics, Algebraic Complexity Theory.


Would it take more time to figure out which way is quickest than to use what comes to mind?

You cannot separate solving the problem from figuring out which way is quickest, since the quickest method for each instance is determined by the answer itself. In your instance the quickest method is simply to divide by $23$.