What is the most efficient algorithm to find the closest prime less than a given number $n$?

Solution 1:

You mention $2^{63}.$ It's known that below $4\cdot10^{18}\approx2^{61.8},$ every interval of 1476 consecutive numbers contains at least one prime. So a reasonable algorithm would be to sieve an interval of numbers just below $n$ (of length up to 2000 if you want to be on the safe side) by the primes up to some small bound, then check the remaining numbers with the Miller-Rabin test. Test any that pass with a primality-proving algorithm; do not use AKS as other answers have suggested, since it is not fast in this range.
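For concreteness, here is a minimal Python sketch of that pipeline. The window length (2000) and the sieving bound (1000) are illustrative choices only, and the final check uses Miller-Rabin with a fixed set of bases that is reported to be deterministic well beyond $2^{64},$ standing in for the probable-prime-plus-proof step described above.

```python
# Minimal sketch of the approach above, assuming n < 2^64.
# Window length (2000) and sieve bound (1000) are illustrative, not tuned.

def is_prime_64(m):
    """Miller-Rabin with fixed bases, reported deterministic for m < 3.3e24."""
    if m < 2:
        return False
    bases = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37]
    for p in bases:
        if m % p == 0:
            return m == p
    d, s = m - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in bases:
        x = pow(a, d, m)
        if x in (1, m - 1):
            continue
        for _ in range(s - 1):
            x = x * x % m
            if x == m - 1:
                break
        else:
            return False            # a witnesses that m is composite
    return True

def prev_prime(n, window=2000, sieve_bound=1000):
    """Largest prime strictly less than n, searched in [n - window, n)."""
    lo = max(2, n - window)
    alive = [True] * (n - lo)       # alive[i] corresponds to candidate lo + i

    # Sieve the window by the primes below sieve_bound.
    for p in (q for q in range(2, sieve_bound) if is_prime_64(q)):
        start = ((lo + p - 1) // p) * p     # first multiple of p in the window
        for m in range(start, n, p):
            if m != p:
                alive[m - lo] = False

    # Check the survivors from the top down.
    for i in range(n - lo - 1, -1, -1):
        if alive[i] and is_prime_64(lo + i):
            return lo + i
    return None                     # window too small: widen and retry

print(prev_prime(2**63))            # largest prime below 2^63
```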

Of course 2000 is overkill; an interval of length 200 should be good enough for 99.9997% of cases. But sieving is fast, so it's not really obvious what length is optimal. In any case, the more important optimization question is how far to sieve (that is, how large a bound to take for the small primes) rather than how long an interval to sieve.
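For a sense of scale, the average gap between primes near $2^{63}$ is about $\ln(2^{63}) = 63\ln 2 \approx 44,$ so an interval of length 200 already spans four or five average gaps and one of length 2000 spans about forty-five.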

As for the primality-proving algorithm, ECPP works. It has been verified computationally that there are no BPSW pseudoprimes below $2^{64},$ so a strong Lucas test following your Miller-Rabin test to base 2 is even faster.
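If you would rather not implement the Lucas step by hand, a library BPSW-style test can serve as the confirmation. The sketch below relies on sympy.isprime, which, to my understanding, uses a strong BPSW-type test for large inputs; treat that choice as an assumption and check your library's documentation.

```python
# Hedged sketch: let a library BPSW-style test confirm the candidates.
# sympy.isprime is assumed to implement a strong BPSW-type test for large
# inputs; verify this against your version's documentation.
from sympy import isprime

def prev_prime_bpsw(n, window=2000):
    """Largest prime below n, found by scanning a window downward."""
    for m in range(n - 1, max(1, n - window), -1):
        if isprime(m):
            return m
    return None                     # widen the window and retry

print(prev_prime_bpsw(2**63))
```

In practice you would still sieve the window first, as above, so that the BPSW-style test is only called on the few survivors.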

Solution 2:

This is an extension of Bill's comment.

It is certainly reasonable to use strong probabilistic tests and then verify with the AKS algorithm. In fact, if we look for primes alternately above and below the target number and test each with AKS, where each primality test takes time at most $\log^{O(1)}N$ and each candidate is prime with probability roughly $\dfrac{1}{\log N},$ one would be at least 99% sure to find a prime in time $\log^{O(1)}N$. But this is not deterministic: conceivably there exists an interval where this works very poorly, taking far longer than polylogarithmic time. To be fair, I suspect such an interval does not exist among the numbers up to $2^{64}$. But speaking generally, a deterministic algorithm might be desired.
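To make the heuristic concrete: if each candidate is prime with probability roughly $\dfrac{1}{\log N},$ independently, then the chance that none of the first $k$ candidates is prime is about $\left(1 - \dfrac{1}{\log N}\right)^k \approx e^{-k/\log N},$ which falls below $1\%$ once $k \gtrsim \log(100)\cdot\log N \approx 4.6\log N$; near $2^{64}$ that is only a couple of hundred candidates.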

To that end, I refer you to the results of Polymath4, "Finding Primes," a collaborative mathematical project that sought a method to deterministically find primes of about a given size. In their paper, they describe deterministic methods that take time $O(N^{0.5 + \epsilon})$, which I think is pretty exciting.

Ultimately, I would say that Bill's suggestion of probabilistically checking numbers and then verifying their primality is optimal. But the promise of a deterministic algorithm is quite alluring too, I think.

Solution 3:

Use sieving to optimize your loop even more.

Guess an interval of reasonable length in which the prime is likely to be found. Initialize a corresponding table with all entries set to "true".

Go over a list of small primes $p_i$ and compute $n \bmod p_i$. This value tells you exactly which table entries correspond to multiples of $p_i$ (namely the candidates $n - j$ with $j \equiv n \pmod{p_i}$), so you can mark those entries as "false".

Now do what Bill suggests: go over the remaining entries, use Miller-Rabin, and then, if you want to be really sure, some deterministic test (not necessarily AKS, which is slow). A sketch of the marking step is below.
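Here is a small Python sketch of the marking step under the conventions above: the candidates are $n-1, n-2, \ldots, n-L,$ and for each small prime $p$ the entries struck out are exactly the offsets $j \equiv n \pmod{p}$. The window length and the list of small primes are placeholder inputs.

```python
# Sketch of the marking step.  alive[j - 1] corresponds to the candidate
# n - j, for j = 1, ..., length.  Window length and prime list are
# placeholder choices.

def sieve_window_below(n, length, small_primes):
    alive = [True] * length
    for p in small_primes:
        r = n % p                   # the value n mod p from the answer
        j = r if r != 0 else p      # smallest j >= 1 with n - j divisible by p
        while j <= length:
            if n - j != p:          # don't strike out the prime p itself
                alive[j - 1] = False
            j += p
    return alive

# Example: survivors among the 200 numbers below 10**6, sieved by a few small primes.
flags = sieve_window_below(10**6, 200, [2, 3, 5, 7, 11, 13, 17, 19])
survivors = [10**6 - (i + 1) for i, ok in enumerate(flags) if ok]
```

The surviving entries are then handed to Miller-Rabin (and, if desired, a proving test), exactly as in the last paragraph.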