
Problem
Given a number $n$ with $2 \leq n \leq 2^{63}$; $n$ could be prime itself. Find the prime $p$ that is closest to $n$.

Using the fact that every prime $p > 2$ is odd and of the form $6k + 1$ or $6k + 5$, I wrote a for-loop checking from $n - 1$ down to 2 whether each number is prime. So instead of checking all numbers, I only need to check odd numbers of the two forms $p \equiv 1 \pmod{6}$ and $p \equiv 5 \pmod{6}$. However, I wonder if there is a faster algorithm to solve this problem, i.e., some constraint that can restrict the range of numbers that need to be checked? Any idea would be greatly appreciated.
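For concreteness, the loop described above might look like this (a sketch; the function names are mine, and `is_prime` is plain trial division, which is the slow part the answers below improve on):

```python
def is_prime(m):
    # Trial division by 2, 3, then by numbers of the form 6k - 1 and 6k + 1
    # up to sqrt(m).
    if m < 2:
        return False
    if m < 4:
        return True
    if m % 2 == 0 or m % 3 == 0:
        return False
    d = 5
    while d * d <= m:
        if m % d == 0 or m % (d + 2) == 0:
            return False
        d += 6
    return True

def closest_prime_below(n):
    # Walk down from n - 1, skipping every candidate that is not 2, 3,
    # or congruent to 1 or 5 mod 6.
    for m in range(n - 1, 1, -1):
        if m in (2, 3) or m % 6 in (1, 5):
            if is_prime(m):
                return m
    return None
```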

  • 0
**Define "most efficient."** Do you refer to the *asymptotic* time complexity of the algorithm as the input gets arbitrarily large? For *practical* usage, keep in mind [the following quote:](http://www.catb.org/esr/writings/taoup/html/ch01s06.html) "Fancy algorithms are slow when $n$ is small, and $n$ is usually small. Fancy algorithms have big constants. Until you know that $n$ is frequently going to be big, don't get fancy." — Rob Pike (2017-01-04)

3 Answers

3

Use sieving to optimize your loop even more.

Guess an interval of reasonable length in which your prime is likely to be found. Initialize a corresponding table with all entries set to "true".

Go over a list of small primes $p_i$, and compute $n \mod p_i$. Using that value, you can mark off some of the entries in the table as "false".

Now do what Bill suggests - go over the remaining entries, use Miller-Rabin and then if you want to be really sure, some deterministic test (not necessarily AKS, which is slow).
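The sieving step above can be sketched as follows (a sketch under my own naming; the interval length and the list of small primes are illustrative choices, and the survivors would then be handed to Miller-Rabin):

```python
def sieve_candidates(n, length=2000,
                     small_primes=(2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)):
    # table[i] records whether the candidate n - 1 - i might still be prime.
    table = [True] * length
    for p in small_primes:
        # n - k is divisible by p exactly when k ≡ n (mod p), so a single
        # modular reduction locates every multiple of p in the interval.
        k = n % p or p               # smallest k >= 1 with p | n - k
        for j in range(k, length + 1, p):
            if n - j != p:           # never cross off p itself
                table[j - 1] = False
    # Candidates that survived the sieve, largest first.
    return [n - 1 - i for i, alive in enumerate(table) if alive]
```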

8

You mention $2^{63}.$ It's known that below $4\cdot10^{18}\approx2^{61.8},$ there is at least one prime in every interval of 1476 consecutive numbers. So a reasonable algorithm would be to sieve an interval below $n$ (of length up to 2000 if you want to be on the safe side) by small primes up to some small bound, then check the remaining numbers with the Miller-Rabin test. Test any that pass with a primality-proving algorithm—do not use AKS as other answers have suggested; it is not fast in this range.

Of course 2000 is overkill; an interval of length 200 should be good enough for 99.9997% of cases. But sieving is fast so it's not really obvious what length is optimal. In any case the more important optimization issue is deciding how far to sieve rather than how long of an interval to sieve.

As for the primality-proving algorithm, ECPP works. It has recently been verified that there are no BPSW pseudoprimes below $2^{64},$ so a strong Lucas test following your Miller-Rabin test to base 2 (i.e., the BPSW test) is even faster.
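A related consequence of this kind of exhaustive verification: Miller-Rabin with the first twelve prime bases is known to be deterministic for every $n < 2^{64}$. A minimal sketch (the function name is mine):

```python
def is_prime_64(n):
    # Miller-Rabin; with these fixed bases the test is provably correct
    # for all n < 2**64, so no further proving step is needed in that range.
    bases = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    if n < 2:
        return False
    for p in bases:
        if n % p == 0:
            return n == p
    # Write n - 1 = d * 2**s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in bases:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False  # a is a witness: n is composite
    return True
```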

  • 0
@saru95 I don't think so, not at that size. Faster to sieve out the easy composites and test the rest rather than doing billions of divisions. (2016-06-11)
5

This is an extension of Bill's comment.

It is certainly reasonable to use strong probabilistic tests and then verify with the AKS algorithm. In fact, if we look for primes alternately above and below the target number and test with AKS, where each primality test takes time at most $\log^{O(1)}N$, and each candidate is prime with probability roughly $\dfrac{1}{\log N}$, one would be at least 99% sure to find a prime in $\log^{O(1)}N$ time. But this is not deterministic - conceivably, there exists an interval where this works very poorly, far worse than polylogarithmic time. To be fair, I suspect such an interval does not exist in the numbers up to $2^{64}$. But speaking generally, a deterministic algorithm might be desired.
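The alternating above/below search can be sketched as follows (helper names are mine; I use a random-base Miller-Rabin as the strong probabilistic test rather than AKS, and a proving step would follow in practice; on a tie, the candidate below is returned):

```python
import random

def probably_prime(n, rounds=40):
    # Random-base Miller-Rabin; error probability at most 4**-rounds.
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False
    return True

def closest_prime(n):
    # Test n itself, then n-1, n+1, n-2, n+2, ... alternately; the expected
    # number of candidates tried is O(log n) by the prime number theorem.
    if probably_prime(n):
        return n
    for d in range(1, n):
        if probably_prime(n - d):
            return n - d
        if probably_prime(n + d):
            return n + d
```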

To that end, I refer you to the results of Polymath4, "Finding Primes," a collaborative mathematics project that sought a method to deterministically find primes of about a given size. In their paper, they describe deterministic methods that take time $O(N^{1/2 + \epsilon})$, which I think is pretty exciting.

Ultimately, I would say that Bill's suggestion of probabilistically checking numbers and then verifying their primality is optimal. But the promise of a deterministic algorithm is quite alluring, too.

  • 0
@mixedmath, you write "Whenever I find primes of a large size, I always just search using Miller-Rabin and AKS." Could I ask what "large" is, and what AKS implementation you use? I'm not aware of any that are practical beyond, say, 100 digits. All are horrendously slow compared to APR-CL or ECPP (or even BLS75 for < 80 digits). (2015-05-26)