Primality testing for 64-bit numbers

I think some version of the (deterministic) Miller-Rabin test should do the trick. It is usually used as a probabilistic test for whopping big numbers, but it can be repurposed as a deterministic test for smaller, fixed ranges.

The core of the Miller-Rabin test is the notion of a "strong probable prime" to some base $a$. Specifically, let $n$ be an odd prime, let $a$ be an integer not divisible by $n$, and write $n - 1 = d \cdot 2^s$ with $d$ odd. Then either $a^d \equiv 1 \pmod{n}$ or $a^{d \cdot 2^r} \equiv -1 \pmod{n}$ for some $0 \leq r < s$. (This follows from Fermat's little theorem, $a^{n-1} \equiv 1 \pmod{n}$, together with the fact that $\pm 1$ are the only square roots of $1$ modulo a prime; the Wikipedia page spells out the argument.)
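As a small worked example: for the prime $n = 29$ we have $n - 1 = 28 = 7 \cdot 2^2$, so $d = 7$ and $s = 2$. Taking $a = 2$ gives $2^7 = 128 \equiv 12 \pmod{29}$, which is not $\pm 1$, but squaring once gives $2^{14} \equiv 12^2 = 144 \equiv -1 \pmod{29}$, so the second condition holds with $r = 1$.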

If $n$ is any odd number and $1 < a < n$, we can run the same test; if it passes, we call $n$ a strong probable prime to base $a$ (and a strong pseudoprime if it is in fact composite). The usual Miller-Rabin test runs this for many different, randomly chosen $a$, backed by a number-theoretic result that for an odd composite $n$ at most a quarter of the possible bases pass the test, so the probability that a composite $n$ survives many rounds is vanishingly small.
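Before getting to the deterministic variant, here is a minimal C sketch of the single-base check. The names `mulmod`, `powmod` and `is_sprp` are my own, and it assumes a compiler with the `unsigned __int128` extension (e.g. GCC or Clang) so the $64 \times 64$-bit products don't overflow:

```c
#include <stdint.h>
#include <stdbool.h>

/* (a * b) mod n without overflow, via a 128-bit intermediate product
 * (GCC/Clang extension). */
static uint64_t mulmod(uint64_t a, uint64_t b, uint64_t n) {
    return (uint64_t)((unsigned __int128)a * b % n);
}

/* a^e mod n by square-and-multiply. */
static uint64_t powmod(uint64_t a, uint64_t e, uint64_t n) {
    uint64_t r = 1;
    a %= n;
    while (e > 0) {
        if (e & 1) r = mulmod(r, a, n);
        a = mulmod(a, a, n);
        e >>= 1;
    }
    return r;
}

/* Strong probable prime test for odd n > 2 with a not a multiple of n.
 * Writes n - 1 = d * 2^s with d odd and checks the two conditions above. */
static bool is_sprp(uint64_t n, uint64_t a) {
    uint64_t d = n - 1;
    int s = 0;
    while ((d & 1) == 0) { d >>= 1; s++; }

    uint64_t x = powmod(a, d, n);
    if (x == 1 || x == n - 1) return true;   /* a^d = +-1 (mod n)      */
    for (int r = 1; r < s; r++) {
        x = mulmod(x, x, n);                 /* x = a^(d*2^r) (mod n)  */
        if (x == n - 1) return true;
    }
    return false;
}
```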

However, if Wikipedia is to be believed (I haven't followed up the reference), then for $n < 2^{64}$ it is sufficient to test $a \in \{2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37\}$. Somehow there is no composite number below $2^{64}$ that is a strong pseudoprime to all of these bases.

The above test is very fast: altogether, prime testing requires at most $12$ modular exponentiations, each costing on the order of $64$ modular multiplications, so roughly $12 \times 64$ multiplications of 64-bit numbers in total (well below 1 ms). For smaller $n$, you could even use a shorter list of bases $a$.
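For completeness, here is a sketch of the full deterministic wrapper (again my own naming, `is_prime_u64`; it reuses `is_sprp` from the sketch above):

```c
/* Deterministic primality test for 64-bit n, using the 12 bases above. */
bool is_prime_u64(uint64_t n) {
    static const uint64_t bases[12] =
        {2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37};

    if (n < 2) return false;
    for (int i = 0; i < 12; i++) {
        if (n == bases[i])     return true;   /* n is one of the small primes */
        if (n % bases[i] == 0) return false;  /* divisible by a small prime   */
    }
    for (int i = 0; i < 12; i++)
        if (!is_sprp(n, bases[i])) return false;
    return true;
}
```

Trial dividing by the twelve bases first means the SPRP loop only ever sees odd $n \ge 41$ coprime to every base, so each base satisfies $1 < a < n - 1$.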


OEIS A014233 says:

> The primality of numbers $< 2^{64}$ can be determined by asserting strong pseudoprimality to all prime bases $\le 37$.

The reference is the recent paper *Strong pseudoprimes to twelve prime bases* by Sorenson and Webster.

For code, see Prime64 and also the primes program in FreeBSD, especially spsp.c.