Possible Duplicate:
Efficient storage of prime numbers

Just an example of what I am looking for: I could represent every odd number with a bit, e.g. for the given range of numbers (1, 10], starting at 3:

1110

This dictionary can be squeezed further, right? With some work I could eliminate multiples of five, but numbers ending in 1, 3, 7 or 9 must always be present in the array of bits. I hope this clarifies what I want.
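The odd-only bitmap described above could be sketched like this in Python (the trial-division helper is only there to keep the example self-contained; it is an illustration, not a memory-optimal implementation):

```python
def _is_prime(n):
    """Naive trial division, used only to fill the table."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def build_odd_bitmap(limit):
    """One bit per odd number 3, 5, 7, ... <= limit."""
    bits = 0
    for i, n in enumerate(range(3, limit + 1, 2)):
        if _is_prime(n):
            bits |= 1 << i
    return bits

def lookup(bits, n):
    """Answer 'is n prime?' from the bitmap (valid for n <= limit)."""
    if n == 2:
        return True
    if n < 2 or n % 2 == 0:
        return False
    return bool(bits >> ((n - 3) // 2) & 1)

# bits for 3, 5, 7, 9 -> 1, 1, 1, 0 (the "1110" from the question)
bitmap = build_odd_bitmap(10)
```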

I am looking for the best algorithm to check whether a number is prime, i.e. a boolean function:

bool isprime(number);

I would like to know the best algorithm to implement this functionality. Naturally, there will be a data structure I can query. I define the best algorithm to be the one that produces a data structure with the lowest memory consumption for the range (1, N], where N is a constant.

+1  A: 

According to Wikipedia, the Sieve of Eratosthenes has complexity O(n * (log n) * (log log n)) and requires O(n) memory, so it's a pretty good place to start if you aren't testing especially large numbers.
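For reference, a straightforward sieve might look like this (a sketch, not tuned for memory):

```python
def sieve(n):
    """Return a list of all primes <= n (Sieve of Eratosthenes)."""
    if n < 2:
        return []
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    p = 2
    while p * p <= n:
        if is_prime[p]:
            # cross off multiples of p, starting at p*p
            # (smaller multiples were crossed off by smaller primes)
            for m in range(p * p, n + 1, p):
                is_prime[m] = False
        p += 1
    return [i for i, v in enumerate(is_prime) if v]
```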

matt b
Sorry, I know my description is vague; I am not looking for SoE :) Look at my comment @Michael
AraK
+2  A: 

There are many ways to do the primality test.

There isn't really a data structure for you to query. If you have lots of numbers to test, you should probably run a probabilistic test since those are faster, and then follow it up with a deterministic test to make sure the number is prime.

You should know that the math behind the fastest algorithms is not for the faint of heart.
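The probabilistic-then-deterministic pipeline could be sketched like this (using a base-2 Fermat test as the cheap filter and trial division as the deterministic confirmation; the choice of base and of confirmation test is illustrative, not a vetted recommendation):

```python
def fermat_probably_prime(n, base=2):
    """Cheap probabilistic filter: n passes if base^(n-1) == 1 (mod n)."""
    return pow(base, n - 1, n) == 1

def trial_division_prime(n):
    """Slow but deterministic confirmation."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def isprime(n):
    if n < 2:
        return False
    if n in (2, 3):
        return True
    # probabilistic test first: most composites are rejected here quickly
    if not fermat_probably_prime(n):
        return False
    # deterministic test only for the few survivors
    # (catches Fermat pseudoprimes such as 341)
    return trial_division_prime(n)
```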

Ben S
Thanks Ben, I didn't think of 'storage' when I wrote the post >_<
AraK
A: 

You could use large bitfields, maybe an array of 64-bit numbers?

Mongoose
A: 

Smallest memory? This isn't smallest, but is a step in the right direction.

class PrimeDictionary {
    BitArray bits;

    public PrimeDictionary(int n) {
        bits = new BitArray(n + 1);
        for (int i = 0; 2 * i + 3 <= n; i++) {
            bits.Set(i, CheckPrimality(2 * i + 3));
        }
    }

    public PrimeDictionary(IEnumerable<int> primes) {
        bits = new BitArray(primes.Max());
        foreach(var prime in primes.Where(p => p != 2)) {
            bits.Set((prime - 3) / 2, true);
        }
    }

    public bool IsPrime(int k) {
        if (k < 2) {
            return false;  // guard: k == 1 would index bits[-1] below
        }
        if (k == 2) {
            return true;
        }
        if (k % 2 == 0) {
            return false;
        }
        return bits[(k - 3) / 2];
    }
}

Of course, you have to specify the definition of CheckPrimality.

Jason
+2  A: 

The best method, in my opinion, is to use what's gone before.

There are lists of the first N primes on the internet, with N stretching up to at least fifty million. Download the files and use them; it's likely to be much faster than any other method you'll come up with.

If you want an actual algorithm for making your own primes, Wikipedia has all sorts of good stuff on primes here, including links to the various methods for doing it, and prime testing here, both probability-based and fast-deterministic methods.

There should be a concerted effort to find the first billion (or even more) primes and get them published on the net somewhere so people can stop doing this same job over and over and over and ... :-)

paxdiablo
+2  A: 

The fastest algorithm for general prime testing is AKS. The Wikipedia article describes it at length and links to the original paper.

If you want to find big primes, look into numbers with special forms, like Mersenne primes.
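For Mersenne candidates 2^p - 1 there is a dedicated test, the Lucas-Lehmer test; a sketch (assuming p is an odd prime):

```python
def lucas_lehmer(p):
    """True iff the Mersenne number 2**p - 1 is prime (p an odd prime)."""
    m = (1 << p) - 1          # the Mersenne number under test
    s = 4
    for _ in range(p - 2):    # iterate s -> s^2 - 2 (mod m), p-2 times
        s = (s * s - 2) % m
    return s == 0
```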

The algorithm I usually implement (easy to understand and code) is as follows (in Python):

def isprime(n):
    """Returns True if n is prime"""
    if n < 2: return False
    if n == 2: return True
    if n == 3: return True
    if n % 2 == 0: return False
    if n % 3 == 0: return False

    i = 5
    w = 2
    while i * i <= n:
        if n % i == 0:
            return False

        i += w
        w = 6 - w

    return True

It's a variant of the classic O(sqrt(N)) algorithm. It uses the fact that every prime (except 2 and 3) is of the form 6k - 1 or 6k + 1, and tests only divisors of this form.

Sometimes, if I really want speed and the range is limited, I implement a pseudo-prime test based on Fermat's little theorem. If I want even more speed (i.e. to avoid the O(sqrt(N)) algorithm altogether), I precompute the false positives (see Carmichael numbers) and do a binary search. This is by far the fastest test I've ever implemented; the only drawback is that the range is limited.
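The precomputed-false-positives idea might be sketched like this: build the base-2 Fermat pseudoprimes once for a fixed range, after which each query is one modular exponentiation plus a binary search (the limit and base here are illustrative):

```python
import bisect

LIMIT = 10_000  # the precomputed table is only valid up to here

def _trial_prime(n):
    """Slow check, used only while building the table."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

# odd composites <= LIMIT that pass the base-2 Fermat test (false positives)
pseudoprimes = [n for n in range(3, LIMIT + 1, 2)
                if pow(2, n - 1, n) == 1 and not _trial_prime(n)]

def isprime(n):
    """Fast test, only valid for n <= LIMIT."""
    if n == 2:
        return True
    if n < 2 or n % 2 == 0:
        return False
    if pow(2, n - 1, n) != 1:   # fails Fermat => definitely composite
        return False
    # passes Fermat: prime unless it is a precomputed false positive
    i = bisect.bisect_left(pseudoprimes, n)
    return not (i < len(pseudoprimes) and pseudoprimes[i] == n)
```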

Alexandru
Strictly speaking, Carmichael numbers are not sufficient. Your pseudo-prime test must also not inadvertently miss the right base required to disprove FLT. The strong pseudo-prime algorithm is superior in that there are no "Carmichaels" with respect to it, but you still cannot be sure unless you have checked at least one quarter of the range. If you are not range-limited, then it seems to me that using SPP plus something like Pollard rho to classify the vast majority of numbers on a first pass, before using something more heavy-duty, is the right approach.
Paul Hsieh
@Paul Hsieh: modified the wording, do you find this better now?
Alexandru
+1  A: 

I have implemented a fast primality tester here:

primeat.zip

It works perfectly up to 32 bits, using the strong pseudo-prime test. It uses early-out divisions via some clever GCDs, so it's fairly fast most of the time. I have checked it against a full sieve (which takes some 200 MB to store) to make sure it is correct over the entire 32-bit range.
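One round of the strong pseudo-prime (Miller-Rabin) test this answer relies on can be sketched as follows (a single base only; a real tester combines several bases, and this sketch is not the linked implementation):

```python
def strong_probable_prime(n, base):
    """One Miller-Rabin round: True if n passes the strong test to base."""
    if n < 3:
        return n == 2
    if n % 2 == 0:
        return False
    # write n - 1 as d * 2^s with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    x = pow(base, d, n)
    if x == 1 or x == n - 1:
        return True
    # square up to s-1 times, looking for -1 (mod n)
    for _ in range(s - 1):
        x = x * x % n
        if x == n - 1:
            return True
    return False
```

A composite that passes for some base (e.g. 2047 for base 2) is a strong pseudoprime to that base, which is why several bases, or a verification sieve as described above, are needed.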

Paul Hsieh