tags:
views: 1075
answers: 8
+3  A: 

BigInteger

KTC
+7  A: 

You should use BigInteger if you need to deal with integers outside the 32/64-bit range. I don't know whether there are practical limits beyond that that you'll need to worry about.
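As a quick standalone illustration (not from the original answer), BigInteger happily represents values far past the 64-bit range of a long:

```java
import java.math.BigInteger;

public class BigIntegerRange {
    public static void main(String[] args) {
        // A 30-digit value, well past the 63 usable bits of a signed long.
        BigInteger a = new BigInteger("123456789012345678901234567890");
        System.out.println(a.bitLength());          // 97 bits
        System.out.println(a.multiply(a).signum()); // arithmetic just works
    }
}
```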

AlBlue
+2  A: 

A 1000-digit number uses less than 350 bytes of memory with BigDecimal. You should find you can handle numbers much larger than this.

The problem you will find is that you need to check a lot of numbers, about 10^31, which will take a long time: about 10^18 years.

Peter Lawrey
A: 

You can combine several ideas:

  • The 6k +/- 1 wheel
  • Checking divisors only up to sqrt(n)
  • Dividing only by primes

Combining those methods speeds up your validation a lot.
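A minimal sketch combining the first two ideas (trial division with the 6k +/- 1 wheel, stopping at the square root); the class and method names are my own. As another answer points out, this approach is hopeless for 1000-digit numbers, so it's shown here on machine-sized longs:

```java
public class TrialDivision {
    // Trial division: handle 2 and 3, then test only candidates of the
    // form 6k - 1 and 6k + 1, stopping once the divisor exceeds sqrt(n).
    public static boolean isPrime(long n) {
        if (n < 2) return false;
        if (n < 4) return true;                  // 2 and 3 are prime
        if (n % 2 == 0 || n % 3 == 0) return false;
        // Every prime greater than 3 has the form 6k - 1 or 6k + 1.
        for (long d = 5; d * d <= n; d += 6) {
            if (n % d == 0 || n % (d + 2) == 0) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(isPrime(97));  // true
        System.out.println(isPrime(91));  // false (7 * 13)
    }
}
```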

This link can help you: http://www.osix.net/modules/article/?id=791

And of course use BigInteger.

backslash17
+3  A: 

6k +/- 1 is not a primality test! If you already know the number Q is prime (and greater than 7), knowing that it's of the form 6k +/- 1 tells you whether it's "safe" -- that Q+1 and Q-1 will both have large factors, making Q more difficult to factor (thus "safe" for cryptographic purposes). But most numbers of the form 6k +/- 1 will be composite.

"Safe Prime" page from Wikipedia

If you want to write your own routine for testing 1000-digit numbers for primality, you'll want to use the BigInteger class as the other answers have suggested. You could use the Fermat test first, which will tell you whether the number is "definitely composite" or "probably prime". You could then run a more computationally intensive test like Miller-Rabin or Solovay-Strassen on the "probably prime" numbers for the final definitive test.
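A rough sketch of the Fermat filter using BigInteger.modPow (the method name is mine). A number that passes is only "probably prime": Carmichael numbers such as 561 fool the test for every base coprime to them.

```java
import java.math.BigInteger;

public class FermatTest {
    // Fermat test: if a^(n-1) mod n != 1 for some base a, then n is
    // definitely composite; otherwise n is only "probably prime".
    public static boolean probablyPrime(BigInteger n, BigInteger base) {
        if (n.compareTo(BigInteger.valueOf(2)) < 0) return false;
        return base.modPow(n.subtract(BigInteger.ONE), n).equals(BigInteger.ONE);
    }

    public static void main(String[] args) {
        BigInteger two = BigInteger.valueOf(2);
        System.out.println(probablyPrime(BigInteger.valueOf(101), two)); // true
        System.out.println(probablyPrime(BigInteger.valueOf(100), two)); // false
        // 561 = 3 * 11 * 17 is a Carmichael number: it fools base 2.
        System.out.println(probablyPrime(BigInteger.valueOf(561), two)); // true, but composite!
    }
}
```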

Primality testing algorithms from Wikipedia

Jim Lewis
+12  A: 

If it is sufficient to determine whether or not a number is PROBABLY prime, you can use the built-in isProbablePrime method:

  • If the call returns true, the probability that the number is prime exceeds (1 - 1/2^certainty).
  • If the call returns false, the number is definitely not prime.
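For example, in a small self-contained demo (2^127 - 1 is a known Mersenne prime):

```java
import java.math.BigInteger;

public class ProbablePrimeDemo {
    public static void main(String[] args) {
        // 2^127 - 1, a known Mersenne prime.
        BigInteger m127 = BigInteger.ONE.shiftLeft(127).subtract(BigInteger.ONE);
        // certainty = 50: a composite slips through with probability at most 2^-50.
        System.out.println(m127.isProbablePrime(50));                     // true
        System.out.println(m127.add(BigInteger.ONE).isProbablePrime(50)); // false: 2^127 is even
    }
}
```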
Chi
Hmmm. Am I the only one who finds it extremely disturbing that the javadoc for the probable-prime methods of BigInteger does not include documentation on which algorithm is used?
Jason S
@Jason - why does that disturb you? The purpose of javadoc is to help users of the API know how to use it, not to expose internal implementation details. The details are quite clearly spelled out in the source code.
CPerkins
+7  A: 

You should use a Lucas pseudoprime test and a Rabin-Miller strong pseudoprime test in bases 2 and 3. If all three give a result of probably prime, then for all practical reasons you should consider it so. There are no known counterexamples to this test. If you have to generate a certificate of primality, then you could use an elliptic curve primality prover, but it will be tremendously slower.
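For reference, here is a sketch of just the strong-pseudoprime (Miller-Rabin) half of that combination for a single fixed base; the Lucas part is omitted, the names are my own, and the method assumes an odd n greater than 2:

```java
import java.math.BigInteger;

public class MillerRabin {
    // Strong pseudoprime test to one base: write n - 1 = 2^s * d with d odd,
    // then check that base^d == 1, or base^(2^i * d) == n - 1 for some i < s.
    // Assumes n is odd and greater than 2.
    public static boolean strongProbablePrime(BigInteger n, BigInteger base) {
        BigInteger nMinus1 = n.subtract(BigInteger.ONE);
        int s = nMinus1.getLowestSetBit();    // count of trailing zero bits
        BigInteger d = nMinus1.shiftRight(s); // odd part of n - 1
        BigInteger x = base.modPow(d, n);
        if (x.equals(BigInteger.ONE) || x.equals(nMinus1)) return true;
        for (int i = 1; i < s; i++) {
            x = x.multiply(x).mod(n);
            if (x.equals(nMinus1)) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        // 2047 = 23 * 89 is the smallest strong pseudoprime to base 2,
        // which is exactly why the combined test uses more than one base.
        BigInteger n = BigInteger.valueOf(2047);
        System.out.println(strongProbablePrime(n, BigInteger.valueOf(2))); // true (fooled)
        System.out.println(strongProbablePrime(n, BigInteger.valueOf(3))); // false
    }
}
```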

280Z28
+1 for awesomely interesting links!
chrispy
Note that Java's isProbablePrime(int) uses the Miller-Rabin and Lucas-Lehmer algorithms.
Bart Kiers
@Bart: where is that documented?
Jason S
+1  A: 

In case you don't like the probabilistic methods, there is a deterministic polynomial-time algorithm (the AKS primality test) available as well.

But unless a deity is playing with the odds and pulling your leg or something, you should probably just use the probabilistic methods, which are much faster.

yairchu
sweet, I'd never heard of that test before. Ramanujan would be proud....
Jason S