Welcome to the Java Programming Forums

# Thread: Just for fun - More Primality


A little while back, I posted some code implementing the Miller-Rabin primality test and compared it against the naive primality test (you can read that article here). The results were rather staggering for semi-large numbers. However, I did make the comment that 32 bits really isn't that big.
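
For reference, the naive test from that earlier article boils down to trial division; here's a minimal sketch (my reconstruction, not the exact code from that post), which is fine for 32-bit values but hopeless at the sizes below:

```java
public class NaiveTest
{
	// trial division up to sqrt(n): O(sqrt(n)) divisions per call
	public static boolean isPrimeNaive(long n)
	{
		if (n < 2)
		{
			return false;
		}
		for (long i = 2; i * i <= n; ++i)
		{
			if (n % i == 0)
			{
				return false;
			}
		}
		return true;
	}

	public static void main(String[] args)
	{
		System.out.println(isPrimeNaive(2147483647L)); // 2^31 - 1 is prime -> true
		System.out.println(isPrimeNaive(2147483649L)); // 3 * 715827883 -> false
	}
}
```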

So, in keeping with the spirit of cocking about, I took that code and reimplemented it to use BigIntegers:

```
// these are my vain attempts to speed up the function by caching as much as
// possible :P
private static final BigInteger	B_ONE	= new BigInteger("1");
private static final BigInteger	B_ZERO	= new BigInteger("0");
private static final BigInteger	B_TWO	= new BigInteger("2");
private static final BigInteger	B_THREE	= new BigInteger("3");
private static Random			random	= new Random();

/**
 * Determines if a number is probably prime using the Miller-Rabin primality
 * test.
 *
 * @param number
 *            The number to test for primality.
 * @param iterations
 *            How accurate the test needs to be. Accuracy ~= 1 -
 *            O(4^-iterations)
 * @return false if definitely composite. true if probably prime.
 */
public static boolean millerRabinTest(BigInteger number, int iterations)
{
	if (number.compareTo(B_ONE) <= 0)
	{
		// numbers less than or equal to 1 are not prime
		return false;
	}
	else if (number.getLowestSetBit() >= 1)
	{
		if (number.bitLength() == 2)
		{
			// 2 is prime
			return true;
		}
		// all other even numbers are not prime
		return false;
	}
	else if (number.equals(B_THREE))
	{
		// 3 is prime
		return true;
	}
	// write number - 1 as 2^s * d, with d odd, by factoring powers of 2 out
	// of n-1
	BigInteger nMinusOne = number.subtract(B_ONE);
	int s = nMinusOne.getLowestSetBit();
	BigInteger d = nMinusOne.divide(B_ONE.shiftLeft(s));

	BigInteger nMinusThree = number.subtract(B_THREE);
	for (int i = 1; i <= iterations; ++i)
	{
		// pick a random integer a in the range [2, n-2]
		BigInteger a = new BigInteger(nMinusOne.bitLength(), random).mod(nMinusThree).add(B_TWO);
		// compute x = a^d % number, check to see if x == 1 or x == number-1
		BigInteger x = a.modPow(d, number);
		if (x.equals(B_ONE) || x.equals(nMinusOne))
		{
			continue;
		}
		boolean gotoLoop = false;
		for (int r = 1; r < s; ++r)
		{
			// x = x^2 % n
			x = x.modPow(B_TWO, number);
			if (x.equals(B_ONE))
			{
				return false;
			}
			else if (x.equals(nMinusOne))
			{
				gotoLoop = true;
				break;
			}
		}
		if (!gotoLoop)
		{
			// definitely composite
			return false;
		}
	}
	// probably prime
	return true;
}
```
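
To see the "write n-1 as 2^s * d" step in action on a small number, here's a standalone sketch (the class and variable names here are mine); note that shiftRight(s) is an equivalent, cheaper way to perform the divide(B_ONE.shiftLeft(s)) step:

```java
import java.math.BigInteger;

public class DecompositionDemo
{
	public static void main(String[] args)
	{
		BigInteger n = new BigInteger("221"); // 13 * 17, so composite
		BigInteger nMinusOne = n.subtract(BigInteger.ONE);
		// getLowestSetBit() counts the trailing zero bits, which is exactly s
		int s = nMinusOne.getLowestSetBit();
		// dividing by 2^s is just a right shift by s bits
		BigInteger d = nMinusOne.shiftRight(s);
		System.out.println(nMinusOne + " = 2^" + s + " * " + d); // 220 = 2^2 * 55
		// sanity check: d is odd, and d * 2^s recombines to n-1
		System.out.println(d.testBit(0) && d.shiftLeft(s).equals(nMinusOne)); // true
	}
}
```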

Now, there's no way I'm going to even try to run the naive algorithm against this one in the range I had in mind, so I decided to compare it against Java's built-in BigInteger primality test, which, if I'm guessing correctly, is implemented using the Solovay-Strassen test.

Here's the test rig:
```
public static void main(String[] args)
{
	long start = System.currentTimeMillis();
	String base = "FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF";
	BigInteger startP = new BigInteger(base + "FF800", 16);
	BigInteger endP = new BigInteger(base + "FFFFF", 16);
	BigInteger data[] = new BigInteger[0xFFFF - 0xF800 + 1];
	int counter = 0;
	for (BigInteger i = startP; i.compareTo(endP) <= 0; i = i.add(B_ONE))
	{
		data[counter] = i;
		++counter;
	}
	System.out.println("bit length: " + endP.bitLength());
	int millerIterations = 4;
	int builtinIterations = millerIterations * millerIterations;
	for (int i = 0; i < data.length; ++i)
	{
		data[i].isProbablePrime(builtinIterations);
	}
	long end = System.currentTimeMillis();
	System.out.println("High-range built-in: " + (end - start));
	start = System.currentTimeMillis();
	for (int i = 0; i < data.length; ++i)
	{
		millerRabinTest(data[i], millerIterations);
	}
	end = System.currentTimeMillis();
	System.out.println("High-range miller-rabin: " + (end - start));
	for (int i = 0; i < data.length; ++i)
	{
		boolean builtin = data[i].isProbablePrime(builtinIterations);
		boolean miller = millerRabinTest(data[i], millerIterations);
		if (builtin != miller)
		{
			System.out.println("different for " + data[i].toString() + ". Builtin: " + builtin + ". Miller-rabin: "
					+ miller);
		}
	}
}
```

Yeah, that's right: I tried it with 1024-bit numbers. To get a sense of the scale, the numbers I'm checking primality for are ~1.8e308. That's over 200 orders of magnitude larger than a googol!

So how did these two methods fare?

Surprisingly, both methods came out with roughly the same execution time. This is because, in practice, these primality tests usually need only ~1-2 iterations to determine that a number is composite. Also, the majority of the time in both methods is spent doing basic math on these huge numbers (particularly multiplication, exponentiation, division, and modulo operations), which dominates the test time. The Miller-Rabin test's real advantage is its tighter error bound per iteration (4^-k after k iterations, vs. 2^-k for Solovay-Strassen), so it needs only about half as many iterations for the same certainty, and that only matters in the worst case, which rarely happens for either test.
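
The certainty math can be sketched numerically. The isProbablePrime(certainty) javadoc promises an error probability below 2^-certainty, so the squared iteration count in my test rig actually hands the built-in test a tighter bound than the 4 Miller-Rabin rounds get (this is just a back-of-the-envelope check, not a benchmark):

```java
public class ErrorBounds
{
	public static void main(String[] args)
	{
		int k = 4; // millerIterations from the test rig
		// k Miller-Rabin rounds: a composite slips through with probability <= 4^-k
		double millerBound = Math.pow(4, -k);
		// isProbablePrime(k*k): documented error bound is 2^-(k*k)
		double builtinBound = Math.pow(2, -(k * k));
		System.out.println("miller-rabin, 4 rounds: " + millerBound);  // 0.00390625
		System.out.println("isProbablePrime(16):    " + builtinBound); // 1.52587890625E-5
	}
}
```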

However, I did find that the Miller-Rabin test was marginally faster (~1-2 seconds on average), as well as being more stable (the time taken between runs had a smaller standard deviation). Here were my outputs (times are in milliseconds):
```
bit length: 1024
High-range built-in: 11484
High-range miller-rabin: 10686
```
There were no differences output-wise between the two tests in terms of determining if a number was prime or composite.

So what now? Where else can speed be improved? Honestly, the only place I can think of for major improvements is inside the BigInteger class. I'm not sure how it implements many of its arithmetic operations, but if multiplication, division, modulo, and exponentiation are not implemented using an FFT (or a similar fast multiplication method), there's roughly an order of magnitude that could be improved for numbers this large. However, I don't think I'm going to go about writing my own BigInteger class for Java (at least not any time soon).
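
As a rough illustration of why the big-number arithmetic dominates, even a single 1024-bit modPow is a non-trivial amount of work. Here's a standalone sketch (the prime and base are randomly generated, so this is an assumed setup, and the timing will vary by machine); as a side effect it also demonstrates Fermat's little theorem, which is the idea Miller-Rabin refines:

```java
import java.math.BigInteger;
import java.util.Random;

public class ModPowCost
{
	public static void main(String[] args)
	{
		Random rnd = new Random();
		BigInteger p = BigInteger.probablePrime(1024, rnd);
		// a random base in [2, 2^1023 + 1], guaranteed smaller than the 1024-bit prime
		BigInteger a = new BigInteger(1023, rnd).add(BigInteger.valueOf(2));
		long start = System.nanoTime();
		BigInteger x = a.modPow(p.subtract(BigInteger.ONE), p);
		long micros = (System.nanoTime() - start) / 1000;
		System.out.println("one 1024-bit modPow: " + micros + " us");
		// Fermat's little theorem: a^(p-1) mod p == 1 when p is prime and p doesn't divide a
		System.out.println(x.equals(BigInteger.ONE)); // true
	}
}
```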

You're free to use this code anywhere you want, though I'm not responsible for any problems it may have. I would like to hear about any problems, though. Maybe I'll get around to fixing them.

Happy Coding