I am just trying to implement a simple RNG in JS.

What's happening is that JavaScript evaluates 119106029 * 1103515245 as 131435318772912110 rather than 131435318772912105. We know the result is wrong, since the product of two odd numbers can never be even.

Anyone know what's up? I just want a reliable, repeatable RNG, and because of these incorrect values I can't get my results to match up with my C implementation of the same thing.
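
For reference, a minimal reproduction (in a browser console; the exact digits printed may vary by engine, but the parity problem shows up either way):

console.log(119106029 * 1103515245);        // 131435318772912110, expected ...105
console.log(119106029 % 2, 1103515245 % 2); // 1 1: two odd factors, yet the product prints as even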

+3  A: 

When an integer in JavaScript is too big to fit in a 32-bit value, some browsers will convert it to floating point. Since the value of a floating-point number is only accurate to a limited precision, some rounding can occur on big values.
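
For instance, a 64-bit double can represent every integer exactly only up to 2^53; beyond that, adjacent integers collapse into the same stored value, and the product in the question is well past that limit:

Math.pow(2, 53) === Math.pow(2, 53) + 1;   // true: 2^53 + 1 rounds back down to 2^53
119106029 * 1103515245 > Math.pow(2, 53);  // true: this product is in the inexact range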

Scharrels
The result is wrong only in the last couple of bits, though. It would make sense if it just truncated the upper bits, but all I want is the lower 32 bits of the result anyway. How do I get those?
Steven Lu
@Steven: JavaScript doesn't really have an integer type; in this case you're seeing precision loss in floating-point numbers. And those retain the "upper part" of the number, not the low bits (like integer multiplication would).
Joey
@Johannes: Okay, that makes a lot of sense. Wonder how I can work around this...
Steven Lu
+6  A: 

Per the ECMAScript standard, all numbers in JavaScript are (64-bit IEEE 754) floating-point numbers.

However, all 32-bit integers can be represented exactly as floating-point numbers. You can force a result to 32 bits by using the appropriate bitwise operator, like this:

x = (a * b) >>> 0;  // force to unsigned int32
x = (a * b) | 0;    // force to signed int32

Weird, but that's the standard.
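
One caveat: for the numbers in the question, precision is lost before the bitwise operator ever runs, because the exact product needs about 57 bits. So (a * b) >>> 0 yields the low 32 bits of the already-rounded double, not of the true product. A sketch of one workaround (my own, not from the standard) is to split one factor into 16-bit halves so every intermediate product stays below 2^53, then combine the pieces modulo 2^32:

function mul32(a, b) {
  // Split a into 16-bit halves; each partial product then fits
  // exactly in a double (at most about 2^48).
  var aHi = a >>> 16;
  var aLo = a & 0xffff;
  // (aHi * 2^16 + aLo) * b  ==  aHi * b * 2^16 + aLo * b   (mod 2^32)
  return (((aHi * b) << 16) + aLo * b) >>> 0;
}

mul32(119106029, 1103515245);  // 3731247081: the true low 32 bits, and odd, as it must be

Engines that support Math.imul can get the same bits with Math.imul(a, b) >>> 0.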

(Incidentally, this rounding behavior is one of the most frequently reported "bugs" against Firefox's JavaScript engine. Looks like it's been reported 3 times so far this year...)

As for reproducible random numbers in JavaScript, the V8 benchmark uses this:

// To make the benchmark results predictable, we replace Math.random
// with a 100% deterministic alternative.
Math.random = (function() {
  var seed = 49734321;
  return function() {
    // Robert Jenkins' 32 bit integer hash function.
    seed = ((seed + 0x7ed55d16) + (seed << 12))  & 0xffffffff;
    seed = ((seed ^ 0xc761c23c) ^ (seed >>> 19)) & 0xffffffff;
    seed = ((seed + 0x165667b1) + (seed << 5))   & 0xffffffff;
    seed = ((seed + 0xd3a2646c) ^ (seed << 9))   & 0xffffffff;
    seed = ((seed + 0xfd7046c5) + (seed << 3))   & 0xffffffff;
    seed = ((seed ^ 0xb55a4f09) ^ (seed >>> 16)) & 0xffffffff;
    return (seed & 0xfffffff) / 0x10000000;
  };
})();
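
Since every step is masked back to 32 bits with & 0xffffffff (and >>> keeps the shifts unsigned), the sequence can be reproduced bit-for-bit in C using uint32_t arithmetic. A quick way to capture reference values for such a comparison (my own example, not part of the benchmark):

// With the replacement installed, the sequence depends only on the seed,
// so repeated runs produce identical output to check against a C port:
console.log(Math.random(), Math.random(), Math.random());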
Jason Orendorff
+1  A: 

If the multiplication is done in C/C++ with double, the result ends in ...112 instead of the correct ...105. If it is performed with long double, the result comes out as expected (...105). So it looks like the JavaScript interpreter converts the numbers to 8-byte doubles internally, does the calculation, and applies some rounding of its own that here lands marginally closer to the true value than the standard C/C++ double calculation.

GCC 4.5:

#include <stdio.h>

int main(void)
{
    /* On x86, long double has a 64-bit mantissa, which is enough
       to hold the 57-bit product exactly. */
    long double a = 119106029;
    long double b = 1103515245;
    long double c = a * b;
    printf("%.0Lf\n", c);

    return 0;
}

Result:

131435318772912105

Expected:

131435318772912105

So I don't see a way to do this in JavaScript without the aid of a bignum library (if one exists).
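
One thing worth checking before reaching for a bignum library, though: the JavaScript result and the C double result may be the same IEEE 754 value, just printed differently. JavaScript prints the shortest decimal string that round-trips to the stored double, while printf prints the stored value's exact digits; toFixed can show the latter from JavaScript:

var p = 119106029 * 1103515245;
String(p);     // "131435318772912110": shortest string that round-trips
p.toFixed(0);  // "131435318772912112": the stored double's exact digits, matching C's double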

Regards

rbo

rubber boots