I have read most of the posts on here regarding floating point, and I understand the basic underlying issue: with IEEE 754 (and just by the nature of storing numbers in binary), certain fractions cannot be represented exactly. I am trying to figure out the following: if both Python and JavaScript use the IEEE 754 standard, why is it that executing the following in Python

.1 + .1

results in 0.20000000000000001 (which is to be expected),

whereas in JavaScript (at least in Chrome and Firefox) the answer is 0.2.

However, performing

.1 + .2

in both languages results in 0.30000000000000004.

In addition, executing var a = 0.3; in JavaScript and printing a results in 0.3,

whereas doing a = 0.3 in Python results in 0.29999999999999999.
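
For reference, here is roughly the Python side of what I am seeing (a Python 2.6 interactive session; exact output may vary by version):

>>> .1 + .1
0.20000000000000001
>>> .1 + .2
0.30000000000000004
>>> a = 0.3
>>> a
0.29999999999999999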

I would like to understand the reason for this difference in behavior.

In addition, many of the posts on SO link to a JavaScript port of Java's BigDecimal, but the link is dead. Does anyone have a copy?

A: 

I would like to understand the reason for this difference in behavior.

  1. They're different languages.

  2. They use different underlying packages.

  3. They have different implementations.

When you say "Python" -- which implementation are you talking about? CPython, Jython, IronPython? Did you compare each of those?

The JavaScript folks seem to handle the display of repeating binary fractions differently from the way the Python folks do.

Sometimes JavaScript quietly suppresses the error bits at the end; sometimes it doesn't.
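
For instance, you can get both displays from the same stored bits just by choosing a different formatting precision (a Python sketch; the particular format strings are only illustrative):

>>> x = .1 + .1
>>> '%.17g' % x    # enough digits to expose the error bits
'0.20000000000000001'
>>> '%.12g' % x    # fewer digits -- the error bits round away
'0.2'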

That's the reason.

You have the source code for both, so if you want to know more, you can dig in. Knowing the source code doesn't change much in practice, however.

S.Lott
+3  A: 

executing var a = 0.3; in JavaScript and printing a results in 0.3

They might both have the same IEEE 754 underlying representation, but that doesn't mean they're forced to print the same way. It looks like JavaScript is rounding the output when the difference is small enough.

With floating point numbers, the important part is how the binary data is structured, not what it shows on the screen.
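
One way to check that (a sketch using Python's float.hex(), available since 2.6) is to compare the stored bit patterns rather than the printed decimals:

>>> (.1 + .1).hex() == (0.2).hex()   # same underlying double, printed differently
True
>>> (.1 + .2).hex()                  # a genuinely different double than 0.3
'0x1.3333333333334p-2'
>>> (0.3).hex()
'0x1.3333333333333p-2'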

Stephen
+6  A: 

doing a = 0.3 in Python results in 0.29999999999999999

Not quite -- watch:

>>> a = 0.3
>>> print a
0.3
>>> a
0.29999999999999999

As you see, *print*ing a does show 0.3 -- because print uses str(), which by default rounds to 12 significant digits, while typing an expression (here a is a single-variable expression) at the prompt shows its repr(), which uses 17 significant digits (enough to reveal floating point's intrinsic limitations).
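
Concretely (same Python 2.6-era session; from 2.7 on, repr itself switched to the shortest string that round-trips, so both forms show 0.3):

>>> str(a), repr(a)
('0.3', '0.29999999999999999')
>>> '%.12g' % a, '%.17g' % a   # roughly the formats str and repr use here
('0.3', '0.29999999999999999')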

JavaScript may have slightly different rounding rules about how to display numbers, and the exact details of that rounding are enough to explain the differences you observe. Note, for example (on a Chrome JavaScript console):

> (1 + .1) * 1000000000
  1100000000
> (1 + .1) * 100000000000000
  110000000000000.02

See? If you manage to see more digits, the anomalies (which are inevitably there) become visible too.
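
If you want to see every digit that is actually stored, the decimal module can show the exact value of a binary float (Decimal.from_float needs 2.7 or later; this is just an illustrative transcript):

>>> from decimal import Decimal
>>> Decimal.from_float(0.3)
Decimal('0.299999999999999988897769753748434595763683319091796875')
>>> Decimal.from_float(0.1 + 0.2)
Decimal('0.3000000000000000444089209850062616169452667236328125')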

Alex Martelli
As always, a clear answer!
Steve
Thanks! That helps clear it up.
jeffmax329