I have read most of the posts on here about floating point, and I understand the basic underlying issue: with IEEE 754 (and simply by the nature of storing numbers in binary), certain fractions cannot be represented exactly. What I am trying to figure out is this: if both Python and JavaScript use the IEEE 754 standard, why does executing the following in Python
.1 + .1
result in 0.20000000000000001 (which is to be expected),
whereas in JavaScript (in at least Chrome and Firefox) the answer is 0.2?
However, performing
.1 + .2
in both languages results in 0.30000000000000004.
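To make the comparison concrete, here is how I inspect the exact doubles involved, using Python's standard decimal module (a minimal sketch; it assumes Python 2.7+ or 3.x, where Decimal() accepts a float directly):

```python
from decimal import Decimal

# Show the exact binary64 value behind each literal / expression.
print(Decimal(0.1))        # 0.1000000000000000055511151231257827...
print(Decimal(0.1 + 0.1))  # 0.2000000000000000111022302462515654...
print(Decimal(0.2))        # 0.2000000000000000111022302462515654... (same double)
print(Decimal(0.1 + 0.2))  # 0.3000000000000000444089209850062616...
print(Decimal(0.3))        # 0.2999999999999999888977697537484345... (different double)
```

So .1 + .1 rounds to exactly the same double as the literal 0.2, while .1 + .2 does not land on the same double as 0.3.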
In addition, executing var a = 0.3; in JavaScript and printing a results in 0.3,
whereas doing a = 0.3 in Python results in 0.29999999999999999.
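A related wrinkle: what Python shows seems to depend on whether the value goes through repr() or str(). A minimal sketch (the long form is what I would expect from interpreters such as 2.6, whose repr() prints 17 significant digits; 2.7 and 3.1+ use the shortest string that round-trips and print 0.3 for both):

```python
x = 0.3

# The interactive prompt displays repr() of a result, not str().
print(repr(x))  # 0.29999999999999999 on interpreters with 17-digit repr (e.g. 2.6)
print(str(x))   # 0.3
```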
I would like to understand the reason for this difference in behavior.
Also, many of the posts on SO link to a JavaScript port of Java's BigDecimal, but the link is dead. Does anyone have a copy?