Why is 9999999999999999 converted to 10000000000000000 in JavaScript?

Solution 1:

JavaScript doesn't have integers, only 64-bit floats, and you've run out of floating-point precision.
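You can see the boundary directly in Node.js or a browser console; a quick sketch (`Number.MAX_SAFE_INTEGER`, `Number.isSafeInteger`, and `BigInt` are all standard JavaScript features):

```javascript
// Both literals round to the same 64-bit double, so they compare equal.
console.log(9999999999999999 === 10000000000000000); // true

// The largest integer doubles can represent without gaps: 2^53 - 1.
console.log(Number.MAX_SAFE_INTEGER);                // 9007199254740991
console.log(Number.isSafeInteger(9999999999999999)); // false

// BigInt keeps arbitrary-precision integers exact.
console.log(9999999999999999n + 1n);                 // 10000000000000000n
```

If you actually need exact integers past 2^53, use `BigInt` (the `n` suffix) rather than `Number`.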

See similar issue in Java: why is the Double.parseDouble making 9999999999999999 to 10000000000000000?

Solution 2:

  1. JavaScript only has floating point numbers, no integers.

  2. Read What Every Computer Scientist Should Know About Floating-Point Arithmetic.

    Summary: floating-point numbers have only limited precision; past roughly 15 significant digits you'll get rounding.
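A minimal sketch of that roughly-15-digit limit:

```javascript
// 15 nines fit within a double's 53-bit significand, so the value is exact:
console.log(999999999999999);  // 999999999999999

// 16 nines would need 54 bits, so the literal is rounded:
console.log(9999999999999999); // 10000000000000000

// The limited precision also shows up in ordinary decimal arithmetic:
console.log((0.1 + 0.2).toPrecision(17)); // "0.30000000000000004"
```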

Solution 3:

9999999999999999 is treated internally in JavaScript as a floating-point number. It cannot be accurately represented in IEEE 754 double precision because it would require 54 bits of precision: log2(9999999999999999) = 53.150849512..., and since fractional bits do not exist, this must be rounded up to 54. IEEE 754 provides only 53 bits (1 implicit bit + 52 explicitly stored bits of the mantissa), one bit less. Hence the number simply gets rounded.

Since only one bit is lost in this case, even 54-bit numbers are still exactly representable: the bit that gets dropped is their lowest bit, which is already 0. Odd 54-bit numbers are rounded to the nearest representable value, which under the default round-to-nearest-even mode of IEEE 754 is an even 53-bit number times two.
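The even/odd behavior described above can be checked directly at the 2^53 boundary, where consecutive integers first stop being representable (a sketch in Node.js):

```javascript
// 9999999999999999 needs 54 bits: its base-2 logarithm is just over 53.
console.log(Math.log2(9999999999999999)); // ~53.15

const p = 2 ** 53;  // 9007199254740992, the first 54-bit integer

// Odd 54-bit value: not representable, ties round to the even neighbor.
console.log(p + 1); // 9007199254740992

// Even 54-bit value: lowest bit is 0, so it is exact.
console.log(p + 2); // 9007199254740994
```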