Hi! Does anyone know how it is possible that this simple code

…

var x = 2.05 / 0.05

console.log(x)

…

outputs 40.99999999999999 on freecodecamp and I have to correct it with Math.round?

This is a so-called floating-point error. Because of how floating-point numbers are represented internally, calculations with them can be imprecise.
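A minimal sketch of what the question describes, using the `Math.round` fix from the original post (the tolerance-based comparison is a common alternative, not something from this thread):

```javascript
// Both 2.05 and 0.05 are stored as binary fractions, and neither can be
// represented exactly in base 2, so the quotient comes out slightly low.
var x = 2.05 / 0.05;
console.log(x); // 40.99999999999999 rather than 41

// Fix from the question: round to the nearest integer.
console.log(Math.round(x)); // 41

// Common alternative: compare with a small tolerance instead of ===.
function nearlyEqual(a, b, tol) {
  tol = tol || 1e-9; // tolerance chosen for illustration
  return Math.abs(a - b) < tol;
}
console.log(nearlyEqual(x, 41)); // true
```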

It may be this kind of error. It’s just that the calculation is so simple that even I can do it in my head, and the calculator app on my phone gets it right.

If such mistakes can happen, I wonder how we can trust complex scientific results calculated on computers.

Here is a video you can check out.

Edit: BTW, this problem is not specific to JS, but here is some more JS specific info.
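One JavaScript-specific technique worth knowing (my own sketch, not from the linked info): do the arithmetic on scaled integers, where IEEE 754 doubles are exact up to `Number.MAX_SAFE_INTEGER`.

```javascript
// Scale both operands to integers first; integer arithmetic on doubles
// is exact as long as the values stay within Number.MAX_SAFE_INTEGER.
var dividend = Math.round(2.05 * 100); // 205
var divisor  = Math.round(0.05 * 100); // 5
console.log(dividend / divisor); // 41, exactly
```

This is why money is often handled in cents rather than in fractional dollars.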


Same issue with decimal numbers that you might calculate in your head or on paper. With decimal, the issue normally arises when dividing by factors of 3 (or 7, but that’s a given): what is 10 ÷ 3†? Lots of measurement systems use 12 as a base in an attempt to have the fewest possible ways this can happen (12 is cleanly divisible by 2, 3, 4 and 6; 10 is cleanly divisible only by 2 and 5).

In this particular case you have floating point, where the base is 2. You’re emulating base 10, but the calculation is still done in base 2 underneath.
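You can see the base-2 emulation directly: `toFixed` with enough digits reveals the binary approximation actually stored for a decimal literal.

```javascript
// 0.05 has no finite base-2 representation, so the stored double is an
// approximation (e.g. 0.05000000000000000278 when printed to 20 places).
console.log((0.05).toFixed(20));
console.log((2.05).toFixed(20));

// By contrast, values built from powers of two are stored exactly:
console.log((0.25).toFixed(20)); // 0.25000000000000000000
```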

† possibly a trick question, because ideally the answer can simply be left as 10/3, or you can use notation to indicate that the answer is an infinitely repeating sequence. With real physical quantities (money, for example) this doesn’t quite work. The numbers produced by calculations on a computer are real physical quantities as well: they are finite, because there’s a maximum amount of memory they can take up.