Numbers give unexpected results (decimals)

Hi all,

I am currently working on a function that sums the values in a multidimensional array.
I have the following code:

  cidSum() {
    // Add the numeric amount from each [name, amount] pair to the running total
    for (let arr in this._cid) {
      console.log(this._cid[arr][1]);
      this._cidSum += this._cid[arr][1];
      console.log(this._cidSum);
    }
  }

Initial values of this._cid and this._cidSum:

this._cidSum = 0;
this._cid = [["PENNY", 1.01], ["NICKEL", 2.05], ["DIME", 3.1], ["QUARTER", 4.25], ["ONE", 90], ["FIVE", 55], ["TEN", 20], ["TWENTY", 60], ["ONE HUNDRED", 100]];

These are the results of the console logs, where the first line of each pair is the amount to be added and the second line is the new total after adding it:

1.01
1.01
2.05
3.0599999999999996
3.1
6.16
4.25
10.41
90
100.41
55
155.41
20
175.41
60
235.41
100
335.40999999999997

My problem is with 3.0599999999999996 and 335.40999999999997.

  1. What is causing these numbers to deviate from one or two decimal places, when the values being added always have at most two decimal places?

  2. Is there an efficient way to solve this? (Currently I would use Math.round, roughly as in the sketch below.)
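
By rounding I mean something like this sketch (the same cidSum as above, with the running total rounded to two decimals after every addition):

  cidSum() {
    for (let arr in this._cid) {
      this._cidSum += this._cid[arr][1];
      // Keep the running total at two decimals after each addition
      this._cidSum = Math.round(this._cidSum * 100) / 100;
    }
  }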

Thank you for your help,
Tom

A simple explanation:
http://adripofjavascript.com/blog/drips/avoiding-problems-with-decimal-math-in-javascript.html

The article suggests using integers, but when dealing with money you might just as well round to two decimals.
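
For example, here is a rough sketch of the integer route, using the cid array from the original post. The conversion to cents also uses Math.round so the scaled values come out as exact integers:

  const cid = [["PENNY", 1.01], ["NICKEL", 2.05], ["DIME", 3.1], ["QUARTER", 4.25],
               ["ONE", 90], ["FIVE", 55], ["TEN", 20], ["TWENTY", 60], ["ONE HUNDRED", 100]];

  // Sum in whole cents so every intermediate value is an exact integer,
  // then convert back to dollars once at the end.
  const totalCents = cid.reduce((sum, [, amount]) => sum + Math.round(amount * 100), 0);
  const total = totalCents / 100;

  console.log(total); // 335.41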

2 Likes

Thank you for the article; that explains a lot!

1 Like