OK, why do I say this when they pass the tests …
While refactoring some of my code, looking at other ways of doing the smallest common multiple problem, and comparing results, I realized both of these solutions give wrong results for certain ranges of numbers.
e.g. if you do smallestCommons([24,43]) it gives 1228641933756837400
… I worked it out to be 409547311252279200
In fact the correct answer is the same for the range [25,43] … 409547311252279200 … and that range these solutions do get right.
I also tried another combination: smallestCommons([100,110]) gives 2954551107310803000 (5 times bigger than it should be) … I worked it out to be 590910221462160600 …
They do the range [100,109] correctly.
I checked what I believe to be the correct results against two versions of smallest-common-multiple programs I wrote, and also separately in Excel.
Now I'm presuming there are more ranges these will fail on … which brings me to another question.
How do you adequately test algorithms like this, which can produce ridiculously big numbers and could end up being used in the real world for important tasks?
Now I know it's not the end of the world that these are, in my eyes, faulty solutions (a pity, as they are short, clear and fast) … The main thing is: how do you test an algorithm like this so you can be guaranteed it works under all conditions … and is that even possible for something like this?
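One way to sidestep the precision problem entirely (a minimal sketch of my own, not either of the posted solutions) is to compute the range LCM with BigInt, using the identity lcm(a, b) = a·b / gcd(a, b) so every intermediate value stays exact. It reproduces the values I worked out by hand:

```javascript
// Euclid's algorithm on BigInts — exact for arbitrarily large integers.
function gcd(a, b) {
  while (b) [a, b] = [b, a % b];
  return a;
}

// Smallest common multiple of every integer in the inclusive range,
// computed entirely in BigInt so nothing is rounded.
function smallestCommonsExact(range) {
  const lo = BigInt(Math.min(...range));
  const hi = BigInt(Math.max(...range));
  let result = 1n;
  for (let n = lo; n <= hi; n++) {
    result = (result * n) / gcd(result, n); // lcm(result, n)
  }
  return result;
}

console.log(smallestCommonsExact([24, 43]).toString());  // 409547311252279200
console.log(smallestCommonsExact([100, 110]).toString()); // 590910221462160600
```

An exact version like this also doubles as a test oracle: run the fast floating-point solution and the BigInt one over many random ranges and flag any range where they disagree.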
var x = Number.MAX_SAFE_INTEGER;
var y = x + 1000;
console.log("x = " + x);
// x = 9007199254740991
console.log("y = " + y);
// y = 9007199254741992
console.log("difference = " + (y-x));
// difference = 1001
This is what I get when I run it in CodePen on Windows 10 with Chrome. Above a certain point, integers can't be trusted, and that point is in the high 16 digits. Your number 1228641933756837400 is 19 digits.
Any integer calculation that high in JS can't be trusted. JS is not made for exact mathematical calculations at that scale. If you want something that exact, you've either got to use a library, design your own solution, or use a language that allows for more precision with large numbers.
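Worth noting: newer JS engines (Chrome 67+) ship BigInt natively, which makes the same demo exact with no library at all:

```javascript
// Same experiment as above, but in BigInt — no rounding past 2^53.
var x = BigInt(Number.MAX_SAFE_INTEGER); // 9007199254740991n
var y = x + 1000n;
console.log("y = " + y);
// y = 9007199254741991   (exact, not ...992)
console.log("difference = " + (y - x));
// difference = 1000      (exact, not 1001)
```

The trade-off is that BigInt arithmetic is slower than plain Numbers and the two types can't be mixed in one expression without an explicit conversion.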