Hello,
I was wondering why my code doesn't solve the problem correctly. The exercise: find the least common multiple of two numbers (given as arr) and of all the numbers in between them.
Here is the code.
function smallestCommons(arr) {
  let tween = [];
  if (arr[0] > arr[1]) {
    for (let i = arr[1]; i <= arr[0]; i++) {
      tween = tween.push(i);
    }
    // made new array of all the numbers from arr[1] to arr[0]
  }
  let n = 0;
  var a = 0;
  for (var n = (arr[0] * arr[1]) + 1; n < (arr[0] * 1000); n++) {
    for (var a = 0; a < tween.length; a++) {
      if (n % arr[0] === 0 && n % arr[1] === 0 && n % tween[a] === 0) {
        return n;
      }
    }
  }
  return n;
  // Start at both numbers multiplied together + 1, go up to the bigger number x 1000,
  // just for good measure to get a test case and make sure I'm on the right track.
  // Then check every number to make sure it divides by both starting numbers as well
  // as all numbers in the new array tween, and return that number if it checks out.
}
smallestCommons([1,5]);
I realize there is the whole set of cases where arr[1] > arr[0], but I can't even get this approach to work for any of the test cases. Also, does anyone know why I can't use console.log() anymore to print stuff to the console?
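For comparison, here is a minimal sketch of the same brute-force idea, written so it handles either ordering of arr and actually fills the range array (Array.prototype.push returns the new length of the array, so assigning its result back to the variable replaces the array with a number). The function name smallestCommonsSketch is just for illustration, and this is one possible way to write it, not the exercise's official solution:

```js
// A sketch of the same brute-force idea: build the full range with
// Math.min/Math.max so the input order doesn't matter, then count up
// until a candidate is divisible by every value in the range.
function smallestCommonsSketch(arr) {
  const lo = Math.min(arr[0], arr[1]);
  const hi = Math.max(arr[0], arr[1]);

  // Collect every number in the range [lo, hi].
  const range = [];
  for (let i = lo; i <= hi; i++) {
    range.push(i); // push mutates range; its return value (the new length) is not needed
  }

  // Step by `hi` so every candidate is already a multiple of the largest number.
  for (let n = hi; ; n += hi) {
    if (range.every(x => n % x === 0)) {
      return n; // first number divisible by the whole range
    }
  }
}

console.log(smallestCommonsSketch([1, 5])); // 60
```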
Thanks!