Smallest Common Multiple: final condition seems to be incorrect

Tell us what’s happening:

Your code so far


function smallestCommons(arr) {
  // Note: sort() without a comparator compares values as strings;
  // this happens to work for [5, 1] but not for all numeric input.
  var orderedArr = arr.sort();
  var arrNum = [];
  var smallestCommonMultiple = 0;
  // Build the full range of numbers between the two endpoints.
  for (var i = orderedArr[0]; i <= orderedArr[arr.length - 1]; i++) {
    arrNum.push(i);
  }
  // The product of the whole range is an upper bound for the answer.
  var biggestCommonMultiple = arrNum.reduce(function (a, b) {
    return a * b;
  });
  // Intended: find the smallest j divisible by every number in the range.
  // Actual: each outer pass overwrites smallestCommonMultiple with the
  // largest multiple of arrNum[k] <= the product, so the last pass wins.
  for (var k = 0; k < arrNum.length; k++) {
    for (var j = 1; j <= biggestCommonMultiple; j++) {
      if (j % arrNum[k] === 0) {
        smallestCommonMultiple = j;
      }
    }
  }
  console.log(smallestCommonMultiple);
}


smallestCommons([5,1]);
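One thing I noticed while posting this: sort() without a comparator compares values as strings, so an input like [2, 10] would order the endpoints wrongly. I think a numeric comparator is needed, something like:

var orderedArr = arr.slice().sort(function (a, b) {
  return a - b; // compare numerically instead of lexicographically
});

(slice() is just there so the original array isn't mutated.)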

Your browser information:

User Agent is: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.99 Safari/537.36.

Link to the challenge:
https://learn.freecodecamp.org/javascript-algorithms-and-data-structures/intermediate-algorithm-scripting/smallest-common-multiple

Hi everyone,
I think I was doing everything correctly until the last step. As written, the nested loop effectively just returns the product of all the numbers in the given range, which is obviously not the smallest common multiple. How should I change this last condition to make it work?
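My best guess at the fix is to flip the logic around: test each candidate j against every number in the range at once, and stop at the first j that all of them divide. Reusing arrNum, biggestCommonMultiple, and smallestCommonMultiple from the function above, roughly like this sketch (untested):

for (var j = 1; j <= biggestCommonMultiple; j++) {
  // Does every number in the range divide j evenly?
  var divisibleByAll = arrNum.every(function (num) {
    return j % num === 0;
  });
  if (divisibleByAll) {
    smallestCommonMultiple = j;
    break; // the first hit is the smallest common multiple
  }
}

Does that look like the right direction?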

Thanks in advance.

The last line should of course be return, not console.log.