Tell us what’s happening:
can anyone explain why the code does not work for any arr whose minimum value is greater than 1?

The for loop never evaluates the remainder of the division by min, and I don't know why. According to the code, the remainder of the division by min should be the first one evaluated.

Your code so far

function smallestCommons(arr) {
  var max = Math.max(arr[0], arr[1]);
  var min = Math.min(arr[0], arr[1]);
  var i = max;
  for (var j = min; j < max + 1; j++) {
    if (i % j !== 0) {
      i += 1;
      j = min;
    }
  }
  return i;
}
smallestCommons([23, 18]);

Your browser information:

Your Browser User Agent is: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/62.0.3202.94 Safari/537.36.

Firstly some minor style points which significantly help readability:

Use a while loop rather than "resetting" the value of j. It is more idiomatic and makes it easier to see what you're trying to do.

Use more descriptive variable names, such as candidateValue instead of i. This helps readers (and perhaps your future self) reason about the code more easily.

In my view, the only good time to use single-letter variables is for standard, idiomatic loop indices or in mathematical expressions.

Alrighty then! The reason your code "skips" the j = min case is that at the end of each for loop iteration, the loop's update expression (j++) runs.

So essentially what’s happening is the following:

j = min; j++;

so j becomes min + 1 before the next divisibility check, and j = min is never actually tested after a reset.
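You can see the reset-then-increment behaviour in isolation with a minimal sketch (the bounds 2 and 5 and the one-shot reset here are purely illustrative, not from the original code):

```javascript
var min = 2;
var seen = [];
var resetDone = false;
for (var j = min; j < 5; j++) {
  seen.push(j);
  if (!resetDone && j === 4) {
    j = min;        // "reset" j inside the body, as in the original code
    resetDone = true;
  }
}
// seen is [2, 3, 4, 3, 4]: after the reset, j++ fires before the next
// iteration, so min (2) is never revisited
```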

Use a while loop to avoid this: your code will be neater and more elegant, and the problem will be gone. (A hacky and bad alternative would be to change the reset to j = min - 1, but please try the other way.)
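For reference, a while-loop version of the same brute-force search might look like this (a sketch only; candidateValue and divisor are illustrative names, not part of your original code):

```javascript
function smallestCommons(arr) {
  var max = Math.max(arr[0], arr[1]);
  var min = Math.min(arr[0], arr[1]);
  var candidateValue = max;
  var divisor = min;
  while (divisor <= max) {
    if (candidateValue % divisor === 0) {
      divisor += 1;          // divides evenly, move on to the next divisor
    } else {
      candidateValue += 1;   // try the next candidate...
      divisor = min;         // ...and restart the divisor checks at min
    }
  }
  // loop exits only when every value in min..max divides candidateValue
  return candidateValue;
}

smallestCommons([1, 5]); // 60, the smallest number divisible by 1..5
```

Here the "restart at min" is explicit, and there is no hidden update expression to fight against.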

Hint: consider reading about break and continue and how they work.
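To make that hint concrete: continue also jumps to the for loop's update expression, so j++ still runs before the next iteration. A tiny sketch (the hits array and bounds are just illustrative):

```javascript
var hits = [];
for (var j = 0; j < 3; j++) {
  if (j === 1) continue; // skips the push below, but j++ still executes
  hits.push(j);
}
// hits is [0, 2]
```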