Math.min() method

var arr = [4,6,3,7,6,10,8,11,8,6]

var min1 = i => Math.min(...i)
var min2 = i => Math.min(i)

console.log([min1(arr), min2(...arr)])  // returns [3, 4] ?

What’s going on here? How come min1(arr) returns 3, as expected, but min2(...arr) returns 4?

Ooooo this is a fun one.

The “gotcha” here is in your min2(...arr) call. ...arr spreads the values of arr, which are then passed to your min2 function. If we write the call out, it looks like:

min2(4, 6, 3, 7, 6, 10, 8, 11, 8, 6)
Notice how it’s not an array being passed in - it’s ten separate arguments. min2 only declares one parameter, i, so it binds the first argument (4) and ignores the rest. That value gets passed to your Math.min(i) call, so you end up with Math.min(4), which is 4.


Ahhhh as opposed to min1 which is set to handle an array then spread it itself? As a subroutine of the function?

Correct - min1 is given the array as its first argument, and it spreads that array when passing it to Math.min().

Math.min() is variadic, meaning it can handle any number of arguments, so using spread syntax there isn’t an issue.


Understood. Thanks a ton!