Hi. I am having some difficulty understanding this challenge.

In the example, it is my understanding that ourMax is equal to 9 and ourMin is equal to 1.

// Example

```
function ourRandomRange(ourMin, ourMax) {
  return Math.floor(Math.random() * (ourMax - ourMin + 1)) + ourMin;
}
ourRandomRange(1, 9);
```

Now, in the return statement, the section to the right of the multiplication symbol reads like this to me: (9 - 1 + 1) + 1, which to me equates to 10.

Math.floor rounds down to the nearest whole integer, correct?

Math.random() gives a float between 0 and 0.9999…, correct?

So does Math.floor round down to zero?

If so, this should result in 0 * 10, which equates to zero.

But the console logs 1.

I really don’t get it.

Can someone please clear this up for me?