Tell us what’s happening:
In this tutorial, the overview says:
(decimal numbers are not integers)
It gives an example of an array of “real numbers” as [-3, 4.8, 5, 3, -3.2]
4.8 looks like a decimal to me.
What kinds of decimals are not integers in JavaScript?
This page says there is no integer type for Number: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number
I can’t find any documentation describing what JavaScript considers a “real number”, an integer, and a decimal, so I can’t sort out what each of these terms is defined to mean.
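For what it’s worth, JavaScript’s Number type stores every number as a double-precision float, but it still exposes a runtime check for whether a value has no fractional part: Number.isInteger. This snippet is just an illustration of that distinction, not part of the challenge:

```javascript
// All of these are the same Number type under the hood,
// but Number.isInteger reports whether the value is a whole number.
const arr = [-3, 4.8, 5, 3, -3.2];

console.log(Number.isInteger(5));   // true: no fractional part
console.log(Number.isInteger(4.8)); // false: has a fractional part

// Keep only the whole numbers from the example array.
console.log(arr.filter(Number.isInteger)); // [-3, 5, 3]
```

So in the challenge’s terms, “real numbers” covers the whole array, and the “decimals” it excludes are the values like 4.8 and -3.2 for which Number.isInteger returns false.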
Thank you.
Your code so far
const squareList = (arr) => {
// Only change code below this line
return arr;
// Only change code above this line
};
const squaredIntegers = squareList([-3, 4.8, 5, 3, -3.2]);
console.log(squaredIntegers);
Your browser information:
User Agent is: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4044.122 Safari/537.36
Challenge: Use Higher-Order Functions map, filter, or reduce to Solve a Complex Problem
Link to the challenge: