What is the difference between an integer and a real number?

Tell us what’s happening:

In this tutorial, the overview says:

(decimal numbers are not integers)

It gives an example of an array of “real numbers” as [-3, 4.8, 5, 3, -3.2]

4.8 looks like a decimal to me.

What kinds of decimals are not integers in JavaScript?

This page says there is no integer type for Number: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number

I can't find any documentation describing what JavaScript considers a "real number", an integer, and a decimal, so that I can sort out what each of these terms means.

Thank you.
Your code so far


const squareList = (arr) => {
// Only change code below this line
return arr;
// Only change code above this line
};

const squaredIntegers = squareList([-3, 4.8, 5, 3, -3.2]);
console.log(squaredIntegers);

Your browser information:

User Agent is: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4044.122 Safari/537.36.

Challenge: Use Higher-Order Functions map, filter, or reduce to Solve a Complex Problem

Link to the challenge:

Integers are a type of real number.

A page with a simple explanation of real numbers:

https://www.mathsisfun.com/numbers/real-numbers.html

Decimals are also real numbers, but they have a fractional part (i.e. you can't write them without a decimal point). Don't let the 'real numbers' wording in the description of this challenge throw you. Real numbers are basically any number you can think of, with a few exceptions. I think the description uses the term just to make clear that the array passed into the function will contain only numbers (no strings, etc.).

In many programming languages there are different types for integers and decimals. If you know a variable is only going to contain an integer then you would declare the variable using an integer type:

int i = 10;

But if you know that the variable needs to hold decimal numbers, then you would declare it with a type that can store decimals:

float d = 3.5;  // the float type in C/C++ stores decimals

JS doesn't make this distinction. When you declare a variable that holds a number, it can hold either an integer or a decimal. There is one exception to this in JS: for very large integers there is now a BigInt type, but it has limitations and you certainly won't need it for this challenge 🙂
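Here is a minimal sketch of that in JS (the variable names are just for illustration):

// A single number type: the same variable can hold an integer or a decimal.
let n = 10;
console.log(typeof n); // "number"
n = 3.5;
console.log(typeof n); // still "number"

// BigInt is the one exception, for integers beyond Number.MAX_SAFE_INTEGER:
const big = 9007199254740993n; // note the trailing "n"
console.log(typeof big); // "bigint"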

As far as determining whether a number is an integer or a decimal, I'm not going to give you the answer, because that is basically what the challenge is asking you to figure out. I will give you a hint, though: the Number object provides a function I think you might find useful for this.

Thank you. I can see from the hints that checking whether the number divides evenly with no remainder is a step, but I didn't understand why I would be doing that if 4.8 is an example of a real number.

Thank you.

Integers can be considered "whole" numbers: the discrete, countable numbers running from the negatives through 0 to the positives. For example, -134, 0, and 175 are integers. What that linked article on Number says is that JavaScript doesn't actually store these numbers as integers, though. Other languages, such as C++ and Java, have a dedicated integer type, but JS stores every number in a floating-point format instead.
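You can see that floating-point storage with a quick sketch like this (just an illustration, not part of the challenge):

// 5 and 5.0 are the same value in JS; there is no separate integer type.
console.log(5 === 5.0); // true

// Classic floating-point behaviour: decimals are stored with limited precision.
console.log(0.1 + 0.2); // 0.30000000000000004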

Real numbers include all numbers, whether or not they have a fractional part. Integers are the subset of real numbers that have no fractional part, so 4.8 is a real number but not an integer, while 5 is both.
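To tie this back to the remainder hint mentioned above, here is a small sketch of the idea on single values (it is not the challenge solution):

// An integer divided by 1 leaves no remainder; a non-integer decimal does.
console.log(5 % 1); // 0
console.log(-3 % 1); // -0 (still no remainder)
console.log(4.8 % 1); // 0.7999999999999998 (non-zero remainder, so 4.8 is not an integer)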