Comparison with the Inequality Operator Example

This isn’t a question about the code. My question is about the example given.

1 != 2 // true
1 != "1" // false
1 != '1' // false
1 != true // false
0 != false // false

I don’t understand this part of the example:

1 != true // false
0 != false // false

Since 1 is not equal to true and 0 is not equal to false, shouldn’t “true” be returned? What am I missing?

Your code so far

// Setup
function testNotEqual(val) {
  if (val != 99) { // Change this line
    return "Not Equal";
  }
  return "Equal";
}

// Change this value to test
testNotEqual(10);

Your browser information:

User Agent is: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/11.1.1 Safari/605.1.15.

Link to the challenge:

Quoted from MDN - Comparison operators :

The inequality operator returns true if the operands are not equal. If the two operands are not of the same type, JavaScript attempts to convert the operands to an appropriate type for the comparison.

If you convert 1 or 0 to a boolean you get true and false respectively. So:
0 !== false // true
0 != false // false, since false is converted before the comparison
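As a quick sanity check, you can run both directions of the conversion in a browser console or Node. This is just an illustration of the point above, not part of the challenge code:

```javascript
// Converting 1 and 0 explicitly to booleans
console.log(Boolean(1)); // true
console.log(Boolean(0)); // false

// Strict vs loose comparison of 0 and false
console.log(0 !== false); // true  (no conversion, the types differ)
console.log(0 != false);  // false (false is converted to 0, and 0 == 0)
```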


You have just stumbled across one of the most hated parts of JavaScript: type coercion.

When you use
loose equality ==
loose inequality !=

JavaScript will first check whether the types are the same. If they’re not, it attempts to coerce the operands to a common type using some mystical, internal logic* handed down by magical elves riding on unicorns**.

For this particular case, JavaScript coerces the boolean to a number via Number(boolean). So the actual comparisons being made are:

1 != Number(true) -> 1 != 1 // false
0 != Number(false) -> 0 != 0 // false

And this is why the best practice is to only use

strict equality ===
strict inequality !==
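With the strict operators there is no coercion at all: if the types differ, the operands are simply not equal. Rerunning the same examples shows the difference:

```javascript
// Strict comparison never converts types
console.log(1 === true);  // false (number vs boolean)
console.log(1 !== true);  // true
console.log(0 !== false); // true
console.log(1 !== "1");   // true  (number vs string)
```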

*Well, not really mystical logic; you can see how it works in the section ‘Loose equality ==’ in the link below.
**elves and unicorns may have still been complicit