What is the difference between & and && (and | and ||) in JS?
I’ve run into some baffling behavior with the single-character versions.
To illustrate:
Code:
console.log(true&true)
Output:
1
Alright, this makes sense: if 1 represents true and 0 represents false, it seems like there’s no major difference.
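As a sanity check on that assumption, here’s a quick comparison of the two operators on plain booleans (a sketch of my tests):

```javascript
// Single & prints a number, double && prints a boolean.
console.log(true & true);   // 1
console.log(true && true);  // true
console.log(false & true);  // 0
console.log(false && true); // false
```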
Code:
console.log(true&17)
Output:
1
I guess since 17 is defined and isn’t null/NaN/etc., it makes sense.
Code:
console.log(true&6)
Output:
0
Why are even numbers suddenly falsy?
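To narrow this down, I tried a few more values on the right-hand side (a sketch of my experiments). The result seems to track whether the number is odd or even, not whether it’s truthy:

```javascript
// true & n for various n: odd values give 1, even values give 0.
console.log(true & 3);  // 1
console.log(true & 5);  // 1
console.log(true & 17); // 1
console.log(true & 4);  // 0
console.log(true & 6);  // 0
```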
I tried the same thing with the more “normal” && operator.
Code:
console.log(true&&6)
Output:
6
I see the same unexplained outcomes with the OR operator (| versus ||).
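For completeness, here is what I get with | compared to || on the same inputs (again just a sketch of my tests):

```javascript
// Single | prints a number, double || prints the first truthy operand.
console.log(true | 6);   // 7
console.log(true || 6);  // true
console.log(false | 6);  // 6
console.log(false || 6); // 6
```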