Solution:
function XO(str) {
  // String.prototype.match returns null (not an empty array) when nothing matches,
  // so guard with && before reading .length.
  var x = str.match(/x/gi);
  var o = str.match(/o/gi);
  return (x && x.length) == (o && o.length);
}
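A few sample calls to show what it returns (the test strings are just my own examples, not from the kata):

XO("ooxx");     // true  -- two of each
XO("xooxx");    // false -- three x's vs two o's
XO("zpzpzpp");  // true  -- no x's and no o's still count as equal
XO("xxxm");     // false -- three x's vs no o's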
Why is the x && x.length or o && o.length necessary? I tried just x.length and o.length and this didn't seem to work. I know it's in reference to the case in which there are no matches (match returns null rather than an empty array), but what is it about, say, "null && null.length" that's missing from just "null"?
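Here's roughly what I see when I poke at it in the console (the input string is just an example):

var x = "ooo".match(/x/gi);           // no "x" found, so match returns null
var o = "ooo".match(/o/gi);           // ["o", "o", "o"]
// x.length                           // TypeError: cannot read .length of null
x && x.length;                        // null -- the && short-circuits before .length is ever read
o && o.length;                        // 3
(x && x.length) == (o && o.length);   // null == 3, which is false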
I’m trying to visualise this by running things like "null == 5" and comparing it to "(null && null.length) == (["o", "o", "o", "o", "o"] && 5)". Why does the latter return the expected false?
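What I can piece together so far from console experiments (correct me if I've got the coercion rules wrong):

null == 5;                                                   // false -- null only loosely equals null and undefined
null == null;                                                // true  -- so zero x's and zero o's compare equal
null && null.length;                                         // null  -- && returns the left operand when it's falsy
["o", "o", "o", "o", "o"] && 5;                              // 5     -- && returns the right operand when the left is truthy
(null && null.length) == (["o", "o", "o", "o", "o"] && 5);   // null == 5, which is false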