indexOf vs includes question

I was writing code to do this:

Write a JavaScript program to check whether there is at least one element which occurs in two given sorted arrays of integers.

but my original attempt didn't work. Basically, I looped through the first array and, for each element, checked the second array with indexOf:

    function commonEl(arr1, arr2) {
        for (let item of arr1) {
            if (arr2.indexOf(item)) {
                return true;
            }
        }
        return false;
    }
    console.log(commonEl([1,2,3], [5,6,7]))

It returned true! I thought that, since indexOf returns -1 if the parameter is not found, this should work. When I used Array.includes() instead, I got the desired behavior, but I'm wondering why the indexOf version failed.

indexOf returns a number: the index at which the item is found, or -1 if there is no such element in the array.

Of those numbers, 0 is the only falsy one; everything else, including -1, is truthy. So your check behaves the same whether the item is missing or present at index 1 or higher, and it would wrongly skip a match sitting at index 0.
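For example, with your second array:

    console.log([5, 6, 7].indexOf(1)); // -1 -> truthy, so your if branch runs even though 1 isn't there
    console.log([5, 6, 7].indexOf(5)); //  0 -> falsy, so a genuine match at index 0 would be skipped
    console.log([5, 6, 7].indexOf(6)); //  1 -> truthy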

you need to explicitly compare the value returned from indexOf against -1, not just rely on its truthiness/falsiness
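Something like this would behave the way you expected (includes, as you found, works just as well):

    function commonEl(arr1, arr2) {
        for (let item of arr1) {
            // -1 means "not found", so compare against it explicitly
            if (arr2.indexOf(item) !== -1) {
                return true;
            }
        }
        return false;
    }
    console.log(commonEl([1, 2, 3], [5, 6, 7])); // false
    console.log(commonEl([1, 2, 3], [3, 4, 5])); // true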

Oh, ouch, for some reason I thought -1 was falsy!