I was writing code to do this:
Write a JavaScript program to check whether there is at least one element which occurs in two given sorted arrays of integers.
but my original attempt didn’t work. Basically, I just looped through the first array, using indexOf to check whether each element appeared in the second array:
function commonEl(arr1, arr2) {
  for (let item of arr1) {
    if (arr2.indexOf(item)) {
      return true;
    }
  }
  return false;
}

console.log(commonEl([1, 2, 3], [5, 6, 7]));
It returned true! I thought that, since indexOf returns -1 when the argument is not found, this should work. When I switched to Array.prototype.includes(), I got the desired behavior, but I wonder why indexOf failed?
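For what it's worth, here is a sketch of what I believe is going on: -1 is a truthy value in JavaScript, so a bare `if (arr2.indexOf(item))` treats "not found" as found (and, conversely, treats a match at index 0 as falsy). Comparing explicitly against -1 seems to behave the way I expected:

```javascript
// -1 coerces to true in a boolean context, which is why the
// original check fired even when the element was absent
console.log(Boolean(-1)); // true

function commonEl(arr1, arr2) {
  for (let item of arr1) {
    // Explicit comparison: indexOf returns -1 only when item is absent.
    // (arr2.includes(item) would express the same intent directly.)
    if (arr2.indexOf(item) !== -1) {
      return true;
    }
  }
  return false;
}

console.log(commonEl([1, 2, 3], [5, 6, 7])); // false
console.log(commonEl([1, 2, 3], [3, 4, 5])); // true
```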