I was asked to create a function that will be given an array of numbers. The array will contain either all even numbers with one odd number, or all odd numbers with one even number.

The function needs to find this outlier number and return it.

My work:

```javascript
var detectOutlierValue = function (array) {
  // Implement your code below
  posArr = array.filter(x => x % 2 === 0);
  negArr = array.filter(x => x % 2 === 1);
  if (negArr.length = 1) {
    return Number(negArr);
  }
  if (posArr.length = 1) {
    return Number(posArr);
  }
}
console.log(detectOutlierValue([1, 3, 4, 7, 9, 11]));
// should be 4
console.log(detectOutlierValue([2, 4, 6, 10, 11, 14]));
// should be 11
```
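For reference, here's what I believe the corrected version should look like, with `===` in the conditionals and the variables declared with `const` (the names `evens`/`odds` are just my renaming for clarity):

```javascript
const detectOutlierValue = function (array) {
  // Split the array into even and odd numbers
  const evens = array.filter(x => x % 2 === 0);
  const odds = array.filter(x => x % 2 !== 0);
  // The outlier is in whichever group has exactly one element
  if (odds.length === 1) {
    return odds[0];
  }
  return evens[0];
};

console.log(detectOutlierValue([1, 3, 4, 7, 9, 11]));  // 4
console.log(detectOutlierValue([2, 4, 6, 10, 11, 14])); // 11
```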

The code as written prints 1 (incorrect) for the first test case and 11 (which is correct) for the second.

Now I know it's because I used the "=" operator instead of a "==" or "===" operator in the conditional statements.

Could someone explain why the provided code returns 1 for the first array? And when should I ever use the "=" operator vs. "=="? I feel like understanding this behavior will help me understand JS.