**Tell us what’s happening:**

The logic I followed is the common primality test (I think it’s correct, but I’m not sure):

If the number (i) is 2 or 3, then it’s prime, so I push it into the array. If i > 3, then I divide it by every integer between 2 and the square root of i and, if I get a positive remainder every time, it’s prime. Then at the end, I sum all the elements of the array with `reduce` to get the result, which is the sum of the primes.

I feel that the mistake is in the second loop (the one I use to check whether i is prime or not), but I can’t find it. Any help, please?
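For comparison, the test I described above could be sketched like this (the `isPrime` helper is just my naming, and I’m assuming the challenge wants the sum of primes up to and including `num`). The key points are that the divisor loop has to run all the way up to and including `Math.sqrt(n)`, and a number should only be counted once, after *every* division has left a remainder:

```javascript
// Trial division: n is prime only if no integer from 2 up to and
// including sqrt(n) divides it evenly. Note the <= — sqrt(n) itself
// must be tested (e.g. 25 = 5 * 5 would slip through with <).
function isPrime(n) {
  if (n < 2) return false;
  for (var j = 2; j <= Math.sqrt(n); j++) {
    if (n % j === 0) return false; // found a divisor, so not prime
  }
  return true; // survived every division, so prime
}

function sumPrimes(num) {
  var sum = 0;
  // <= num, assuming primes up to and including num should count
  for (var i = 2; i <= num; i++) {
    if (isPrime(i)) sum += i; // add i only once, after the full check
  }
  return sum;
}
```

For example, `sumPrimes(10)` should add 2, 3, 5 and 7.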

**Your code so far**

```js
function sumPrimes(num) {
  var arr = [];
  for (var i = 2; i < num; i++) {
    if (i === 2 || i === 3) {
      arr.push(i);
    }
    if (i > 3) {
      for (var j = 2; j < Math.sqrt(i); j++) {
        if (i % j > 0) {
          arr.push(i);
        } else {
          break;
        }
      }
    }
  }
  console.log(arr);
  return arr.reduce((before, after) => before + after);
}
sumPrimes(10);
```

**Your browser information:**

User Agent is: `Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:60.0) Gecko/20100101 Firefox/60.0`


**Link to the challenge:**

https://learn.freecodecamp.org/javascript-algorithms-and-data-structures/intermediate-algorithm-scripting/sum-all-primes/