# Sum of the first 100 even fibonacci numbers

The task is as in the title. I’ve written a solution, but I’m not sure where or how to check whether it’s correct.

``````js
function fibSum(n) {
  var fib = [];
  var even_fib = [];

  for (let i = 0; even_fib.length < n; i++) {
    if (i > 1) fib.push(fib[i - 1] + fib[i - 2]);
    else fib.push(1); // seed the first two values

    if (fib[i] % 2 == 0) even_fib.push(fib[i]);
  }
  return even_fib.reduce((a, b) => a + b);
}

console.log(fibSum(100));
``````

The code logs “6.833002762909235e+31”. Is this the precise answer I’m looking for, or is it an approximation?

With a number of that size, you are above `Number.MAX_SAFE_INTEGER` (2^53 − 1), so the result is approximate.
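A quick way to see the problem:

```javascript
// Above Number.MAX_SAFE_INTEGER (2^53 - 1), adjacent integers can no
// longer be distinguished as 64-bit floats.
console.log(Number.MAX_SAFE_INTEGER);               // 9007199254740991
console.log(9007199254740992 === 9007199254740993); // true - precision is gone
```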

Two thoughts:

1. Do you want the first `n` Fibonacci numbers, or the Fibonacci numbers that are less than `n`?

2. If you want that many Fibonacci numbers, it is much more efficient not to dynamically allocate an array like that. You only need three variables, each holding a single value.

How would I go about displaying the exact result?

I want the first n even fibs, so [2, 8, 34, 144, 610…] up to the 100th element.

If you want integers larger than `Number.MAX_SAFE_INTEGER`, you need to use BigInt.

Side note - `var` is a legacy feature. You should only use `const` and `let`.

I’ve acted upon your second point and created a solution that makes more sense.

``````js
function sumEvenFibs(n) {
  let previous = 0, current = 1;
  let result = 0;

  while (n > 0) {
    current += previous;
    previous = current - previous;
    if (current % 2 == 0) { result += current; n--; }
  }
  return BigInt(result);
}

console.log(sumEvenFibs(100)); // returns 68330027629092356420402839289856n
``````

However, when I compare this to my earlier solution with arrays, I start getting different outputs for larger inputs, even though I think the process is the same?

``````js
function fibSum(n) {
  // n being the amount of even integers necessary. Specified to be 100 later on.
  let fib = [];
  let even_fib = [];

  for (let i = 0; even_fib.length < n; i++) {
    if (i > 1) fib.push(fib[i - 1] + fib[i - 2]);
    else fib.push(1); // conditional to place seeded values

    if (fib[i] % 2 == 0) even_fib.push(fib[i]);
  }

  return BigInt(even_fib.reduce((a, b) => a + b, 0));
}

console.log(fibSum(100));
// returns
// 68330027629092347413203584548864n, as opposed to
// 68330027629092356420402839289856n from earlier
``````

I’m assuming the first solution is correct and precise? Is there an issue with trying to use arrays for storing numbers past the max safe integer?

You are converting an approximated number to BigInt; that will not recover the lost precision. You need to switch to BigInt before going above the max safe integer, so you don’t lose precision.
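To illustrate: the rounding happens in the Number arithmetic itself, so a late conversion just wraps the already-rounded value exactly as it is:

```javascript
const approx = 2 ** 53 + 1;  // rounds to 2 ** 53 as a 64-bit float
console.log(BigInt(approx)); // 9007199254740992n - the error is baked in
console.log(2n ** 53n + 1n); // 9007199254740993n - exact from the start
```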


Thank you, noted.
Would initialising the values as BigInt from the beginning solve this?
I now have

``````js
function sumEvenFibs(n) {
  let previous = 0n, current = 1n;
  let result = 0n;

  while (n > 0) {
    current += previous;
    previous = current - previous;
    if (current % 2n === 0n) { result += current; n--; }
  }
  return result;
}

console.log(sumEvenFibs(100));
// returns
// 290905784918002003245752779317049533129517076702883498623284700n
``````

I’ve logged out the sequence for smaller values of n, and the Fibonacci sequence comes out as expected, as does the sum of the even terms (output 10 when n = 2, 44 when n = 3, 188 when n = 4, etc.).

I don’t know if that is the correct result, but it is certainly exact and not approximated.


Thank you. Just to confirm, do you think there’s any problem with my logic? I’ve logged out values for previous, current and result for smaller numbers of n to see the sequence generate as expected.

This logic looks correct, and it’s how I would do it (minus the mutation of `n`).