I understand there is an infinite loop inside this code example, but I don’t understand why the end result (I tested it) is 11. How does it get 11?
function foo() {
    function bar(a) {
        i = 3;
        console.log(a + i);
    }
    for (var i = 0; i < 10; i++) { bar(i * 2); }
}
foo();
- First of all, a did not even get assigned a value. How does the compiler continue its operation? Or is a = i * 2 what happens here?
- How come i++ does not work? If the code below is executed, you will get 6, 8, 10, 12, 14, 16, 18:
for (var i = 3; i < 10; i++) {
    console.log(i * 2);
}
- Even if we say i++ does work, why do we get 11? And only 11?
In the first iteration of the for loop, i = 0, so we have bar(0 * 2) or bar(0). Inside bar, i is set to 3, so console.log(a + i) is console.log(0 + 3), which displays 3 to the console.
In the next iteration of the for loop, i = 4 (because i was modified in bar to become 3, and then the i++ in the for loop makes it 4). So we have bar(4 * 2) or bar(8). Inside bar, i is set to 3 again, so console.log(a + i) is console.log(8 + 3), which displays 11 to the console.
In every remaining iteration of the for loop, i = 4 again (i is set to 3 in bar, and then the i++ in the for loop makes it 4), so 11 is displayed forever, because 4 will always be less than 10 (the for loop's exit condition).
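You can watch this happen without hanging the console. Here is a minimal sketch of the same code with a hypothetical `calls` counter added purely to cut the loop short, so the shared `i` behavior is visible in finite output:

```javascript
function foo() {
    var calls = 0;                // hypothetical cap, only so the demo terminates
    function bar(a) {
        i = 3;                    // no var: this writes to the loop's own i
        console.log(a + i);
    }
    for (var i = 0; i < 10 && calls < 4; i++) {
        calls++;
        bar(i * 2);
    }
}
foo();
// logs 3, then 11 three times: after the first call, i is always
// reset to 3 by bar and bumped to 4 by i++, so a is always 8
```

Without the `calls` cap this is exactly the original infinite loop: the condition `i < 10` can never fail because `i` never gets past 4.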
If the function bar had declared its own i (var i = 3), you would have seen the following in the console:
3
5
7
9
11
13
15
17
19
21
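That fixed version would look like this: the only change from the original is the `var` keyword inside bar, which gives bar a local `i` and leaves the loop counter alone.

```javascript
function foo() {
    function bar(a) {
        var i = 3;                // local to bar; the loop's i is untouched
        console.log(a + i);
    }
    for (var i = 0; i < 10; i++) {
        bar(i * 2);
    }
}
foo();
// logs 3, 5, 7, 9, 11, 13, 15, 17, 19, 21 and then terminates
```
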
I was revising this code snippet just now and I want to ask you this:
Previously I thought the logic behind this code went something like this:
(WRONG ONE)
. assign 3 to i
. bar(3*2)
. console.log(9)
. etc etc
And now I realised that it was wrong, because i = 3 was not a declaration, it was an assignment. JS puts all the declarations first when it comes to executing the code you write (a.k.a. hoisting).
Am I right?
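The hoisting described above can be seen directly in a tiny sketch (`hoisted` is a hypothetical name for this demo): the `var x` declaration is moved to the top of the function before execution, but the assignment stays where it is.

```javascript
function hoisted() {
    console.log(x);  // undefined: the declaration of x is hoisted, the assignment is not
    var x = 5;
    console.log(x);  // 5: the assignment has now run
}
hoisted();
```

This is also why `i = 3` in bar behaves differently: there is no declaration to hoist, so the assignment simply writes to the nearest `i` already in scope, which is the loop counter in foo.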