Scope Problem? I don't see why

Working on the Twitch.tv challenge. This is related to a scope problem I am having.

My question is about where I put the $("#offline").html(offlineHTML);
It only works inside the .forEach loop, not outside of it, but logically it seems like it should go after the loop completes.

In JavaScript/jQuery, I have:

$(document).ready(function() {
  var streamers = ["ESL_SC2", "OgamingSC2", "cretetion", "freecodecamp", "storbeck", "habathcx", "RobotCaleb", "noobs2ninjas", "brunofin"];
  var apiStr = "https://wind-bow.gomix.me/twitch-api/streams/";
  var callbackStr = "?callback=?";
  var offlineHTML = "";
  var onlineHTML = "";

  function addOffline(data, name) {
    console.log("adding... " + name);
    offlineHTML += '<div id="offlineStreamer">';
    offlineHTML += name;
    offlineHTML += " is offline"
    offlineHTML += "</div>";
  };

  function addOnline(data) {
    console.log("online");
  };

  function parseTwitch(arr) {
    arr.forEach(function(strStr) {
      var URL = apiStr + strStr + callbackStr;
      $.getJSON(URL, function(data) {
        if (data.stream !== null) {
        } else {
          addOffline(data, strStr);
          $("#offline").html(offlineHTML) // Only works within for loop. 
        };
      });
    }); // closing for loop
    
    // $("#offline").html(offlineHTML); // doesn't work if I uncomment this.  Why? offlineHTML scope should include this
    
  };
  parseTwitch(streamers);
});

I am sure this is a basic scope issue; can someone explain?

It’s because $.getJSON is an asynchronous function, which means the rest of the code doesn’t wait for it.

It’s not the forEach loop that the assignment needs to be in, but the callback function:

$.getJSON(URL, function(data) {
if (data.stream !== null) {…

All other javascript execution is done long before getJSON is able to retrieve the data from Twitch, and only the callback is called once data is returned.
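You can see that ordering with a quick plain-JS sketch (no jQuery here; `fakeGetJSON` is a hypothetical stand-in that simulates the delayed callback with `setTimeout`):

```javascript
var order = [];

// fakeGetJSON is a stand-in for $.getJSON: it returns immediately
// and invokes its callback later, once the "data" is available.
function fakeGetJSON(url, callback) {
  setTimeout(function () {
    callback({ stream: null });
  }, 0);
}

fakeGetJSON("https://example.com/streams/ESL_SC2", function (data) {
  order.push("callback"); // runs later, when the data arrives
});
order.push("after the call"); // runs right away

// A moment later, order is ["after the call", "callback"]:
// the line after fakeGetJSON ran before the callback did.
```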


:sob:

So does that mean I have to use .ajax() and set async to false?

Or is there some way to force .getJSON to do this?

Nope, you’re doing it correctly.

These functions operate asynchronously on purpose. The idea is that you can design a clean, responsive interface that doesn’t “stutter” or “buffer” while the HTTP calls are being made. You can display/animate loading icons, messages, etc instead of causing the whole page to lock up.

Then, when the HTTP call completes (or fails) you can use the callback function to update the nice, responsive page for the user with the newly received data.

If you don’t like the way everything is nested, you can always call functions from your callback. For example:

 function parseTwitch(arr) {
    arr.forEach(function(strStr) {
      var URL = apiStr + strStr + callbackStr;
      $.getJSON(URL, function(data) {
        if (data.stream !== null) {
        } else {
          processMyData(data, strStr);
        }
      });
    }); // closing forEach
  }

  function processMyData(data, name) {
    addOffline(data, name);
    $("#offline").html(offlineHTML);
  }

The page will respond poorly if you make the call synchronous.
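As an aside, one way to “collect everything, then insert once” is to gather the requests as promises and wait for all of them. This is only a sketch: `getStreamData` below is a hypothetical stand-in that resolves with fake Twitch-shaped data (in jQuery 1.5+, `$.getJSON` itself returns a promise-like object you could collect with `$.when` the same way):

```javascript
// getStreamData is a hypothetical stand-in for a request that
// resolves with the stream data for one streamer.
function getStreamData(name) {
  return new Promise(function (resolve) {
    setTimeout(function () {
      resolve({ name: name, stream: null }); // pretend everyone is offline
    }, 0);
  });
}

var streamers = ["ESL_SC2", "OgamingSC2", "cretetion"];

// Promise.all resolves only after every request has returned,
// so the insertion happens exactly once.
var insertion = Promise.all(streamers.map(getStreamData)).then(function (results) {
  var offlineHTML = results
    .filter(function (d) { return d.stream === null; })
    .map(function (d) { return "<div>" + d.name + " is offline</div>"; })
    .join("");
  // Single insertion point, e.g.
  // $("#offline").html(offlineHTML);
  return offlineHTML;
});
```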

I guess this looks very inefficient to me, since $("#offline").html(offlineHTML) is run on every iteration. It would seem more efficient to collect everything, then call $("#offline").html(offlineHTML) just once, after all the $.getJSON() calls are done.

But given the async nature you mentioned, I’m not sure how to do this outside the loop.

I guess I can always do it inefficiently within the for loop. (shrug)

You make a good point, and you’re right to consider jQuery element insertions “expensive”.

One solution would be to store the getJSON return data in a global object and use a conditional statement to determine whether all the requests are complete. That way, you could perform the insertion only once, after the last request returns.
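That counter idea could be sketched like this (plain JS; `finishRequest` is a hypothetical stand-in for the body of your getJSON callback):

```javascript
var streamers = ["ESL_SC2", "OgamingSC2", "cretetion"];
var offlineHTML = "";
var completed = 0;
var inserted = false; // tracks whether the one-time insertion has happened

// finishRequest would be called from each getJSON callback as it returns.
function finishRequest(name) {
  offlineHTML += '<div class="offlineStreamer">' + name + " is offline</div>";
  completed += 1;
  if (completed === streamers.length) {
    // Every request is back: do the single insertion here, e.g.
    // $("#offline").html(offlineHTML);
    inserted = true;
  }
}
```

Since the requests can return in any order, the counter (not the loop index) is what tells you when the last one has finished.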

Another consideration is whether the user would rather see all rows updated at once, or rows updated as they’re retrieved. Element insertion is inefficient but getJSON is (much) slower. I’m not a UX professional, so I couldn’t tell you :slight_smile: but they’re equally valid approaches.

I think the solution is just to do a .prepend().