Question: synch vs asynch in 'simultaneous' calls to different servers

This question is about the behavior of XMLHttpRequest in two specific programmatic structures.
I want to get information from a number of queries, and when I have it I want to
run a function doSomethingElse(). The issue is: doSomethingElse cannot be called until
ALL the queries have been run.

Please don’t be too nitpicky here. There are a lot of subtle issues I have not addressed, but
at the end I have a definite question about the behavior of javascript in browsers.

It seems to me there are (at least) two possibilities:

Schematically, we have A (asynch):

var num = 10;
function doRecursive(num){
	if(num == -1) {
		doSomethingElse(); // UGLY
		return;
	}
	// set str for open, define xmlhttp, etc
	// .... other stuff

	xmlhttp.onreadystatechange = function() {
		if (this.readyState == 4 && this.status == 200) {
			// .... stuff for successful call
			doRecursive(num-1);
		}
		if (this.readyState == 4 && this.status != 200) {
			// ... stuff for unsuccessful call
			doRecursive(num-1);
		}
	};
	xmlhttp.open("GET", str, true); // asynch
	xmlhttp.send();
}
doRecursive(num);

And B (synch):

var num = 10;
function doNonRecursive(num){
	for(var i = num; i >= 0; i--){
		// set str for open, define xmlhttp, etc
		// .... other stuff

		xmlhttp.open("GET", str, false); // synch
		xmlhttp.send(); // blocks until the response is in

		if (xmlhttp.status == 200) {
			// .... stuff for successful call
		} else {
			// ... stuff for unsuccessful call
		}
	} // for
}
doNonRecursive(num);
doSomethingElse(); // BETTER

The difference here is that, in A, the call to doSomethingElse() is nested deep
in a bunch of recursive calls to servers. It is horrible even to spaghetti-code Steve.

B is much more structured, and takes the doSomethingElse() outside the routine which gets
stuff from servers.

Is there a better way?

As far as I can tell, both of these serialize the calls.

In the absence of any browser-supported locking mechanism, serializing the calls is the only way to
be sure that all queries have been completed.

My question is: What might be reasons for preferring A or B?

Note: One could put in wait timers for each GET in a variety of ways,
so if a call doesn’t complete in a certain amount of time, it is assumed to have failed.
But that sidesteps my question.
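
(For what it’s worth, XMLHttpRequest has a built-in timeout that expresses this; a sketch against variant A:)

xmlhttp.timeout = 5000; // ms; only allowed on asynchronous requests
xmlhttp.ontimeout = function() {
	// the call is assumed to have failed; move on to the next query
	doRecursive(num - 1);
};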

NOTE: A better way is to have each request run asynch, but also have a lock on a memory area
accessed by all the processes, so that when some counter reaches a predetermined
number (say), then the program can continue. However, it doesn’t appear that browsers
have such a mechanism available. One can fake a lock. Is there such a javascript library?
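
To show what I mean by faking a lock, something like this countdown latch (makeLatch is a name I’m inventing):

function makeLatch(count, whenDone) {
	return function release() {
		if (--count === 0) whenDone(); // the "lock" opens when the counter hits zero
	};
}

var done = makeLatch(num + 1, doSomethingElse); // num+1 queries, as in A and B
// each request's completion handler (success or failure) would call done()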

Thanks.

Answer:

Promises are here to help with exactly this issue. Basically, a promise lets you chain asynchronous functions:

PromiseBasedAJAXCall('https://fancyserverplace.com')
    .then(function(data) {
       // play with data, then hand the result to the next step
       return modifiedData;
    })
    .then(function(modifiedData) {
      // play with the data the previous function returned
    });

There’s even a handy static method, Promise.all, that lets you handle multiple calls, like what you’re describing in your question:

Promise.all([promise1, promise2])
    .then(function(dataFromBoth) {
      var promise1Data = dataFromBoth[0];
      var promise2Data = dataFromBoth[1];
    })
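
One thing to keep in mind: Promise.all rejects as soon as any of its input promises rejects, so add a .catch if one failed call shouldn’t sink the whole batch:

Promise.all([promise1, promise2])
    .then(function(dataFromBoth) {
      // all calls succeeded
    })
    .catch(function(err) {
      // at least one call failed; err is the first rejection value
    });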


Answer:

I’m not quite sure what you want to accomplish here, so let me first list my assumptions:

  • You want to make a series of AJAX requests to urls constructed from a number within doRecursive
  • After all requests have returned, you want to call doSomethingElse with a cumulative result

First of all, the requests you start will get back to you with a result by triggering an event you must assign a listener for: xmlhttp.onload, xmlhttp.onloadend, or, as you seem to want to use here, xmlhttp.onreadystatechange.
So you’ll have to assign a state-handling function to one of these event handlers. There is also an onerror handler, but it fires on network-level errors during the transaction, not on successfully received error-like status codes.
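
For instance, a minimal sketch of wiring these handlers, reusing the str name from your pseudocode:

var xmlhttp = new XMLHttpRequest();
xmlhttp.onload = function() {
  // fires once the whole response is in, whatever the status code
  if (this.status == 200) { /* stuff for successful call */ }
  else { /* stuff for unsuccessful call */ }
};
xmlhttp.onerror = function() {
  // fires on a network-level failure; there is no usable status code here
};
xmlhttp.open("GET", str, true);
xmlhttp.send();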

Second of all, javascript is single threaded (unless you spin up a Web Worker), so there are no memory locks. You send a request, the script goes on to do other stuff, the response arrives and dispatches an event, and you handle it in an event handler (event listener).
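
A tiny illustration of that ordering, assuming xmlhttp has already been opened:

xmlhttp.onload = function() { console.log("3: response handled"); };
console.log("1: before send");
xmlhttp.send();
console.log("2: after send"); // logs immediately; the handler runs later,
                              // on the same thread, once the response arrives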

Yes, there is an option to handle requests synchronously, but it’s not recommended at all. It would also break onreadystatechange: the state changes multiple times during request processing, but the calls to the handler would just get queued up, because they are executed on that same main thread, which sits waiting for our synch request. On top of that, synchronous requests on the main thread are deprecated, and Firefox, among other browsers, warns against them.

So what do you do? With your current setup you could keep a counter which you decrement each time a request is fulfilled, and call doSomethingElse when all requests are done (is num also the number of requests?):

// global
var reqLeft = num;
var cumulativeResult = [];

// inside wherever you dispatch requests
xmlhttp.onreadystatechange = function() {
  if (this.readyState != 4) return; // fires several times; only act when done
  var result = this.responseText;
  // add it to the cumulative result or whatever
  cumulativeResult.push(result);
  if (!--reqLeft) {
    doSomethingElse(cumulativeResult);
  }
};

Or use Promises. Actually, just use Promises. If you specifically want to stick with XMLHttpRequest, you’ll have to wrap it in a Promise yourself:

var pr = new Promise(function(resolve, reject) {
  // set up xmlhttp and str as usual
  var xmlhttp = new XMLHttpRequest();
  xmlhttp.open("GET", str, true);

  xmlhttp.onload = function() {
    if (this.status == 200) resolve(this.responseText);
    else reject(this.status); // or something
  };

  xmlhttp.onerror = function() {
    reject(this.status); // status is 0 on a network error
  };

  xmlhttp.send();
});

promises.push(pr); // promises is an array you fill with one promise per request

// then for all promises
Promise.all(promises).then(function(resultArray) {...});

Or, better yet, use the promise-ready fetch API, or axios, or something similar.
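
For instance, a minimal sketch with fetch, assuming a hypothetical urls array in place of the strings you construct from num:

var urls = [/* the query urls you would have built from num */];

Promise.all(urls.map(function(url) {
  return fetch(url).then(function(response) {
    if (!response.ok) throw new Error("HTTP " + response.status);
    return response.text();
  });
})).then(function(results) {
  doSomethingElse(results); // results are in the same order as urls
}).catch(function(err) {
  // at least one request failed
});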