This question is about the behavior of XMLHttpRequest in two specific programmatic structures.
I want to get information from a number of queries, and when I have it I want to
run a function doSomethingElse(). The issue is: doSomethingElse cannot be called until
ALL the queries have been run.
Please don’t be too nitpicky here. There are a lot of subtle issues I have not addressed, but
at the end I have a definite question about the behavior of JavaScript in browsers.
It seems to me there are (at least) two possibilities:
Schematically, we have A (async):

var num = 10;
function doRecursive(num) {
    if (num == -1) {
        doSomethingElse(); // UGLY
        return;
    }
    // set str for open, define xmlhttp, etc.
    // ... other stuff
    xmlhttp.onreadystatechange = function () {
        if (this.readyState == 4 && this.status == 200) {
            // ... stuff for successful call
            doRecursive(num - 1);
        }
        if (this.readyState == 4 && this.status != 200) {
            // ... stuff for unsuccessful call
            doRecursive(num - 1);
        }
    };
    xmlhttp.open("GET", str, true); // async
    xmlhttp.send();
}
And B (sync):

var num = 10;
function doNonRecursive(num) {
    for (var i = num; i >= 0; i--) {
        // set str for open, define xmlhttp, etc.
        // ... other stuff
        xmlhttp.open("GET", str, false); // sync: send() blocks until done
        xmlhttp.send();
        if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
            // ... stuff for successful call
        }
        if (xmlhttp.readyState == 4 && xmlhttp.status != 200) {
            // ... stuff for unsuccessful call
        }
    } // for
}
doNonRecursive(num);
doSomethingElse(); // BETTER
The difference here is that, in A, the call to doSomethingElse() is nested deep
inside a chain of recursive calls to servers. It is horrible even to spaghetti-code Steve here.
B is much more structured, and moves doSomethingElse() outside the routine that gets
stuff from the servers.
Is there a better way?
As far as I can tell, both of these serialize the calls.
In the absence of any browser-supported locking mechanism, serializing the calls is the only way to
be sure that all queries have been completed.
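For what it's worth, the recursion in A can be flattened into a small driver that walks a list of tasks, so doSomethingElse() appears exactly once at the end instead of deep in the recursion. This is only a sketch under my own naming — runSerially, next, and the task functions are made up, not any library's API:

```javascript
// Run async tasks one at a time; each task receives a "next" callback
// and must call it when its request has finished (success or failure).
function runSerially(tasks, onAllDone) {
  var i = 0;
  function next() {
    if (i === tasks.length) {
      onAllDone(); // all queries completed
      return;
    }
    var task = tasks[i];
    i += 1;
    task(next);
  }
  next();
}

// Usage sketch: each task would wrap one async XMLHttpRequest and call
// next() from its readyState == 4 handler. Here the tasks complete
// immediately so the control flow is visible:
var log = [];
runSerially(
  [
    function (next) { log.push("query 1"); next(); },
    function (next) { log.push("query 2"); next(); }
  ],
  function () { log.push("doSomethingElse"); }
);
// log is now ["query 1", "query 2", "doSomethingElse"]
```

This still serializes the calls, like A and B, but the completion step is no longer buried inside the request code.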
My question is: What might be reasons for preferring A or B?
Note: One could put in wait timers for each GET in a variety of ways, so that if
a call doesn’t complete within a certain amount of time, it is assumed to have failed.
But that sidesteps my question.
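On those wait timers: XMLHttpRequest (Level 2) has a built-in deadline, the timeout property and the ontimeout handler, though older browsers lack it and it only applies to asynchronous requests. A sketch — the addTimeout helper and the 5000 ms value are my own inventions, not a standard API:

```javascript
// Attach a deadline to an XHR-like object. If the request does not
// complete within ms milliseconds, the browser aborts it and fires
// ontimeout, so that GET can be counted as failed.
function addTimeout(xhr, ms, onTimeout) {
  xhr.timeout = ms;          // milliseconds; 0 means "no timeout"
  xhr.ontimeout = onTimeout; // fires instead of a normal completion
  return xhr;
}

// Usage sketch (browser only):
//   var xmlhttp = new XMLHttpRequest();
//   addTimeout(xmlhttp, 5000, function () {
//     // treat this GET as failed and move on
//   });
//   xmlhttp.open("GET", str, true); // must be async for timeout to apply
//   xmlhttp.send();
```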
NOTE: A better way would be to run each request asynchronously, but also have a lock
on a memory area shared by all the handlers, so that when some counter reaches a
predetermined number (say), the program can continue. However, browsers don’t appear
to offer such a mechanism. One can fake a lock. Is there a JavaScript library for this?
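On the lock: in the browser, JavaScript callbacks run to completion one at a time on a single thread, so two response handlers can never execute simultaneously, and a plain counter needs no lock at all. The usual pattern fires every request at once and counts completions. A sketch, where makeBarrier is a made-up name:

```javascript
// Returns a function to be called once per finished request (success
// or failure). After it has been called `count` times, onDone() runs.
// No locking is needed: handlers never interleave in browser JS.
function makeBarrier(count, onDone) {
  var remaining = count;
  return function signal() {
    remaining -= 1;
    if (remaining === 0) {
      onDone(); // every query has completed; safe to continue
    }
  };
}

// Usage sketch: each of 10 async handlers would call done() at
// readyState == 4. Here we simulate the completions arriving:
var finished = false;
var done = makeBarrier(10, function () { finished = true; });
for (var i = 0; i < 10; i++) {
  done();
}
// finished is now true
```

Unlike A and B, this does not serialize anything: all 10 GETs are in flight at once, and doSomethingElse() would go inside the onDone callback.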
Thanks.