views:

153

answers:

6

Is it possible to do things asynchronously in JavaScript (AJAX aside)? For example, to iterate over multiple arrays at the same time. How is it done? A brief example would be nice. Searching for this was hard, because the results are dominated by AJAX, which is not what I am looking for.

Thanks in advance.

+2  A: 

One new development in this field is HTML5 Web Workers.
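
A minimal sketch of the idea, assuming a modern browser: the worker here is built from a Blob URL so no separate script file is needed. The function name sumArray and the message format are illustrative, not part of any standard.

```javascript
// Plain helper the worker will run; defined here so the page and the
// worker share one definition (its source is serialized into the worker).
function sumArray(numbers) {
  var total = 0;
  for (var i = 0; i < numbers.length; i++) {
    total += numbers[i];
  }
  return total;
}

// Guard so the sketch degrades gracefully where Workers are unavailable.
if (typeof Worker !== 'undefined' && typeof Blob !== 'undefined') {
  // Build the worker script as a string and load it from a Blob URL.
  var source = sumArray.toString() +
    '; onmessage = function (e) { postMessage(sumArray(e.data)); };';
  var worker = new Worker(URL.createObjectURL(new Blob([source])));
  worker.onmessage = function (e) {
    console.log('sum computed off the main thread: ' + e.data);
  };
  worker.postMessage([1, 2, 3, 4]); // the array is copied (serialized) across
}
```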

Jaanus
Great, now there's a really convoluted new way to have me wonder why my processors are peaking when visiting a site! :P
Abel
@Abel I was thinking the same thing. Now a web page can peg all 4 cpus instead of just one.
Byron Whitlock
Yay, now we can burn all 4 CPUs with empty "for" loops! Get ready for some HOT laptops!
Mark Schultheiss
+6  A: 

Use Web Workers. But remember that it is a very new feature and not all browsers support it yet.

Teja Kantamneni
Are there any asynchronous features before HTML5?
aepheus
@aepheus: Not really - JS was created as single-threaded, so while you can use setTimeout() as seen in Grumdrig's answer, the program execution runs in one thread only.
Piskvor
+4  A: 

You could use setTimeout.

setTimeout(function () { iterateArray(array1); reportDone(1); }, 0);
setTimeout(function () { iterateArray(array2); reportDone(2); }, 0);

I'm not sure how concurrent it will be, but it is an asynchronous programming model.

Grumdrig
+1 Good thinking, sometimes the good old simple things just do :)
Abel
I'm guessing this would still give synchronous execution (though as you state it would be in an asynchronous model). Array 1 will always complete, then array 2 will start, etc. I don't see much reason for this aside from making code confusing.
aepheus
Yes, I quite agree.
Grumdrig
+1  A: 

JavaScript is normally single threaded; you cannot do several things at once. If your JavaScript code is too slow, you will need to offload the work. The new way is to use web workers, as others have noted. The old way is often to use AJAX and do the work on the server instead. (Either with web workers or with AJAX, the arrays would have to be serialized and the result deserialized)
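
For illustration, here is that serialization round trip in miniature, using JSON (a sketch; the structured cloning used by workers is similar in spirit but not identical):

```javascript
// Whether the work is handed to a worker or to a server over AJAX,
// plain data survives the trip but object identity does not:
// the receiver gets a structural copy.
var payload = { values: [3, 1, 2], label: 'sort me' };
var wire = JSON.stringify(payload);   // what actually crosses the boundary
var received = JSON.parse(wire);      // a copy, not the same object
received.values.sort();               // mutating the copy...
// ...leaves the original untouched: payload.values is still [3, 1, 2]
```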

Kathy Van Stone
+1  A: 

As stated by Grumdrig you can write code like this:

setTimeout(function () { iterateArray(array1); reportDone(1); }, 0);
setTimeout(function () { iterateArray(array2); reportDone(2); }, 0);

But it will still not run concurrently. Here's a general idea of what happens after such timeouts are called:

  • Any code after the setTimeout calls will be run immediately, including returns to calling functions.
  • If there are other timers in queue that are at or past their delay or interval time, they will be executed one at a time.
  • While any timer is running, another might hit its interval/delay time, but it will not be run until the last one is finished.
  • Some browsers give priority to events fired from user interaction such as onclick and onmousemove, in which case the functions attached to those events will execute at the expense of timer accuracy.
  • This will continue until there is an opening (no previously called timers or event handlers requesting execution). Only then will the functions in the example code be run. Again one at a time, with the first one likely but not certainly executing first. Also, I'm venturing a guess that some browsers might impose a minimum delay time, which would make any timers set with a delay of 0 milliseconds be run even later than expected.
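
The "runs immediately, timers wait their turn" behavior in the first point can be seen in a few lines (a sketch; the array is just a way to record ordering):

```javascript
var order = [];
order.push('A');
setTimeout(function () { order.push('C'); }, 0);  // queued, not run yet
order.push('B');
// At this point order is ['A', 'B']; the timer callback appending 'C'
// only runs once the currently executing code has returned.
```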

Obviously there is no performance advantage to running code like this. In every case it will make things take longer to complete. However in cases where a single task is taking so long it freezes the browser (and possibly trips "Script is taking too long" browser warnings), it can be helpful to break it up into smaller faster executing pieces that run sequentially after some delay time, thus giving the browser some time to breathe.
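
One way to sketch that "smaller faster executing pieces" approach (the names iterateInChunks and processItem, and the chunk size, are illustrative, not from the answers above):

```javascript
// Process an array in timer-scheduled slices so the browser can handle
// user events between slices, instead of freezing for the whole run.
function iterateInChunks(array, processItem, onDone) {
  var CHUNK_SIZE = 100; // items handled per timeslice (tune to taste)
  var index = 0;
  function nextChunk() {
    var end = Math.min(index + CHUNK_SIZE, array.length);
    for (; index < end; index++) {
      processItem(array[index], index);
    }
    if (index < array.length) {
      setTimeout(nextChunk, 0); // give the browser room to breathe
    } else if (onDone) {
      onDone();
    }
  }
  nextChunk();
}
```

For example, iterateInChunks(bigArray, renderRow, done) would keep the page responsive while bigArray is walked, at the cost of a longer total running time, exactly as described above.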

Web Workers have been mentioned, and if you are not concerned about IE compatibility then you can use them for true concurrency. However there are some severe limitations on their use imposed for security reasons. For one they cannot interact with the DOM in any way, meaning any changes to the page still must be done synchronously. Also all data passed to and from workers is serialized in transit, meaning true Javascript objects cannot be used. That being said, for intensive data processing, Web Workers are probably a better solution than breaking a function up into multiple timer delayed tasks.

MooGoo
Would it help to do: var tId1 = setTimeout(function () { iterateArray(array1); reportDone(1); }, 0); var tId2 = setTimeout(function () { iterateArray(array2); reportDone(2); }, 0);
mplungjan
The only difference I see is the assignment of timer ids to variables. All the timer ids are really good for is canceling said timers. Kinda pointless for run-once 0ms timeouts.
MooGoo
A: 

I have to agree with MooGoo; I also wonder why you would run through such a big array in one go.

There's an extension to JavaScript called StratifiedJS; it allows you to do multiple things at once as long as they're asynchronous. Web Workers, by contrast, are an awkward "solution" that just makes things more complicated, and they don't work in IE.

In StratifiedJS you could just write:

waitfor {
   // do something long lasting here...
}
and {
  // do something else at the same time...
}
// and when you get here, both are done
tomg