My large JavaScript application is now pushing 35,000 function calls every 4 seconds. Performance is still OK (on a 1.6 GHz Atom), but is there a point at which browsers will stop coping?

+4  A: 

There is always a limit to any finite system.

It's impossible to say what that number will be for you because there are too many variables here, not least what your functions are actually doing. I would hazard a guess, though, that if you need to make roughly 9,000 calls a second, it is unlikely to be a good thing. Have you considered refactoring some of this behaviour out, or pushing it to another [server-side] layer that might handle the workload better?
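For example, instead of doing the expensive work on every call, you could queue cheap records and handle them in one pass a few times a second. A minimal sketch of that batching idea, assuming the work can be deferred (record, processBatch, and the 250 ms flush interval are all illustrative names, not anything from the question):

var queue = [];

function record(item) {
  queue.push(item); // O(1) per call; the real work happens later
}

function processBatch(batch) {
  // stand-in for the expensive work, done once per batch
  console.log('processing ' + batch.length + ' items');
}

setInterval(function () {
  if (queue.length === 0) return;
  var batch = queue; // swap in a fresh queue so new calls
  queue = [];        // don't interleave with processing
  processBatch(batch);
}, 250); // flush ~4 times a second instead of thousands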

annakata
A: 

Most browsers will stall (JavaScript runs on the same thread as the UI, so a long-running script freezes the page), and many will prompt the user with an option to kill a script that blocks for too long (the timeout depends on the browser). This may be an issue for you.

One option that may help your situation (I don't know the details of what you're trying to do) is to periodically release control back to the browser. Example:

function do_everything() {
  foo(); // takes a while
  bar(); // also takes a while
}

Could be converted to

function do_everything() {
  foo();
  setTimeout(bar, 10); // delay 10 ms
}

This gives the browser room to breathe, but it is not a universal solution. For instance, do_everything() will now return before bar has run (bar happens at least 10 ms later). And since JavaScript has no 'sleep' or 'yield', you can't pause execution mid-function without a busy loop, which would lock up the browser.
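If the work is one long loop rather than two discrete functions, the same trick can be applied in slices, with a completion callback so the caller still finds out when everything is done. A rough sketch, assuming the work can be expressed per-item (processInChunks, the 100-item slice size, and the done callback are illustrative, not part of the question):

function processInChunks(items, workFn, done) {
  var i = 0;
  function chunk() {
    var end = Math.min(i + 100, items.length); // 100 items per slice
    while (i < end) {
      workFn(items[i]);
      i++;
    }
    if (i < items.length) {
      setTimeout(chunk, 10); // yield so the browser can repaint and handle input
    } else if (done) {
      done(); // signal completion, since the caller cannot block and wait
    }
  }
  chunk();
}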

etoleb