views: 100
answers: 4
I serve a web page which makes the client do quite a lot of JavaScript work as soon as it hits. The amount of work is proportional to the amount of content, which varies a lot.

In cases where there is a huge amount of content, the work can take so long that clients will issue their users with one of those "unresponsive script - do you want to cancel it?" messages. In cases with hardly any content, the work is over in the blink of an eye.

I have included a feature where, in cases where the content is larger than some value X, I include a "this may take a while" message to the user which is displayed before the hard work starts.

The trouble is choosing a good value for X since, for this particular page, Chrome is so much faster than Firefox, which in turn is faster than IE. I'd like to warn all users when appropriate, but avoid putting the message up when it's only going to be there for 100ms, since that is distracting. In other words, I'd like the value of X to also depend on the browser's JavaScript capabilities.

So does anyone have a good way of figuring out a browser's capabilities? I'm currently considering just branching on which browser it is, but that seems hacky, and I guess there are other factors involved.

+2  A: 

This may not be where you want to go, but do you have a good idea of why the JavaScript can take so long? Is it downloading a bunch of content over the wire, or is the actual formatting/churning in the browser the slow part?

You might even be able to do something incrementally so that, while the whole shebang still takes a long time, users see the content 'build' and thus don't have to be warned.

n8wrl
+3  A: 

If the data is relatively homogeneous, one method might be to have a helper function that times how long a small subset of the data takes to process, and then makes a conservative estimate of how long the entire set will take.

From there, decide whether to display the message or not.
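In code, the idea might look something like this minimal sketch, where items, processItem, and showSlowWarning are hypothetical stand-ins for the page's real data and helpers:

    // Hypothetical sketch: process a small sample, time it, extrapolate.
    function estimateTotalMs(items, processItem, sampleSize) {
        var n = Math.min(sampleSize, items.length);
        if (n === 0) return 0;
        var start = new Date().getTime();
        for (var i = 0; i < n; i++) {
            processItem(items[i]);
        }
        var perItem = (new Date().getTime() - start) / n;
        return perItem * items.length * 1.5; // pad 50% to stay conservative
    }

    // Show the "this may take a while" message only when warranted:
    if (estimateTotalMs(items, processItem, 20) > 2000) {
        showSlowWarning();
    }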

Beska
Exactly what I was thinking.
Matt Sach
A: 

Why not just let the user decide what X is? (e.g. like those "display 10 | 20 | 50 | 100" per page choosers) Then you don't have to do any measurement/guesswork at all; you can let them make the optimal latency / information content tradeoff.
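A minimal sketch of such a chooser, assuming a hypothetical renderItems function and a #toolbar element to hang it on:

    // Hypothetical sketch: let the user pick how many items to render.
    var sizes = [10, 20, 50, 100];
    var chooser = document.createElement('select');
    for (var i = 0; i < sizes.length; i++) {
        var opt = document.createElement('option');
        opt.value = sizes[i];
        opt.appendChild(document.createTextNode(sizes[i] + ' per page'));
        chooser.appendChild(opt);
    }
    chooser.onchange = function () {
        renderItems(parseInt(this.value, 10)); // re-render with the chosen limit
    };
    document.getElementById('toolbar').appendChild(chooser);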

Jason S
A: 

This is somewhat misleading: usually when one discusses a browser's JS capabilities, one is referring to the actual abilities of the browser, such as whether it supports native XMLHTTP, whether it supports ActiveX, and so on.

Regardless, there is no way to reliably deduce the processing power or speed of a browser. One might think you could run a quick stress test, compare the result against a list of past performances to see where the current user's browser ranks, and use that to arrive at an estimated time. The problem is that such measurements are influenced by whatever else is happening in the browser (or on the OS itself): you run your profiling script just as the user's AV scanner kicks off because it's 5pm, and what normally takes 2s takes 20s.
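For what it's worth, the kind of stress test one might try looks like the sketch below. It suffers from exactly the interference just described, and the score thresholds are invented numbers, not measured ones:

    // Crude benchmark sketch, for illustration only.
    function speedScore() {
        var start = new Date().getTime();
        var x = 0;
        for (var i = 0; i < 1000000; i++) {
            x += i % 7;
        }
        var ms = new Date().getTime() - start + 1; // +1 avoids divide-by-zero
        return 1000000 / ms; // iterations per millisecond
    }

    // e.g. warn at a smaller content size X when the browser scores slowly:
    var X = speedScore() > 50000 ? 10000 : 2000;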

One thing to ask yourself is: does this processing have to take place right NOW? As n8wrl and Beska alluded to, you might need to code your own method whereby you break the work up into chunks and then operate on them one at a time using something like setTimeout(), as sketched below. This gives the engine time to 'breathe', and thus hopefully avoids the 'unresponsive script' warnings. Each chunk could also update a progress bar (or similar) that gives the user some indication that work is being done.
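A minimal sketch of that chunking pattern, where items, processItem, and updateProgress are hypothetical names:

    function processInChunks(items, chunkSize, onDone) {
        var i = 0;
        function doChunk() {
            var end = Math.min(i + chunkSize, items.length);
            for (; i < end; i++) {
                processItem(items[i]);
            }
            updateProgress(i, items.length); // e.g. drive a progress bar
            if (i < items.length) {
                setTimeout(doChunk, 0); // yield so the engine can 'breathe'
            } else if (onDone) {
                onDone();
            }
        }
        doChunk();
    }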

Or you could take an approach like GMail's: flash a very small, red "Loading..." text area in the corner of the window. Sometimes it's there for a few seconds; sometimes it's not there long enough to read; other times it blinks on and off several times. But you know when it's doing something.
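A tiny sketch of that kind of indicator, assuming a hypothetical #loading element that is hidden by default:

    function withLoadingIndicator(work) {
        var el = document.getElementById('loading');
        el.style.display = 'block';    // show it before starting
        setTimeout(function () {       // give the browser a chance to paint it
            work();
            el.style.display = 'none'; // fast work just blinks; slow work lingers
        }, 0);
    }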

Lastly, also on the point of incrementally 'building' the page, you could inspect the source of Chrome's new tab page. Note: you can't view it using "view source"; instead, choose the "JavaScript console" option (while on the new tab page) and look at the HTML source there. There should be a comment that explains their general strategy, like this:

<!-- This page is optimized for perceived performance. Our enemies are the time
 taken for the backend to generate our data, and the time taken to parse
 and render the starting HTML/CSS content of the page. This page is
 designed to let Chrome do both of those things in parallel.

 1. Defines temporary content callback functions
 2. Fires off requests for content (these can come back 20-150ms later)
 3. Defines basic functions (handlers)
 4. Renders a fast-parse hard-coded version of itself (this can take 20-50ms)
 5. Defines the full content-rendering functions

 If the requests for content come back before the content-rendering functions
 are defined, the data is held until those functions are defined. -->
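A rough sketch of that last point, holding the data until the rendering functions are defined; all the names here are hypothetical, not Chrome's actual code:

    var pendingData = null;
    var renderContent = null; // the full renderer, defined later (step 5)

    function onContentArrived(data) { // temporary callback (step 1)
        if (renderContent) {
            renderContent(data);      // renderer ready: draw immediately
        } else {
            pendingData = data;       // not yet: hold the data
        }
    }

    // Later, once the fast hard-coded version has rendered (step 4),
    // the real renderer is defined and any held data is flushed:
    renderContent = function (data) { /* build the full page from data */ };
    if (pendingData) {
        renderContent(pendingData);
        pendingData = null;
    }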

Not sure if that helps, but I think it does give insight into how some of the big players handle challenges such as this.

ken