I am currently performance tuning a web application and have been doing some research into what is considered 'good' performance. I know this often depends on the application being built, the target audience, and many other factors, but I wondered whether people follow a general set of rules.

With tuning there is always the risk that the job never ends, so at some point you have to make a call on when to stop. But when is that? When can we be happy the job is done?

To kick off the discussion, I have been using the following rules, based on Jakob Nielsen's report on response times (http://www.useit.com/alertbox/response-times.html), which says:

The 3 response-time limits are the same today as when I wrote about them in 1993 (based on 40-year-old research by human factors pioneers):

0.1 seconds gives the feeling of instantaneous response — that is, the outcome feels like it was caused by the user, not the computer. This level of responsiveness is essential to support the feeling of direct manipulation (direct manipulation is one of the key GUI techniques to increase user engagement and control — for more about it, see our Principles of Interface Design seminar).

1 second keeps the user's flow of thought seamless. Users can sense a delay, and thus know the computer is generating the outcome, but they still feel in control of the overall experience and that they're moving freely rather than waiting on the computer. This degree of responsiveness is needed for good navigation.

10 seconds keeps the user's attention. From 1–10 seconds, users definitely feel at the mercy of the computer and wish it was faster, but they can handle it. After 10 seconds, they start thinking about other things, making it harder to get their brains back on track once the computer finally does respond.

A 10-second delay will often make users leave a site immediately. And even if they stay, it's harder for them to understand what's going on, making it less likely that they'll succeed in any difficult tasks.

Even a few seconds' delay is enough to create an unpleasant user experience. Users are no longer in control, and they're consciously annoyed by having to wait for the computer. Thus, with repeated short delays, users will give up unless they're extremely committed to completing the task. The result? You can easily lose half your sales (to those less-committed customers) simply because your site is a few seconds too slow for each page.
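
For what it's worth, one rough way to see where your own pages fall against those three limits is the browser's Navigation Timing API. The sketch below is only an illustration (it assumes a reasonably modern browser and measures a full page load rather than individual interactions):

    // Check where the current page load falls against the 0.1 s / 1 s / 10 s
    // limits described above. Times are in milliseconds.
    window.addEventListener("load", () => {
      // loadEventEnd is only recorded once the load handlers finish,
      // so defer the measurement by one tick.
      setTimeout(() => {
        const [nav] = performance.getEntriesByType(
          "navigation"
        ) as PerformanceNavigationTiming[];
        if (!nav) return; // older browsers do not expose navigation entries

        const ms = nav.duration; // navigation start to loadEventEnd
        if (ms <= 100) {
          console.log(`${ms.toFixed(0)} ms: feels instantaneous`);
        } else if (ms <= 1000) {
          console.log(`${ms.toFixed(0)} ms: flow of thought preserved`);
        } else if (ms <= 10000) {
          console.log(`${ms.toFixed(0)} ms: attention kept, but the wait is felt`);
        } else {
          console.log(`${ms.toFixed(0)} ms: attention likely lost`);
        }
      }, 0);
    });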

A: 

I would suggest this tool combination for improving performance:

Firefox + Firebug + Page Speed

I use it and it gives you suggestions.

PokemonCraft
+1  A: 

The rules are pretty sensible. Indeed, you should aim for response times of 1 second or less, but sometimes the processing really will take longer (bad design, slow machines, waiting on third parties, intensive data processing, etc.). In that case you can use various tips and tricks to improve the user experience:

  • use caching, both in the browser and for your frequently processed data (sketched after this list)
  • use progressive loading of data via Ajax where possible, with progress indicators to give feedback that things are happening (also sketched after this list)
  • use tools such as Firebug and YSlow to detect potential issues with your HTML design and structure
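
For the caching point, here is a very small sketch of the server-side idea; TtlCache and renderSalesReport are made-up names for illustration, not a particular library:

    // Tiny in-memory cache with a time-to-live, for data that is expensive
    // to produce but changes rarely. Illustrative only.
    class TtlCache<V> {
      private store = new Map<string, { value: V; expires: number }>();

      constructor(private ttlMs: number) {}

      async getOrCompute(key: string, compute: () => Promise<V>): Promise<V> {
        const hit = this.store.get(key);
        if (hit && hit.expires > Date.now()) {
          return hit.value; // serve the cached copy instead of recomputing
        }
        const value = await compute();
        this.store.set(key, { value, expires: Date.now() + this.ttlMs });
        return value;
      }
    }

    // Example: cache a slow report for five minutes. renderSalesReport stands
    // in for whatever expensive processing your application actually does.
    const reportCache = new TtlCache<string>(5 * 60 * 1000);
    // const html = await reportCache.getOrCompute("sales-report", renderSalesReport);

On the browser side, the equivalent is making sure static resources are served with sensible Expires/Cache-Control headers so they are not re-downloaded on every page view.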
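
For progressive loading with a progress indicator, a rough browser-side sketch (the /api/rows endpoint and the element ids are placeholders; fetch is used here in place of a raw XMLHttpRequest):

    // Fetch a large result set one page at a time and render each page as it
    // arrives, updating a simple progress indicator along the way.
    async function loadRowsProgressively(totalPages: number): Promise<void> {
      const spinner = document.getElementById("progress")!;
      const list = document.getElementById("results")!;

      for (let page = 1; page <= totalPages; page++) {
        spinner.textContent = `Loading page ${page} of ${totalPages}...`;
        const response = await fetch(`/api/rows?page=${page}`);
        const rows: string[] = await response.json();
        for (const row of rows) {
          const item = document.createElement("div");
          item.textContent = row;
          list.appendChild(item); // the user sees data long before the last page
        }
      }
      spinner.textContent = "Done";
    }

The point is not the specific code but that the user gets visible progress and usable data early, instead of staring at a blank page until everything is ready.
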
CyberDude