views: 172
answers: 5

We all know that performance is a feature. Slow load times are just about the fastest way to turn me off of a site. But at what point is a web app* fast enough?

  • What is the latency threshold between "Hell with this" slow and "Ahhh..." fast? If this is too application-specific, then how do you determine this threshold for your app?

  • Does this threshold differ for uncached page loads and things like drop-down menus?

  • Does crossing this threshold actually make a statistically significant difference to your users' satisfaction?

  • Does making your application even faster have an effect on your users' satisfaction?

Note that even though I'm using the word threshold, I'm referring to a point of diminishing returns (see the last question): the fuzzy line where the user no longer really notices or thinks about your load time.


*The web app in this question is any generic business-class application. I'm thinking of something like Salesforce that you wouldn't generally choose by yourself but have to interact with on a daily basis, but feel free to answer with specifics on your own application.

** This question will evolve as necessary - I realize I'm asking some things that are difficult (at best) to answer.

+13  A: 

This question is at least partly subjective, but I found some numbers here:

  • Zona Research said in 1999 that you could lose up to 33% of your visitors if your page took more than 8 seconds to load.
  • Akamai said in 2006 that you could lose up to 33% of your visitors if your page took more than 4 seconds to load on a broadband connection.
  • Tests done at Amazon in 2007 revealed that for every 100ms increase in load time, sales would decrease 1%.
  • Tests done at Google in 2006 revealed that going from 10 to 30 results per page increased load time by a mere 0.5 seconds, but resulted in a 20% drop in traffic.

This link can also be useful: Response Time: Eight Seconds, Plus or Minus Two

Rubens Farias
wow! that google stat is mind blowing.
Hogan
Fantastic information. This is exactly the sort of numbers I'm looking for.
sh-beta
+1  A: 

I've always used this document as a guide.

http://developer.yahoo.com/performance/rules.html

I think you have to look at your userbase to really understand some of these questions. E.g. Facebook has tons of JavaScript on a page but fewer "graphics". Other pages have heavy graphics and less JavaScript. Your target market will decide what pieces you need in your application and how performance is going to play a part.

Mech Software
A: 

As a thought, even pages/apps that are already fast feel better when they get faster. For instance, I never noticed a problem with load times at http://docs.jquery.com until they updated it.

Point: I don't think your "threshold" exists in any absolute sense; all speed improvements are beneficial. I think the place to stop is some point of diminishing returns, not some absolute time in milliseconds.

cobbal
Except in those applications where you have a hard limit of 33 milliseconds. =)
Crashworks
+5  A: 

Our rule is that if a page takes longer than 1 second to render then we have a serious problem. Now, to be clear, I'm talking about when the client has DSL or better. Typically, our page times are in the 150ms to 200ms range.

Proper coding with the right amount of hardware should always result in a site that performs well.

You should note that there are a ton of things that might interfere. Network conditions are a big one. If the client's network is dog slow or not provisioned correctly (meaning they have 100 people sharing a T1) then there isn't much you can do. However, you do have control over your own code and, typically, your side of the network equation.
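To make a budget like the one above concrete, here is a minimal, illustrative sketch of a render-time check. The 1-second ceiling and 200 ms target are the numbers from this answer; the handler and its sleep are hypothetical stand-ins for real page work (templates, database calls, and so on):

```python
# Illustrative latency-budget check. The thresholds come from the answer
# above; render_page() is a hypothetical stand-in for real page rendering.
import time

RENDER_BUDGET_SECONDS = 1.0   # "serious problem" ceiling
RENDER_TARGET_SECONDS = 0.2   # typical page time cited above

def render_page():
    # Placeholder for real work: templates, DB calls, etc.
    time.sleep(0.05)
    return "<html>...</html>"

start = time.perf_counter()
body = render_page()
elapsed = time.perf_counter() - start

assert elapsed < RENDER_BUDGET_SECONDS, f"page took {elapsed:.3f}s"
if elapsed > RENDER_TARGET_SECONDS:
    print(f"warning: {elapsed * 1000:.0f} ms exceeds the 200 ms target")
```

In a real app the same assertion belongs in a load test or monitoring alert rather than inline code, but the budget-versus-target split is the useful part.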

UPDATE for rockinthesixstring
Things we do to make our web apps scream.

  1. We do NOT use any ORM products. Yes, they will make your development go faster, but there is not an ORM out there that is better at tweaking SQL than we are. Even LINQ to SQL requires you to know a lot about SQL Server in order to use it properly. From our perspective it's just not worth it.

  2. We do NOT use embedded SQL anywhere. All coding is done through s'procs. Besides adding another layer of security, we can very easily tweak the SQL calls in flight without impacting the underlying code. For example, one guy here had a page that started off pretty fast. It was just paging and sorting through some records. However, when we tested it against 100k records (paging 20 at a time) it was taking close to 4 seconds to load each page. Some tweaks to the s'proc and it was back down to 250ms - without redeploying the site.

  3. We do NOT use drag / drop page coding. All of our devs know and understand the things that vastly improve browser rendering performance, such as using table-layout: fixed. Basically we solve the math problems ahead of time. They are fluent in CSS and know the difference between one DOCTYPE and the next.

  4. We do NOT use Session. Most of our apps are load balanced and using session would require an extra 2 database calls per page (save / retrieve). I've yet to run into a situation where that is necessary.

  5. We DO use CSS and JavaScript compressors. Every byte counts at large scales (thousands of users or more).

  6. We DO follow the KISS rules. For example, unless there is a very damn good reason we do NOT use web services; instead we go the REST route for any ajax. And I've never seen a good reason for WCF. Most developers I know who have used it end up gutting most of the "security" features just to get it to work reliably.

  7. We take the time to tweak IIS for performance. Little things like making sure pages are properly compressed. Also, images, style sheets, and JavaScript are properly marked for client-side caching. YSlow (Firefox plugin) is your friend here; anything less than an A rating in one of their categories means you need to evaluate it.

  8. All third party libraries are evaluated on several things: do they actually do what we want; how much larger do they make my page; is there a better way? One prime example is DevExpress. At least last year (not sure of any changes in the last 8 months) their client portion resulted in 1MB of javascript being downloaded. Again, not worth it.

  9. We tend to use very few images. It's amazing what you can do to style a button using a little bit of css.

  10. We also tend to minimize the use of javascript. We only use it where we get the most bang for the buck. Yes, we do have some pages that do drag/drop; and others that use Ajax. However, most users just don't care, so those things don't need to be everywhere.
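The paging story in point 2 has a common root cause worth sketching: OFFSET-based paging scans and discards all the skipped rows, so deep pages get slower and slower, while keyset ("seek") paging uses an indexed WHERE clause and costs the same at any depth. This is not the answerer's actual stored procedure - it's an illustrative sketch using sqlite3 for portability, with made-up table and function names:

```python
# Hypothetical example: OFFSET paging vs keyset paging over 100k rows.
# In SQLite (as in most databases) OFFSET walks past every skipped row,
# while "WHERE id > ?" is a primary-key index seek.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT)")
conn.executemany(
    "INSERT INTO orders (customer) VALUES (?)",
    [(f"customer-{i}",) for i in range(100_000)],
)

PAGE_SIZE = 20

def page_offset(page_number):
    """Naive paging: cost grows with the offset."""
    return conn.execute(
        "SELECT id, customer FROM orders ORDER BY id LIMIT ? OFFSET ?",
        (PAGE_SIZE, page_number * PAGE_SIZE),
    ).fetchall()

def page_keyset(last_seen_id):
    """Keyset paging: an index seek, roughly constant cost at any depth."""
    return conn.execute(
        "SELECT id, customer FROM orders WHERE id > ? ORDER BY id LIMIT ?",
        (last_seen_id, PAGE_SIZE),
    ).fetchall()

# Both strategies return the same rows for the same logical page.
assert page_offset(2500) == page_keyset(2500 * PAGE_SIZE)
```

The same rewrite works in a SQL Server stored procedure, which is why it could be fixed "in flight" without redeploying the site.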

Chris Lively
Would you be able to give some generic ideas on what you do to increase speed on the code side?
rockinthesixstring
@rockin: Take a look at Google's Page Speed plugin, and Yahoo's YSlow plugin, both for Firefox. And their corresponding blog posts. Very helpful information on making the client-side experience as fast as possible.
DisgruntledGoat
@Chris - Thanks for the detailed information. I'm sure many users will benefit from this. I just tested my CMS against YSlow and got a "C" rating... I guess I still have some work to do.
rockinthesixstring
Great info, thank you.
sh-beta
+1  A: 

A side point that may or may not be interesting: the best way to develop is to build the site first. Sure, make sure you have a decent design, but don't spend too much time on page load and speed issues. Once the site is done, go back and fix the slow spots. This will focus your work on the spots that actually matter, and you will be much more efficient in making fast websites. With more experience you will know before you start what to do, but for a beginner this is what I recommend.
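The "find the slow spots afterwards" workflow above amounts to profiling: measure the whole request path and let the numbers point at what to fix, rather than guessing up front. Here is a minimal sketch with Python's built-in profiler; the handler and its helpers are hypothetical placeholders for a real page:

```python
# Illustrative profiling sketch: fetch_records/render/handle_request are
# made-up stand-ins for a real request path.
import cProfile
import io
import pstats

def fetch_records():
    return [{"id": i} for i in range(50_000)]

def render(records):
    return "".join(f"<li>{r['id']}</li>" for r in records)

def handle_request():
    return render(fetch_records())

profiler = cProfile.Profile()
profiler.enable()
handle_request()
profiler.disable()

buf = io.StringIO()
stats = pstats.Stats(profiler, stream=buf)
stats.sort_stats("cumulative").print_stats(5)  # top 5 offenders by total time
report = buf.getvalue()
print(report)
```

Whatever sits at the top of that cumulative-time list is the "slow spot" worth optimizing first; everything below it is usually not worth touching.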

Hogan