views:

99

answers:

5

Hi there!

I'm working on a web app (Rails 3 based), and I really don't like the time it takes to generate a page - depending on the displayed data it takes up to 2.5 and sometimes even 4 seconds.

So I was just wondering: what is an average, reasonable time for generating a page in your apps? Say you check the generation time, e.g. it's 750ms, and you think "OK, that should be fine even without caching". Or you see 1.5s and think "Oh my God, the user won't wait that long and will leave the site".
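(To be clear, the number I'm talking about is the per-request total that Rails 3 prints at the end of each request in the log, something like this:)

```
Completed 200 OK in 2480ms (Views: 1900.2ms | ActiveRecord: 410.5ms)
```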

+1  A: 

There's no 'right' answer to this - the faster the better. Personally I normally aim for < 200ms, although I know from experience that this can be quite difficult to achieve in Rails on anything but simple apps. Try to figure out where your bottlenecks are and cache what you can.
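For the caching part, a minimal Rails 3 sketch (the controller, model, cache key and expiry here are hypothetical, not from the question):

```ruby
# Cache the result of an expensive query so repeat requests skip the slow work.
# Product, the query and the 10-minute expiry are assumptions for illustration.
class ProductsController < ApplicationController
  def index
    @products = Rails.cache.fetch("products/index", :expires_in => 10.minutes) do
      Product.includes(:category).order("created_at DESC").limit(50).to_a
    end
  end
end
```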

Edit: There seems to be some confusion between page generation time and page render time. Obviously a quick page render is the goal, and on most sites things like reducing HTTP requests and gzipping CSS/JS are where you can get most of your quick wins. But if the page itself can take 4-5 seconds to generate, then you're probably right that your app is where you should start.
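For the gzip part of those quick wins, one option in a Rails 3 app (an assumption about your setup; many people do this at the web-server level instead) is Rack's built-in deflater middleware:

```ruby
# config/application.rb - compress HTML/CSS/JS responses at the Rack layer.
config.middleware.use Rack::Deflater
```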

Tim Fountain
I think one of the most important measures is the user's **perceived** time. Lots of optimizations currently applied on websites don't touch the code at all, but rather focus on asynchronous loading of JavaScript and parallel downloading of resources.
samy
A: 

Measure your Apdex score and see how the app is performing. That will give you a rough indication. From there, you can decide how you want to improve performance.

It also depends on what your site is: an internal system application for a business, or software as a service (SaaS)? If it's a system application, the users are forced to use it, so performance can be negotiated. If it is SaaS, then the lower your Apdex score, the more chance you have of losing your users' interest.

There are a few gems out there that measure performance and report your Apdex score.

Here's a little more info: http://apdex.org/blog/?p=630
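For reference, the Apdex formula itself is simple; a rough sketch (the threshold and sample times below are made up for illustration):

```ruby
# Apdex = (satisfied + tolerating / 2) / total samples,
# where "satisfied" means response time <= T and "tolerating" means <= 4T.
def apdex(times_in_seconds, t = 0.5)
  satisfied  = times_in_seconds.count { |x| x <= t }
  tolerating = times_in_seconds.count { |x| x > t && x <= 4 * t }
  (satisfied + tolerating / 2.0) / times_in_seconds.size
end

apdex([0.2, 0.4, 1.1, 2.5, 6.0])  # => 0.5 for these made-up samples
```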

Coderama
+1  A: 

It depends on whether nothing is displayed for 2.5-4 seconds, or whether the user already sees (part of) the page right away and it only finishes loading completely after 2.5-4 seconds. In the latter case the user doesn't experience a 2.5-4 second load. Take the http://www.nytimes.com/ website; I see most of it right away, but according to the Web Inspector it takes 1.94 seconds for it to be loaded completely.

And keep in mind that the speed will also depend on the browser, computer, and internet connection. What's fast for you might be slower for others.

Alec
+3  A: 

There's a huge amount of research data regarding the time from query to rendering and the user's experience. I'd recommend reading this useit.com article. After all, Google integrated page speed into its search rankings for a reason ;)

The 3 response-time limits are the same today as when I wrote about them in 1993 (based on 40-year-old research by human factors pioneers):

  • 0.1 seconds gives the feeling of instantaneous response — that is, the outcome feels like it was caused by the user, not the computer. This level of responsiveness is essential to support the feeling of direct manipulation (direct manipulation is one of the key GUI techniques to increase user engagement and control — for more about it, see our Principles of Interface Design seminar).
  • 1 second keeps the user's flow of thought seamless. Users can sense a delay, and thus know the computer is generating the outcome, but they still feel in control of the overall experience and that they're moving freely rather than waiting on the computer. This degree of responsiveness is needed for good navigation.
  • 10 seconds keeps the user's attention. From 1–10 seconds, users definitely feel at the mercy of the computer and wish it was faster, but they can handle it. After 10 seconds, they start thinking about other things, making it harder to get their brains back on track once the computer finally does respond.

A 10-second delay will often make users leave a site immediately. And even if they stay, it's harder for them to understand what's going on, making it less likely that they'll succeed in any difficult tasks.

As a rule of thumb, always aim for a balance between optimization time and time gained. Don't spend days optimizing the hell out of one routine when your images aren't compressed correctly or your scripts/CSS aren't combined. Yes, faster is better, but a 90% gain in page generation from setting up a smart cache beats a 10% gain after a week of tweaking an algorithm.
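As an example of that kind of big, cheap win in Rails 3 (the controller, model and expiry below are hypothetical, not from the answer), action caching serves the stored response and skips the whole view layer:

```ruby
# A minimal action-caching sketch; PagesController, Page and the 1-hour expiry
# are assumptions for illustration.
class PagesController < ApplicationController
  caches_action :show, :expires_in => 1.hour

  def show
    @page = Page.find(params[:id])
  end
end
```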

Also, don't read too much into the first render time, when the framework still has to load everything; instead use stress testing, with and without caching, to simulate various situations.
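A very rough way to do that from Ruby (the URL and request count are made up; a real load-testing tool would give better numbers):

```ruby
# Hit one URL repeatedly and report the average response time, throwing away
# the first request so framework warm-up doesn't skew the result.
require "net/http"
require "benchmark"

url = URI.parse("http://localhost:3000/")      # hypothetical endpoint
times = Array.new(51) { Benchmark.realtime { Net::HTTP.get_response(url) } }
times.shift                                     # drop the cold start
avg_ms = times.inject(0.0) { |sum, t| sum + t } / times.size * 1000
puts "average: #{avg_ms.round} ms over #{times.size} requests"
```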

Now, some data: some of the latest sites I worked on used DotNetNuke, a huge open-source CMS, and ASP.NET MVC, where you're nearer to the metal. Average page time with average DB queries was 600-700 milliseconds for DotNetNuke. For ASP.NET MVC it's 70-100 milliseconds... Users really like the second one :)

samy
A: 

My personal rule - no page should take more than 0.05 seconds, or you are in trouble.

As long as you write proper code, you don't need to spend much time on optimization to stay under 0.05 seconds.

If you stick to giant frameworks, then you are out of luck.

BarsMonster
Wow, you've got sites that are that fast? Kudos! You must cache the hell out of your DB requests, I guess. What are you working on, a bare-metal framework?
samy
Out of curiosity, can you personally tell the difference between 0.1 seconds and 0.05 seconds? If so, does the difference really bother you? I am asking only because 0.05 seems like an incredibly arbitrary number.
sosborn
Yes, it does matter once you start to get real load. It's not bare metal, it's just lightweight, with memory caches.
BarsMonster