views:

154

answers:

5

I have a number of data-driven, web-based applications that serve both internal and public users, and I would like to gauge how fast you would expect a page to be created (in milliseconds) in order to maintain user satisfaction and scalability.

So, how fast does a page have to be created to maintain a fast site?

The sites are developed in ASP classic, with a SQL Server backend generating XML recordsets that I render using XSLT. Not the most efficient technique, and pages take between 7ms and 120ms to create (i.e. the Timer interval between the first line of code and the 'Response.Write'), depending on the complexity of the page. Slower pages are due to the database running bigger and more complex queries. Even if I re-wrote all the ASP classic as ASP.NET, there would not be any significant improvement to the overall page render speed.
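The timing approach described above can be sketched as follows. This is a hypothetical illustration, not the actual site's code: `buildPage` stands in for the real query-plus-XSLT rendering work, and `Date.now()` plays the role of ASP's Timer.

```javascript
// Hypothetical sketch of measuring page-creation time: the interval between
// the first line of page code and the moment the response is written.
// buildPage is a stand-in for the real database + XSLT rendering work.
function buildPage(complexity) {
  const parts = [];
  for (let i = 0; i < complexity; i++) {
    parts.push(`<tr><td>row ${i}</td></tr>`); // simulated render work
  }
  return `<table>${parts.join("")}</table>`;
}

function timedRender(complexity) {
  const start = Date.now();             // analogous to Timer at the top of the page
  const page = buildPage(complexity);
  const elapsedMs = Date.now() - start; // interval up to the Response.Write equivalent
  return { page, elapsedMs };
}
```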

I've often heard Jeff say he wants SO to be the fastest site, and his blogs have discussed optimisation of his code and database, but how far do you have to go in optimising your code? Is shaving off milliseconds by using StringBuilder instead of String + String a good use of my time?

[Clarification]

At what point do you start to think "this page is taking too long to create"? Is it over 20ms, over 200ms, or is it OK for a page to take over a second to build? What are your target times?

A: 

If you can shave off milliseconds by just changing one thing, go for it!

You might want to have a look into caching database requests as well.

Charlie Somerville
+2  A: 

Users don't care how fast you prepare your data; they only care about the actual loading time of the page.

If you have a lot of overhead in rendering, your users will perceive your site as slow. Concerning classic ASP, string concatenation is considered very bad practice, since it gets really slow once you hit the critical string length where it starts to be a burden on the server.

Using an array (JScript) or a .NET StringBuilder can improve the rendering time significantly. It also sheds unnecessary CPU usage, which would allow your server to handle more traffic; I would say that kind of obvious optimisation is well worth doing.
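The array technique mentioned above can be sketched like this: instead of repeated concatenation, which reallocates the growing string on every append, collect the fragments in an array and join once at the end (the JScript equivalent of .NET's StringBuilder).

```javascript
// Repeated concatenation: the growing string is reallocated on each append,
// which degrades badly as the output gets long.
function renderWithConcat(rows) {
  let html = "";
  for (const row of rows) {
    html += "<tr><td>" + row + "</td></tr>";
  }
  return html;
}

// Array-join technique: accumulate fragments, then build the final string once.
function renderWithJoin(rows) {
  const parts = [];
  for (const row of rows) {
    parts.push("<tr><td>" + row + "</td></tr>");
  }
  return parts.join("");
}
```

Both functions produce identical output; only the allocation behaviour differs.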

jishi
Agreed that avoiding the very well documented string-concatenation issues with VB/ASP is well worth it; it's the reason I use ADO's rs.Save and XSLT to do all the heavy lifting.
Guy
+4  A: 

This depends entirely on your audience and targets - I've worked on apps with a target 'onload' event at <4secs, and on apps where the time on server is expected to be <1ms. It can go either way - but whatever you do, you need to be aware that any performance optimisations you make at the server side are likely to be dwarfed by network performance, still the major bottleneck on the web, and by perceived load times.

Yahoo has some excellent guidelines for general website performance, especially on the perceived-load area.

Hopefully you're already smart enough to be caching what you can and doing the little things like avoiding massive chains of Response.Writes...

annakata
<snipped from the Yahoo document> "Flush the Buffer Early: When users request a page, it can take anywhere from 200 to 500ms for the backend server to stitch together the HTML page. During this time, the browser is idle as it waits for the data to arrive. A good place to consider flushing is right after the HEAD, because the HTML for the head is usually easier to produce, and it allows you to include any CSS and JavaScript files for the browser to start fetching in parallel while the backend is still processing."
Guy
(sorry about the formatting above) - The Yahoo document is very good, and it does seem to be the case that you can improve the overall perception of performance by spending quality time on the "networking" aspect of the site. The actual HTML page render time is relatively trivial...
Guy
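The "flush the buffer early" idea quoted above can be sketched as follows. This is a hypothetical illustration: `write` stands in for a flush-capable response writer (e.g. classic ASP's Response.Write followed by Response.Flush), and `buildBody` stands in for the slow backend work.

```javascript
// Hypothetical sketch of flushing early: send the <head> (with its CSS/JS
// references) to the client immediately, so the browser can start fetching
// assets while the slow backend work for the body is still running.
function renderPage(write, buildBody) {
  write("<html><head><link rel='stylesheet' href='site.css'></head>"); // flushed first
  const body = buildBody(); // slow database/XSLT work happens after the head is sent
  write("<body>" + body + "</body></html>");
}
```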
+2  A: 

A very interesting screencast on this topic can be found here: link text.

Although it is made by a Rails guy, it is perfectly applicable to other frameworks.

Tarscher
One of the reasons I raised this question was all the flak people give to "slow" frameworks like classic ASP, Rails, Ruby etc.; I wanted to discuss how fast it actually needs to be.
Guy
A: 

One factor that affects user satisfaction with server response time is how often the user requests a new page. If you're presenting, say, a page with lots of information that the user is going to spend some time reading, a longer "rendering" time is OK. In contrast, if the person is quickly navigating through pages, they will want a near-instantaneous response.

For example, if you're on a news site, you'll probably be OK with a full second or two for the next page, since you're going to spend 30 seconds reading it.

On the other hand, if you're browsing through an interactive map, you probably want the response to be less than a second.

Toybuilder