I want to improve the page load times of a web site. It is a web application (think something like web mail) with relatively few users who spend long periods using the site.

As almost all page requests come from users who have already used the site, images, CSS and external JavaScript resources will have been cached by the browser during previous requests. Since the browser doesn't need to request those resources again, am I right that the following performance tips won't give me any improvement in response speed?

  • CSS sprites
  • parallel downloading of images using alternative domain names
  • placement of JavaScript includes at the foot of the page

Does anyone have any performance tips that are likely to improve response times for this kind of web site?

+2  A: 

This is a good resource for speeding up web applications: Yahoo: Best Practices for Speeding Up Your Web Site

In order to benefit from client-side caching, make sure to send 'Expires' and 'Cache-Control' headers as explained by these practices.
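For example, a minimal sketch assuming a Node/Express front end (the /static path and 30-day lifetime are placeholders; tune them to your release cycle):

```typescript
import express from "express";

const app = express();

// Serve static assets with far-future caching headers so returning
// users don't re-download (or even re-validate) images, CSS and JS.
app.use(
  "/static",
  express.static("public", {
    maxAge: "30d", // emits Cache-Control: public, max-age=2592000
    setHeaders: (res) => {
      // Explicit Expires header for older HTTP/1.0 caches.
      res.setHeader(
        "Expires",
        new Date(Date.now() + 30 * 24 * 60 * 60 * 1000).toUTCString()
      );
    },
  })
);

app.listen(3000);
```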

Ferdinand Beyer
A: 

I think you need to do a bit of profiling. First determine if all of the browsers you wish to support will actually cache what you want them to cache (examine your network traffic during a request, if need be). Some browsers have funny caching rules. If there are any problems here, fix them!
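One quick client-side check is the Resource Timing API: resources answered from the browser cache typically report a transferSize of 0. This is only a heuristic sketch (support and cross-origin behaviour vary by browser):

```typescript
// List which resources on the current page came from cache vs. the network.
const resources = performance.getEntriesByType(
  "resource"
) as PerformanceResourceTiming[];

for (const entry of resources) {
  // transferSize of 0 with a non-empty body usually means a cache hit.
  const fromCache = entry.transferSize === 0 && entry.decodedBodySize > 0;
  console.log(`${fromCache ? "cached " : "fetched"} ${entry.name}`);
}
```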

Next, check the content that is being retrieved each request. If your HTML/CSS/Javascript is unnecessarily verbose then it's more data to send, and more data for the client to process.

Look at timing on the server side, network side, and client side. See how long the server takes to serve static files, and how long scripts run for. See if the round trip time between servers and clients is reasonably low (in some cases you might want to relocate the server). Also make sure your pages aren't slow to render on the client side, whether it be due to slow scripts, excessive content, or complex rendering.
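For the client-side part, the browser's Navigation Timing API gives a rough breakdown of where page-load time goes. A sketch, assuming a browser that supports the newer navigation entries:

```typescript
// Break the current page load into its main phases.
const [nav] = performance.getEntriesByType(
  "navigation"
) as PerformanceNavigationTiming[];

if (nav) {
  console.log("DNS lookup:      ", nav.domainLookupEnd - nav.domainLookupStart, "ms");
  console.log("TCP connect:     ", nav.connectEnd - nav.connectStart, "ms");
  console.log("Server response: ", nav.responseStart - nav.requestStart, "ms");
  console.log("Download:        ", nav.responseEnd - nav.responseStart, "ms");
  console.log("DOM processing:  ", nav.domComplete - nav.responseEnd, "ms");
}
```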

Artelius
A: 

To truly know what is going on with your clients, you want to use Wireshark (or an equivalent) to watch what transfers occur. This will tell you precisely what a client fetches on the first and on subsequent requests, and which requests take a long time for the client to receive.

Eddie
+1  A: 

What you are saying is that the site is already in the client's cache, so after users take the one-time hit of downloading it, subsequent load times will be negligible, and you want to know how to optimize from there.

In that case, there are a few things I can suggest. First, reduce the number of HTTP requests to the server. This means:

  1. If the client wants to get to Clothing > Men > Shoes, use a fly-out menu to help them get there faster (unless of course you want to force a scenic route).
  2. Load everything in the same HTTP request (a one-time hit for recurring users) and then use JS to hide/display info on demand (see the sketch after this list).
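A rough sketch of point 2, assuming each section is marked up with a data-panel attribute and switched from a menu (all names here are hypothetical):

```typescript
// All panels are delivered in the initial page; only one is shown at a time.
function showPanel(name: string): void {
  document.querySelectorAll<HTMLElement>("[data-panel]").forEach((panel) => {
    panel.style.display = panel.dataset.panel === name ? "block" : "none";
  });
}

// Wire up menu links such as <a href="#" data-target="shoes">Shoes</a>.
document.querySelectorAll<HTMLAnchorElement>("a[data-target]").forEach((link) => {
  link.addEventListener("click", (event) => {
    event.preventDefault();
    showPanel(link.dataset.target!);
  });
});
```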

Second, you can use AJAX so the client doesn't have to repaint the whole screen each time a response comes in from the server.
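For instance, a minimal sketch that refreshes a single region rather than the whole page (the /inbox/unread endpoint and element id are hypothetical):

```typescript
// Replace only the unread-count region instead of reloading the page.
async function refreshUnreadCount(): Promise<void> {
  const response = await fetch("/inbox/unread", { headers: { Accept: "text/html" } });
  if (response.ok) {
    document.getElementById("unread-count")!.innerHTML = await response.text();
  }
}

// Poll every 30 seconds; only a few bytes cross the wire each time.
setInterval(refreshUnreadCount, 30_000);
```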

Third, you can set high cache-expiration times so the client doesn't keep checking for updates any time soon. This will further optimize the already existing caching layer.

BTW, even if images are already cached, I still think hosting them on different servers will result in better performance. The reason is that the browser limits the number of parallel downloads from one domain and it still needs to check if the images are current or expired (using HEAD requests). So hosting them on multiple domains will make that part of the task faster. Could be wrong here though and it's possible the browser handles this particular scenario differently.

aleemb