We use large background images (hi-res photos, up to 700 KB) for our page design. It's part of the experience of the site that as you browse around, you see different images.

At the moment a different (random) image is loaded on each page request, from a pool of ~15 images, which could grow over time.

I'm looking for a sane way to optimize this:

  1. To avoid the user having to download a big image file on every page view
  2. To reduce load on the server (is this an issue? Will the server keep the images in memory?)

The ideas I have so far include:

  • A timer which loads a different image at set intervals
  • Progressively loading other images in the background with ajax (sketched below)
  • Associating images with specific content (pages, tags)

The question is: how do I keep it feeling somewhat random while minimizing page load times and server hits?
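
For what it's worth, here is a rough sketch of the progressive-loading idea in plain JavaScript (the image paths are hypothetical): show a random image immediately, then warm the browser cache for the rest of the pool once the page has finished loading, so later page views can stay random without re-downloading.

    // Hypothetical sketch of the preloading idea (image paths made up)
    var pool = ['/img/bg1.jpg', '/img/bg2.jpg', '/img/bg3.jpg']; // ~15 in practice

    // Pick and show a random background for this page view.
    var current = pool[Math.floor(Math.random() * pool.length)];
    document.body.style.backgroundImage = 'url(' + current + ')';

    // Once the page is done, quietly request the rest of the pool;
    // the browser caches them, so future page views hit the cache.
    window.addEventListener('load', function () {
      for (var i = 0; i < pool.length; i++) {
        if (pool[i] !== current) {
          new Image().src = pool[i];
        }
      }
    });

Note this only pays off if the images are cacheable in the first place; see the answers below about cache headers.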

+1  A: 

I usually avoid sites with huge images; I am very impatient. I would rethink your design.

Mike Curry
No worries, this is a CRUD interface for a business database... the users will have no choice.
Paul
If it is an internal site (internal LAN), the images should load up instantaneously. If that is the case, I wouldn't worry about preloading.
Jon
The design is non-negotiable, and it's not a CRUD interface!?
meleyal
+1  A: 

As a first step you should make sure that the images can be properly cached:

  • use sane URLs (no session IDs, etc.)
  • set appropriate HTTP caching headers (ETag, Cache-Control)
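
For example, a minimal sketch of both points, assuming a Node/Express server (an assumption; the question doesn't name a stack):

    // Minimal sketch, assuming Node/Express: serve the images from
    // stable URLs with long-lived cache headers. express.static sends
    // an ETag by default, and maxAge sets the Cache-Control header.
    var express = require('express');
    var app = express();

    app.use('/images', express.static('public/images', {
      maxAge: '7d', // Cache-Control: public, max-age=604800
      etag: true    // allows cheap revalidation once the cache expires
    }));

    app.listen(3000);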
ebo
+1  A: 

Firstly, hearing that the background images alone are up to 700 KB astounds me. In addition to the content ON screen, that is a pretty heavy site.

For starters, I would try image compression tools. Two tools come to mind: ImageMagick and PNGCrush. PNGCrush is excellent at stripping extraneous metadata from photos without compromising photo quality.

I recommend this because compressing the images means the user downloads less content, which means quicker load times, which, at the end of the day, is what users want.

I would also cache the images, so that when a user revisits the site the images are already cached on their end. This minimises the HTTP requests made on each visit. An example of this technique on a commercial site is www.reactive.com: if you look at the /js/headerImages.js file, they make use of image caching. Funnily enough, you will find the same source code at: http://javascript.internet.com/miscellaneous/random-image.html

Considering that you have mentioned the images are randomly loaded, I am assuming you are using a JavaScript library such as jQuery to create the effect. If you are, you can minimize page load times by loading jQuery from a CDN rather than referencing a local copy stored on your server. I performed performance testing on a site I made for a client and, over an average of 20 hits, saved 1.6 seconds through this technique!
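
For instance, a common pattern (a sketch; the local path is hypothetical) is to load jQuery from Google's CDN and fall back to a local copy if the CDN is unreachable:

    <!-- jQuery from a public CDN, with a hypothetical local fallback -->
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
    <script>
      window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
    </script>

Note the fallback only helps when the CDN is unreachable; a slow CDN response will still delay the page.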

Hope that helps for now :)

Lycana
I've had the experience that loading external scripts slowed down the page while waiting for the server to respond, though I guess it depends on where it's hosted.
meleyal