We are looking at adjusting our web pages so that requests for static content are split across sub-domains. In order to do this we must:
- Always serve the same content from the same sub-domain so it remains cached
- Try to serve roughly the same amount of content from each sub-domain
- Do this automatically on a per-page basis, i.e. avoid hand-coding every image reference and maintaining that mapping
- Come up with a solution that will work in our dev/test/production environments
- Avoid impacting page processing times on the server
We build our HTML pages with a J2EE application server, and I am trying to find the best way to implement this across all our web pages.
As far as I can see, if we do this while rendering the HTML page, we will need a list of images (with sizes) stored somewhere that we can reference while we build the page. When an image is about to be written to the HTML page, we would need to determine how many images have already been served from each sub-domain for this user at that point in page processing, assign the image to a sub-domain accordingly, and somehow remember that assignment for the user session (or longer).
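To make that concrete, here is a rough sketch of the stateful version I have in mind. Class, method, and host names are hypothetical, and the balancer instance would be stored in the HttpSession:

```java
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;

// Sketch only: tracks, per user session, which sub-domain each image
// was assigned to, so repeat references stay on the same host.
public class SessionImageBalancer implements Serializable {

    private static final String[] SUBDOMAINS = {
        "static1.example.com", "static2.example.com", "static3.example.com"
    };

    // image path -> sub-domain already assigned for this user
    private final Map<String, String> assignments = new HashMap<String, String>();
    // per-sub-domain count of images assigned so far
    private final int[] counts = new int[SUBDOMAINS.length];

    /** Returns a stable host for this image within the session. */
    public synchronized String hostFor(String imagePath) {
        String host = assignments.get(imagePath);
        if (host == null) {
            // Pick the least-loaded sub-domain and remember the choice
            // so the same image keeps the same host for this user.
            int min = 0;
            for (int i = 1; i < counts.length; i++) {
                if (counts[i] < counts[min]) {
                    min = i;
                }
            }
            counts[min]++;
            host = SUBDOMAINS[min];
            assignments.put(imagePath, host);
        }
        return host;
    }
}
```

A custom JSP tag or EL function could then call hostFor() whenever it writes out an img element, but the per-session map is exactly the bookkeeping I am worried about.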
Or am I over-complicating this? Would it be easier to maintain a list of the images on each page of the site, determine (offline) which sub-domain each image lives on, and just use that assignment consistently for all users?
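For example, something as simple as hashing the image path would give every user the same assignment with no bookkeeping at all. A minimal sketch, with placeholder host names:

```java
// Stateless alternative: hash the image path so the same image always
// maps to the same sub-domain for every user, with no session state.
public final class StaticHostResolver {

    private static final String[] SUBDOMAINS = {
        "static1.example.com", "static2.example.com", "static3.example.com"
    };

    public static String hostFor(String imagePath) {
        // Mask the sign bit rather than using Math.abs, which breaks
        // on Integer.MIN_VALUE.
        int bucket = (imagePath.hashCode() & Integer.MAX_VALUE) % SUBDOMAINS.length;
        return SUBDOMAINS[bucket];
    }
}
```

Since String.hashCode() is specified by the Java API, the mapping would be identical on every server and survive restarts, and the host list could come from per-environment configuration to cover our dev/test/production setups. The trade-off is that balance is only approximate per page rather than exact.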
I was also thinking there might be a way to do this in the browser using JavaScript, but I suspect the overhead on every page render would be too high.
I know the topic is a bit fuzzy, but if anyone has implemented this I'd love to hear about your experience.
Thanks!