views:

37

answers:

2

Hi

I am interested in the possibility of reducing HTTP requests to the server by sending different kinds of content in a single compressed file, then decompressing it in the client's browser and placing the pieces (images, CSS, JS) where they belong.

I read somewhere that Firefox is working on a plan to offer such a feature in a future release, but it has not been done yet, and it would not be standard anyway.

Can you suggest any solution for this? Can Flash be used to decompress compressed files on the client side for later use?

Thanks

+2  A: 

I had to read your question a few times before I got what you were asking. It sounds like you want to basically combine all the elements of your site into a single downloadable file.

I'm fairly confident in saying I don't believe this is possible or desirable.

Firstly, you state that you've heard that Firefox may be supporting this. I haven't heard about that, but even if they do, how will you be able to use the feature while still supporting other browsers?

Even if you could do it: you've tagged this as 'performance-tuning', on the grounds that you'll be saving a few HTTP requests. But in your effort to save HTTP requests, you need to be careful that you don't actually end up slowing things down.

Combining all the files may cut you down to one HTTP request, but your site may then load more slowly, because the whole thing would need to download before any of it could be displayed (as opposed to a normal page load, which may take just as long overall but can start displaying parts of the page quite quickly).

What you can do right now, and what will be useful for reducing HTTP requests, is combine your stylesheets into a single CSS file, your scripts into a single JS file, and groups of related images into single image files (google 'CSS sprites' for more on this technique).
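
As a rough illustration of the concatenation part (CSS sprites are a separate, image-level technique), here is a minimal Node.js build sketch; the file names are hypothetical and a real build would also minify the output:

```javascript
var fs = require('fs');

// Only bundle files that are shared across all pages, so the combined
// file stays cacheable between page loads (see the caching note below).
var cssFiles = ['reset.css', 'layout.css', 'theme.css'];
var jsFiles  = ['lib.js', 'site.js'];

function concat(files) {
  return files.map(function (f) {
    return fs.readFileSync(f, 'utf8');
  }).join('\n');
}

fs.writeFileSync('combined.css', concat(cssFiles));
fs.writeFileSync('combined.js', concat(jsFiles));
```

The pages would then reference combined.css and combined.js instead of the individual files.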

Even then, you need to be careful about which files you combine - the point of the exercise is to reduce HTTP requests, so you need to take advantage of caching, or you'll end up making things worse rather than better. Browsers can only cache files that are the same across multiple pages, so you should only combine files that won't change between page loads. For example, only combine the JavaScript files that are used across all the pages of your site.

My final comment would be to re-iterate what I've already said: Be cautious about over-optimising to the point that you actually end up slowing things down.

Spudley
@Spudley, I don't agree that this is impossible or undesirable. We do it on our web app, and we don't regret it.
Mic
@Mic: it would be interesting if you shared the details of your implementation with the outside world.
Volatil3
Have a look at the demo at http://beebole-apps.com?demo
Mic
+1  A: 

We did more or less what you describe in our web app and are extremely happy with the response time.

The original files are all kept separate (HTML, CSS, JS, images), and we develop against them.
Then, when moving to production, we have a shell script that:

  • compresses the CSS and JS with the YUI Compressor
  • reads all the images and converts them to data:image/png;base64,... URIs
  • strips all blank space and comments from the HTML
  • puts all these resources inline in the HTML

The page is ~300 KB and is usually cached.
The server gzips it, so the actual size travelling over the network is lower.
We don't use any additional compression.
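
For illustration, here is a rough sketch of the inlining step, written in Node.js rather than shell; the file names are hypothetical and the YUI Compressor pass is assumed to have run already:

```javascript
var fs = require('fs');

// Hypothetical inputs: the development HTML plus already-minified assets.
var html = fs.readFileSync('index.src.html', 'utf8');
var css  = fs.readFileSync('app.min.css', 'utf8');
var js   = fs.readFileSync('app.min.js', 'utf8');

// Replace the external references with inline <style> and <script>
// blocks so the page becomes a single self-contained document.
html = html
  .replace('<link rel="stylesheet" href="app.css">', '<style>' + css + '</style>')
  .replace('<script src="app.js"></script>', '<script>' + js + '</script>');

// The web server then serves this single file gzipped, as described above.
fs.writeFileSync('index.html', html);
```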

And then there is a second call to get the data (JSON for us) and start rendering it client-side.
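
Something like this, as a sketch of that second call (the URL, the data shape, and the target element are all hypothetical):

```javascript
var xhr = new XMLHttpRequest();
xhr.open('GET', '/data.json', true);
xhr.onreadystatechange = function () {
  if (xhr.readyState === 4 && xhr.status === 200) {
    var data = JSON.parse(xhr.responseText);
    // Hand the data to the client-side templating; a trivial stand-in here.
    document.getElementById('content').innerHTML = '<h1>' + data.title + '</h1>';
  }
};
xhr.send();
```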

Mic
Are you using DOM methods to create nodes for HTML rendering, or setting `innerHTML`?
Volatil3
We use http://beebole.com/pure to render all the JSON in HTML. PURE uses `innerHTML`. I tried the DOM first but it was very slow, while `innerHTML` is fast on all browsers. This is particularly striking on mobile browsers.
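
For a rough idea of the difference (this is not PURE's own API, just the general pattern), compare creating nodes one by one with building a string and assigning it once via `innerHTML`; the answer above found the second approach much faster:

```javascript
var items = [{ name: 'Alpha' }, { name: 'Beta' }, { name: 'Gamma' }]; // hypothetical data
var target = document.getElementById('list');                        // hypothetical element

// DOM approach: create and append each node individually.
items.forEach(function (item) {
  var li = document.createElement('li');
  li.appendChild(document.createTextNode(item.name));
  target.appendChild(li);
});

// innerHTML approach: build one string and assign it in a single step
// (this replaces whatever the element contained before).
target.innerHTML = items.map(function (item) {
  return '<li>' + item.name + '</li>';
}).join('');
```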
Mic
And is the base64 encoding done at runtime, or have all the images been converted beforehand?
Volatil3
Not at runtime. The script reads the binary file and injects it inside the CSS url(...).
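
As a sketch of that build-time step in Node.js (file names and the CSS reference are hypothetical):

```javascript
var fs = require('fs');

// Read the raw image bytes and build a data URI out of them.
var png = fs.readFileSync('img/logo.png');
var dataUri = 'data:image/png;base64,' + png.toString('base64');

// Swap the file reference in the stylesheet for the inline data URI.
var css = fs.readFileSync('app.css', 'utf8')
  .replace('url(img/logo.png)', 'url(' + dataUri + ')');

fs.writeFileSync('app.inline.css', css);
```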
Mic