The browser is probably closer to being CPU-constrained than network-constrained, right? We have a very heavy AJAX application, so from the browser's perspective (not the server's), perhaps it would be better not to use compression.

What do you think?

+1  A: 

Well, in general CPU cycles are pretty cheap compared to network speed, and decompression is not going to take too many CPU cycles unless you're trying to uncompress MBs of data.

But I guess in the end it depends on the people who are going to use the website. If you are sure that they have really fast internet connections, then maybe you don't have to use compression. On the other hand, you can be reasonably sure that they have pretty decent CPUs.

trex279
+5  A: 

It entirely depends on the browser and device. On a normal laptop or desktop at home I'd expect the network to be the limiting factor. At a well-connected office it may be CPU if there's any real bottleneck at all.

Then you've got netbooks, potentially over 3G, and then mobile phones...

Personally I'd go for compression - I think it's more likely to be a win than a bottleneck.

Jon Skeet
+2  A: 

My clients tend to be small offices sharing an internet connection, and bandwidth is a major consideration. Our application serves up rather large pages, so compression made a huge difference.

It depends on your bandwidth, the number of users, and the size of your pages.

Compression and decompression are pretty optimised, and you can control the compression level.
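
For instance, here is a minimal sketch of that size/speed trade-off using Python's standard zlib module (the file name is just a placeholder for one of your larger pages):

    import time
    import zlib

    # Placeholder: a large HTML/JSON response saved to disk for testing.
    with open("big_response.html", "rb") as f:
        payload = f.read()

    for level in (1, 6, 9):  # fastest, default, best compression
        start = time.perf_counter()
        compressed = zlib.compress(payload, level)
        elapsed = time.perf_counter() - start
        print(f"level {level}: {len(payload)} -> {len(compressed)} bytes "
              f"in {elapsed * 1000:.1f} ms")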

MikeW
+1  A: 

The most common reason for not using compression is the CPU load it puts on the server (for dynamic pages). Usually that is by far the bigger concern, so I assume the CPU load on an average desktop PC is negligible (unless you want to use bz2).

Joachim Sauer
A: 

If the compression puts a heavy load on the server and/or the compression ratio is poor (e.g. a 20K page gets compressed to 19K), then I would consider switching it off. Unless your users have a very fast, low latency connection to the server, the browser will almost certainly be network-bound.

I would also consider HTTP caching techniques - they are quite efficient, even more so with AJAX. This saves bandwidth and CPU cycles on both the server and client side - no need to re-request, re-generate, re-transfer, re-parse, and re-save what you already have.
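
To illustrate the idea (a minimal sketch only, with a made-up JSON payload; a real application would use its framework's caching support), a handler can hash the response into an ETag and answer conditional requests with 304 Not Modified so the browser reuses its cached copy:

    import hashlib
    from http.server import BaseHTTPRequestHandler, HTTPServer

    BODY = b'{"data": "example AJAX payload"}'  # placeholder response
    ETAG = '"%s"' % hashlib.md5(BODY).hexdigest()

    class CachingHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Client already has this version: send headers only, no body.
            if self.headers.get("If-None-Match") == ETAG:
                self.send_response(304)
                self.end_headers()
                return
            self.send_response(200)
            self.send_header("ETag", ETAG)
            self.send_header("Cache-Control", "max-age=60")  # revalidate after 60s
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(BODY)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), CachingHandler).serve_forever()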

Piskvor
A: 

See Yahoo's best practices for making web pages fast, especially the conclusion of the "Gzip Components" rule:

Gzipping as many file types as possible is an easy way to reduce page weight and accelerate the user experience.

gimel
A: 

Decompression is a very fast process compared to compression. Even a weak CPU (like one in a mobile device) can decompress GZIP (ZIP, Deflate, and so on) in almost no time at all. Compressing the data is a much harder job: by using compression you are increasing the server load, and that increase is not always negligible. Usually it is a trade-off between bandwidth usage and CPU usage on the server side.

For the client it usually plays no big role, unless the Internet access is very slow (e.g. a mobile phone connection). However, only text/HTML/CSS/JS can be compressed effectively; most other data you'll find on web pages (tons of images, for example) cannot be compressed at all. So if we are talking about a page with 8 KB of web page data that itself loads 200+ KB of image data, forget about compression - it will buy you virtually nothing.
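
A quick way to see the difference for yourself (the inputs are only stand-ins: repetitive markup versus random bytes, which behave much like an already-compressed JPEG/PNG):

    import gzip
    import os

    # Stand-in for markup/JSON: repetitive text compresses very well.
    text = b"<div class='item'>some repeated markup</div>\n" * 2000
    # Stand-in for an image: random bytes are essentially incompressible.
    binary = os.urandom(len(text))

    for label, data in (("text", text), ("image-like", binary)):
        ratio = len(gzip.compress(data)) / len(data)
        print(f"{label}: {len(data)} bytes -> {ratio:.0%} of original size")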

Mecki
A: 

Any script or content the browser downloads will only need decompressing as it comes down the wire, and from then on it is already in a decompressed form (the only exception is if you are doing something strange with HTTP headers, like Cache-Control: no-store).

It will decompress faster than it can download - the bottleneck is very unlikely to be the decompression.

Even a slightly older, sluggish computer could probably decompress GZIP at more than 50 MB per second, which would need a network speed of at least 400Mbps (that's 400,000kbps) to saturate.
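
A rough back-of-the-envelope check of that claim (numbers will vary by machine; the input is just repetitive filler text standing in for scripts and markup):

    import gzip
    import time

    # Roughly 50 MB of compressible filler text.
    original = b"var x = 'some typical script and markup content';\n" * 1000000
    compressed = gzip.compress(original)

    start = time.perf_counter()
    gzip.decompress(compressed)
    elapsed = time.perf_counter() - start

    mb = len(original) / 1e6
    print(f"decompressed {mb:.0f} MB in {elapsed:.2f} s "
          f"(~{mb / elapsed:.0f} MB/s, ~{mb * 8 / elapsed:.0f} Mbit/s equivalent)")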

I'd be more concerned about buggy browsers like IE6 (pre-SP2), which ask for compressed content but cannot deal with it in some situations, such as when you compress CSS.

thomasrutter