views: 41
answers: 3

Hello.

I'm developing a website and using gzip.exe to pre-compress my CSS and JS files. There is only one CSS file, which went from 4.53 KB to 1.50 KB, and one JS file containing jQuery and some scripts, which went from 72.8 KB to 24.7 KB.

These files are style.gz and js.gz and are served as static files.

The problem is that neither the CSS nor the JS works in Safari (v5.0.2).

They work fine in these browsers:

Firefox 3.6.10 / Google Chrome 6.0.4... / IE 8 x64 / Flock 3.0.6 / Maxthon 2.5.15... / Avant Browser 2010

All of them work except Safari.

I'm using Windows 7 x64.
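The pre-compression step described above can be sketched like this (style.css is a placeholder name for the asker's stylesheet; the sample content is invented for illustration):

```shell
# Create a sample stylesheet, then pre-compress it at the highest level.
# The -c flag writes to stdout, so the uncompressed original is kept
# alongside the .gz copy that the server will serve as a static file.
printf 'body { color: red; }\n' > style.css
gzip -9 -c style.css > style.css.gz
```

Naming the result style.css.gz rather than style.gz keeps the original extension visible, which some server configurations rely on to pick the right Content-Type.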

A: 

There is a way around this issue. Basically, you need to use .jgz instead of .gz as the extension for JavaScript files:

Fix for .gz and Safari

Michael Goldshteyn
Thank you, Michael. The .jgz extension works for the JS now, but the CSS still doesn't, even with .jgz or .cssgz.
mohamed87
Did you try .css.jgz? Hopefully that will resolve your remaining issue.
Michael Goldshteyn
+2  A: 

You should get the server to gzip them; this can be done in the .htaccess file and works just fine.
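On an Apache shared host, a minimal .htaccess sketch of this approach (assuming mod_deflate is enabled, which is typical on shared hosting) could look like:

```apache
# Compress text assets on the fly; requires mod_deflate to be loaded.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```

The IfModule guard keeps the site working if the host disables the module.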

rob_james
Yes, I already know that, but I want to pre-compress them since they are static; compressing on the fly may consume some server resources on a shared hosting plan.
mohamed87
Fair enough, but I suspect you wouldn't notice the difference.
rob_james
Jumping back in: I'm fairly sure I read somewhere that once the initial gzipping has been done, the result is cached in memory on the server, so future requests are served from there rather than being recompressed every time. Coupled with heavy caching and long expires headers, you really shouldn't notice. P.S. You should use "deflate" rather than gzip if your server allows it, as it's more efficient.
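The "long expires headers" mentioned above could be set in .htaccess like this (a sketch assuming mod_expires is available; the one-year lifetime is an arbitrary example):

```apache
# Tell browsers to cache static assets for a long time,
# so repeat visitors never re-request (or re-compress) them.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType text/css "access plus 1 year"
  ExpiresByType application/javascript "access plus 1 year"
</IfModule>
```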
rob_james
+1  A: 

+1 What rob said.

What you're doing is a hack that is not supposed to work, so you can't really complain when it fails. Serving a gzip file as a resource in its own right is completely different from serving a resource of its normal type with a Content-Encoding header saying it is compressed on the wire.

Serving pre-gzipped files will obviously also fail for user agents that don't understand gzip. The HTTP standard provides a mechanism for negotiating this: Accept-Encoding and Content-Encoding. Unless you re-implement that mechanism in a complete, standards-compliant way (and the article in Michael's link doesn't begin to get that right), you're locking some clients out.
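For the record, a standards-aware version of what the asker is attempting, serving the pre-compressed copy only to clients that advertise gzip support, can be sketched in .htaccess (assuming mod_rewrite and mod_headers are enabled; the filename pattern is illustrative):

```apache
<IfModule mod_rewrite.c>
  RewriteEngine On
  # Serve foo.css.gz / foo.js.gz only when the client accepts gzip
  # and a pre-compressed copy actually exists on disk.
  RewriteCond %{HTTP:Accept-Encoding} gzip
  RewriteCond %{REQUEST_FILENAME}.gz -f
  RewriteRule ^(.+)\.(css|js)$ $1.$2.gz [L]
</IfModule>

<IfModule mod_headers.c>
  # Restore the real media type and declare the wire encoding.
  <FilesMatch "\.css\.gz$">
    ForceType text/css
    Header set Content-Encoding gzip
    Header append Vary Accept-Encoding
  </FilesMatch>
  <FilesMatch "\.js\.gz$">
    ForceType application/javascript
    Header set Content-Encoding gzip
    Header append Vary Accept-Encoding
  </FilesMatch>
</IfModule>
```

The Vary header matters: without it a shared cache could hand the gzipped body to a client that never asked for it.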

This is all unnecessary. Gzip is fast and servers typically cache the compressed version of static content so you gain nothing by trying to second-guess it. HTTP compression is part of the web server's core competency, let the web server do its job and leave gzip alone.

bobince
Thanks for your reply, bobince. You may be right, but don't you think that compressing large files of 100 KB might consume some resources and be noticeable, especially on a shared hosting plan? And are you sure that the compressed JS or CSS file will be cached and served over and over without being recompressed every time a page is requested?
mohamed87