It's easy to use the pre-compression module to look for a pre-compressed .gz version of a page and serve it to browsers that accept gzip, avoiding the overhead of on-the-fly compression. What I would like to do is remove the uncompressed version from disk and store only the compressed one, which would obviously be served the same way; but if a user agent that does not support gzip requests the page, I would like nginx to uncompress it on the fly before transmitting it.

Has anyone done this, or are there other high-performance web servers that provide this functionality?

A: 

One option is to have a fall-back upstream server to decompress the file, e.g.:

gzip_static on;
...
upstream decompresser {
    server localhost:8080; # script which will decompress the file
}

location / {
    try_files $uri @decompress;
}

location @decompress {
    proxy_pass http://decompresser;
}
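
For reference, a minimal sketch of what such a decompressing backend might look like, assuming the pre-compressed files live under a document root like /var/www/html and that the script listens on localhost:8080 to match the upstream block above (all names and paths are illustrative, not part of the original answer):

#!/usr/bin/env python3
# Hypothetical decompressing backend for the @decompress fallback.
# Looks up the stored .gz copy of the requested URI and serves it
# uncompressed to the client.
import gzip
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

DOCROOT = "/var/www/html"  # assumed document root; adjust to your layout

class DecompressHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Map the request URI onto the pre-compressed file on disk.
        name = self.path.lstrip("/") or "index.html"
        src = os.path.normpath(os.path.join(DOCROOT, name)) + ".gz"
        if not src.startswith(DOCROOT) or not os.path.isfile(src):
            self.send_error(404)
            return
        with gzip.open(src, "rb") as f:
            body = f.read()  # decompress on the fly
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), DecompressHandler).serve_forever()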

Another option would be to use the embedded Perl module as the fall-back rather than the upstream; however, this can block nginx worker processes, and if the decompression takes a while it could hurt performance.

With the upstream model you may be able to take advantage of nginx's X-Sendfile support (the X-Accel-Redirect header) by using the system's default gzip program to decompress to a file in the /tmp directory. This could save per-request decompression overhead by letting the decompressed file hang around for a short while.
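A rough sketch of that variant, assuming an internal nginx location such as location /_decompressed/ { internal; alias /tmp/decompressed/; } and using Python's gzip module in place of shelling out to the system gzip binary (every name and path here is an assumption for illustration only):

#!/usr/bin/env python3
# Hypothetical backend that decompresses to /tmp and hands delivery
# back to nginx via X-Accel-Redirect (nginx's X-Sendfile equivalent).
import gzip
import os
import shutil
from http.server import BaseHTTPRequestHandler, HTTPServer

DOCROOT = "/var/www/html"          # assumed document root
CACHE_DIR = "/tmp/decompressed"    # decompressed copies linger here briefly

class SendfileHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        name = self.path.lstrip("/") or "index.html"
        src = os.path.normpath(os.path.join(DOCROOT, name)) + ".gz"
        if not src.startswith(DOCROOT) or not os.path.isfile(src):
            self.send_error(404)
            return
        # Flatten the URI into a single cache filename under /tmp.
        dst = os.path.join(CACHE_DIR, name.replace("/", "_"))
        if not os.path.exists(dst):  # reuse a recently decompressed copy
            os.makedirs(CACHE_DIR, exist_ok=True)
            with gzip.open(src, "rb") as fin, open(dst, "wb") as fout:
                shutil.copyfileobj(fin, fout)
        self.send_response(200)
        # nginx intercepts this header and serves the file itself
        # from the internal /_decompressed/ location.
        self.send_header("X-Accel-Redirect", "/_decompressed/" + name.replace("/", "_"))
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), SendfileHandler).serve_forever()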

digitala