Say I don't have mod_deflate compiled into Apache, and I don't feel like recompiling right now. What are the downsides to a manual approach, e.g. something like:

AddEncoding x-gzip .gz
RewriteCond %{HTTP_ACCEPT_ENCODING} gzip
RewriteRule ^/css/styles.css$ /css/styles.css.gz

(Note: I'm aware that the specifics of that RewriteCond need to be tweaked slightly)
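
For reference, a sketch of what that tweak might look like (untested; the %{HTTP:Accept-Encoding} variable, the file-existence check, and the Vary header are the usual adjustments, and the paths are illustrative):

AddEncoding x-gzip .gz

RewriteEngine On
# mod_rewrite exposes request headers via %{HTTP:Header-Name},
# not %{HTTP_ACCEPT_ENCODING}
RewriteCond %{HTTP:Accept-Encoding} gzip
# only rewrite when the precompressed file actually exists
RewriteCond %{DOCUMENT_ROOT}/css/styles.css.gz -s
RewriteRule ^/css/styles\.css$ /css/styles.css.gz [L]

# tell shared caches that the response varies by Accept-Encoding
<FilesMatch "\.css\.gz$">
    Header append Vary Accept-Encoding
</FilesMatch>

(The Header directive requires mod_headers; with AddEncoding in place, mod_mime should still report text/css as the Content-Type for styles.css.gz, since it interprets each extension separately.)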

+1  A: 

There doesn't seem to be a big performance difference between the manual and automatic approaches. I did some ApacheBench (ab) runs with automatic and manual compression, and both times were within 4% of each other.

The obvious downside is that you'll have to manually compress the CSS files before deploying. The other thing to make very sure of is that the configuration is right: I couldn't get wget to auto-decode the CSS when I tried the manual approach, and the ab reports listed the compressed data size rather than the uncompressed size they showed with automatic compression.

codie
The manual compression might not be so much of an issue - in this case, I'm dealing with files that rarely change, and there's a /possibility/ that the compression could be automated via a server-side script. The info from wget is very useful, though - thanks for the research. I guess I'm going to have to test this myself if I'm actually going to use it, but great info on the performance difference - many thanks.
Bobby Jack
+1  A: 

Another alternative would be to forward everything to a PHP script, which gzips and caches everything on the fly. On every request, it compares timestamps with the cached version and regenerates the cache if the source file is newer. With PHP, you can also set the HTTP headers yourself, so the response is treated exactly as if it had been gzipped by Apache itself.

Something like this might do the job for you:

.htaccess

RewriteEngine On
RewriteRule ^(css/styles\.css)$ cache.php?file=$1 [L]

cache.php:

<?php
// Convert the request parameter to a local file path; the str_replace is
// a minimal guard against directory traversal (may need to be tweaked)
$file = str_replace('..', '', $_GET['file']);
cache($file);

// Return cached or raw file (autodetect)
function cache($file)
{
  // Regenerate cache if the source file is newer
  if (!is_file($file.'.gz') or filemtime($file.'.gz') < filemtime($file)) {
    write_cache($file);
  }

  // PHP defaults to text/html, so set the real Content-Type explicitly
  // (text/css here, since the rewrite only forwards the stylesheet)
  header('Content-Type: text/css');
  header('Vary: Accept-Encoding');

  // If the client supports GZIP, send compressed data
  if (!empty($_SERVER['HTTP_ACCEPT_ENCODING']) and strpos($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip') !== false) {
    header('Content-Encoding: gzip');
    readfile($file.'.gz');
  } else { // Fall back to the uncompressed file
    readfile($file);
  }
  exit;
}

// Save a gzipped version of the file
function write_cache($file)
{
  copy($file, 'compress.zlib://'.$file.'.gz');
}

You will need write permissions for Apache to generate the cached versions. You can also modify the script slightly to store the cached files in a different place, as sketched below.
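
For example, an untested variant of write_cache() that keeps the compressed copies in a dedicated cache/ directory (the directory name and the cache_path() helper are illustrative, not part of the script above):

// Map a source file to its cached path inside a writable cache/ directory,
// flattening slashes so no subdirectories are needed
function cache_path($file)
{
  return 'cache/'.str_replace('/', '_', $file).'.gz';
}

// Save a gzipped version of the file into the cache directory
function write_cache($file)
{
  copy($file, 'compress.zlib://'.cache_path($file));
}

The checks in cache() would then use cache_path($file) wherever they currently use $file.'.gz'.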

This hasn't been extensively tested and it might need to be modified slightly for your needs, but the idea is all there and should be enough to get you started.

Ivan Peevski
You probably want to flip that greater-than sign into a less-than sign. Other than that, looks good.
Alex
Thanks, you are right of course ;)
Ivan Peevski
Yeah, my reference to a server-side script in an earlier comment was me thinking along those lines. Great use of 'compress.zlib://', though; I hadn't come across that before.
Bobby Jack
Thanks :) hopefully it works for you.
Ivan Peevski
A: 

You could also use mod_ext_filter and pipe things through gzip. In fact, it's one of the examples in the mod_ext_filter documentation:

# mod_ext_filter directive to define the external filter
ExtFilterDefine gzip mode=output cmd=/bin/gzip

<Location /gzipped>
    # core directive to cause the gzip filter to be
    # run on output
    SetOutputFilter gzip

    # mod_headers directive to add the
    # "Content-Encoding: gzip" header field
    Header set Content-Encoding gzip
</Location>
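
If only some responses should be compressed, mod_ext_filter's intype parameter restricts a filter to a given media type. An untested sketch (the filter name and location are illustrative):

# define a filter that only runs on text/css responses
ExtFilterDefine gzipcss mode=output cmd=/bin/gzip intype=text/css

<Location /css>
    SetOutputFilter gzipcss
    Header set Content-Encoding gzip
</Location>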

The advantage of this is that it's really, really easy… The disadvantage is that there will be an additional fork() and exec() of /bin/gzip on each request, which will inevitably have some impact on performance.

David Wolever