Hi. I'm looking for a program with really quite specific functionality, which hopefully exists so I don't have to implement it myself. I can best describe this hypothetical program as a filesystem-based cache of compressed copies of documents in the working directory. I've made a mock-up of what I would expect this program to do:

james@pc:~/htdocs/$ tree -a
.
|-- image.png
`-- index.html

0 directories, 2 files

james@pc:~/htdocs/$ zipcache init
Initialized cache in ./.zipcache/
james@pc:~/htdocs/$ tree -a
.
|-- .zipcache
|   |-- gzip
|   |   `-- index.html.gz
|   `-- lzma
|       `-- index.html.lzma
|-- image.png
`-- index.html

1 directory, 3 files
james@pc:~/htdocs/$ zipcache gzip index.html
... zipcache emits gzipped copy of index.html on stdout by cat-ing ./.zipcache/gzip/index.html.gz
james@pc:~/htdocs/$ zipcache lzma index.html
... zipcache emits lzma'd copy of index.html on stdout by cat-ing ./.zipcache/lzma/index.html.lzma
james@pc:~/htdocs/$ zipcache lzma image.png
... zipcache generates an error signifying a cache miss (it's intelligent enough to know that PNG is already compressed and shouldn't be compressed again) ...
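The "init" step in the mock-up above can be sketched as a short shell function. This is a minimal illustration, not an existing tool: the `zipcache_init` name is hypothetical, and it assumes `gzip` and `xz` are installed (xz's `--format=lzma` produces the `.lzma` output shown in the mock-up).

```shell
#!/bin/sh
# Hypothetical sketch of "zipcache init": pre-compress each compressible
# file in the current directory into ./.zipcache/gzip/ and ./.zipcache/lzma/.
zipcache_init() {
  mkdir -p .zipcache/gzip .zipcache/lzma
  for f in *; do
    [ -f "$f" ] || continue
    case "$f" in
      # Skip formats that are already compressed and won't shrink further.
      *.png|*.jpg|*.jpeg|*.gz|*.lzma|*.zip) continue ;;
    esac
    gzip -c "$f" > ".zipcache/gzip/$f.gz"
    xz --format=lzma -c "$f" > ".zipcache/lzma/$f.lzma"
  done
}
```

A later `zipcache gzip index.html` would then reduce to cat-ing `./.zipcache/gzip/index.html.gz`, with a cache miss reported when that file doesn't exist.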

My ultimate concern is caching compressed copies of static files that are repeatedly transferred over HTTP with Content-Encoding enabled. I have no desire to recompute the compression every time a file is requested.

I would still appreciate pointing in the right direction if something vaguely similar to the above exists -- my Google searching has been quite unsuccessful (perhaps there is terminology for the above functionality that I don't know about).

A: 

I guess you could write a fairly simple PHP script for such caching. I'm not sure such a thing already exists.

FractalizeR
A: 

[Admittedly, this is answering the "one up from this" question instead]

Doing this yourself seems like a bad idea; you should let the web server handle it.

I'm guessing you're using apache on a unix variant, but for completeness:
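For Apache, on-the-fly compression is typically handled by mod_deflate. The directives below are standard, but the exact MIME types to compress are illustrative and would depend on your site:

```apache
# Sketch assuming Apache with mod_deflate enabled.
# Compress text-like responses on the fly:
AddOutputFilterByType DEFLATE text/html text/css text/plain application/javascript
# Already-compressed formats such as PNG gain nothing; exclude them:
SetEnvIfNoCase Request_URI \.png$ no-gzip
```

If you specifically want the pre-compressed-file behaviour from the question (compress once, serve many times), nginx's `gzip_static on;` directive serves an existing `file.gz` alongside `file` when the client sends `Accept-Encoding: gzip`, which is close to the cache being described.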

James Manning