views:

67

answers:

2

I am generating relatively large files using Perl. The files I am generating are of two kinds:

  1. Table files, i.e. textual files I print line by line (row by row), which contain mainly numbers. A typical line looks like:

    126891 126991 14545 12

  2. Serialized objects I create then store into a file using Storable::nstore. These objects usually contain some large hash with numeric values. The values in the object might have been packed to save on space (and the object unpacks each value before using it).

Currently I'm usually doing the following:

use IO::Compress::Gzip qw(gzip $GzipError);

# create normal, uncompressed file ($out_file)
# ...

# compress file using gzip
my $gz_out_file = "$out_file.gz";
gzip $out_file => $gz_out_file or die "gzip failed: $GzipError";

# delete uncompressed file
unlink($out_file) or die "can't unlink file $out_file: $!";

This is quite inefficient, since I first write the large file to disk, and then gzip reads it back and compresses it. So my questions are as follows:

  1. Can I create a compressed file without first writing an uncompressed file to disk? Is it possible to create a compressed file sequentially, i.e. printing line by line as in scenario (1) described earlier?

  2. Does gzip sound like an appropriate choice? Are there any other recommended compressors for the kind of data I have described?

  3. Does it make sense to pack values in an object that will later be stored and compressed anyway?

My considerations are mainly saving on disk space and allowing fast decompression later on.

+7  A: 
  1. You can use IO::Zlib or PerlIO::gzip to tie a file handle to compress on the fly.

  2. As for what compressors are appropriate, just try several and see how they do on your data. Also keep an eye on how much CPU/memory they use for compression and decompression.

  3. Again, test to see how much pack helps with your data, and how much it affects your performance. In some cases, it may be helpful. In others, it may not. It really depends on your data.
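To illustrate point 1, here is a minimal sketch of compressing on the fly with IO::Compress::Gzip's object interface (a third option alongside IO::Zlib and PerlIO::gzip; the filename `table.gz` and the sample row are just illustrative): the object behaves like a filehandle, so you print rows into it exactly as in the question's scenario (1), and no uncompressed file ever touches the disk.

```perl
use strict;
use warnings;
use IO::Compress::Gzip qw($GzipError);

# Open a gzip stream directly on the target file ...
my $gz = IO::Compress::Gzip->new("table.gz")
    or die "IO::Compress::Gzip failed: $GzipError";

# ... and print line by line, as with an ordinary filehandle.
print $gz "126891 126991 14545 12\n";

$gz->close or die "close failed: $GzipError";
```

With PerlIO::gzip the same effect is achieved via a layer on open, e.g. `open(my $fh, '>:gzip', 'table.gz')`.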

bdonlan
Just out of curiosity, is there a PerlIO layer with the same functionality?
Hynek -Pichi- Vychodil
I believe there's a PerlIO layer for gzip, at least...
Øyvind Skaar
Looks like there is: http://search.cpan.org/~nwclark/PerlIO-gzip-0.18/gzip.pm
bdonlan
+2  A: 

You can also open() a filehandle to a scalar instead of a real file, and use this filehandle with IO::Compress::Gzip. I haven't actually tried it, but it should work. I use something similar with Net::FTP to avoid creating files on disk.

Since v5.8.0, Perl has built using PerlIO by default. Unless you've changed this (i.e., Configure -Uuseperlio), you can open filehandles directly to Perl scalars via:

open($fh, '>', \$variable) || ..

from open()
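A sketch of that idea (untested in the answer above, and the filename `table.gz` is just an example): build the content in memory through a scalar-backed filehandle, then pass a reference to the scalar to the one-shot gzip function, which accepts buffer references as input.

```perl
use strict;
use warnings;
use IO::Compress::Gzip qw(gzip $GzipError);

# Print into an in-memory scalar instead of a file on disk ...
my $buffer = '';
open(my $fh, '>', \$buffer) or die "can't open in-memory handle: $!";
print $fh "126891 126991 14545 12\n";
close $fh;

# ... then gzip that scalar straight into the target file.
gzip \$buffer => "table.gz" or die "gzip failed: $GzipError";
```

The same pattern covers the Storable case from the question: Storable's nfreeze returns the serialized image as a scalar, which can be handed to gzip the same way.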

Øyvind Skaar
+1 Thanks for the nice idea!
David B