134 views, 4 answers
What are the tradeoffs of the different compression algorithms?

The purpose is backup, transfer & restore. I don't care about popularity, as long as a mature enough tool exists for unix. I care about

  • time
  • cpu
  • memory
  • compression level

The algorithms I am considering are:

  • zip
  • bzip
  • gzip
  • tar
  • others?
+2  A: 

Tar is not a compression algorithm per se; it only bundles files into a single archive, which is then compressed by another tool.

Use zip/gzip when compression/decompression speed is the most important issue.

Use bzip2 when you need a better compression ratio.

Use LZMA when you need an even better compression ratio and can afford more CPU time.

Have a look here.
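The advice above can be sketched with Python's stdlib `tarfile` module, which pairs tar bundling with each of these compressors via the mode suffix (the directory and file names below are made up for illustration):

```python
import os
import tarfile

# Build a tiny sample directory to archive (hypothetical names).
os.makedirs("mydir", exist_ok=True)
with open("mydir/notes.txt", "w") as f:
    f.write("back me up\n")

# tar itself only bundles files; the mode suffix selects the compressor.
with tarfile.open("backup.tar.gz", "w:gz") as tf:   # gzip: fast, modest ratio
    tf.add("mydir")
# "w:bz2" -> bzip2 (slower, better ratio); "w:xz" -> LZMA (slowest, best ratio)

# Restore side: reopen the archive and list its contents.
with tarfile.open("backup.tar.gz", "r:gz") as tf:
    print(tf.getnames())
```

The same three mode suffixes work for reading (`"r:gz"`, `"r:bz2"`, `"r:xz"`), so the backup and restore paths stay symmetric.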

subtenante
This comparison site is outdated; there have been a lot of changes to LZMA and 7-Zip since 2005.
bill
A: 

It would be quite simple to create a testbed for those cases.

Write a script that runs each tool in turn on a set of files representative of those you wish to compress, and measure the time, CPU, memory usage, and compression ratio achieved.

Rerun them a statistically significant number of times, and you'll have your answer.
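A minimal sketch of such a testbed, using Python's stdlib bindings for the three codecs (the repetitive sample payload is a stand-in — swap in files representative of your real backup data):

```python
import bz2
import gzip
import lzma
import time

# Hypothetical stand-in for "a representative set of files": repetitive
# text, which all three codecs compress well.  Replace with your own data.
data = b"the quick brown fox jumps over the lazy dog\n" * 20000

results = {}
for name, compress in [("gzip", gzip.compress),
                       ("bzip2", bz2.compress),
                       ("xz/LZMA", lzma.compress)]:
    start = time.perf_counter()
    out = compress(data)
    elapsed = time.perf_counter() - start
    results[name] = (len(out), elapsed)
    print(f"{name:8s} {len(out):8d} bytes in {elapsed:.3f} s")
```

Wrap the loop in repeated runs and average the timings to get the statistically significant numbers the answer suggests; memory usage would need an external tool such as GNU `time`.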

Visage
+2  A: 

The best way is to look at compression benchmark sites:

Maximumcompression

Compressionratings

bill
Useful links, thanks.
Nifle
+1  A: 

It usually depends on your input data, but I've never found anything that gives me better general compression than 7-Zip (http://www.7-zip.org).
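7-Zip's default algorithm is LZMA, which Python's stdlib `lzma` module also exposes, so you can probe its speed/ratio dial on your own data before committing; the payload below is illustrative:

```python
import lzma

# Illustrative payload; real backup data will behave differently.
data = b"example backup payload " * 10000

sizes = {}
for preset in (0, 6, 9):  # 0 = fastest, 6 = default, 9 = best ratio
    sizes[preset] = len(lzma.compress(data, preset=preset))
    print(f"preset {preset}: {sizes[preset]} bytes")
```

Higher presets trade CPU time and memory for ratio, which mirrors the gzip/bzip2/LZMA tradeoff discussed in the other answers.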

paxdiablo