
I've created a plain and simple backup script that only backs up certain files and folders.

# back up each tree to a gzipped tarball under $DIRECTORY
tar -zcf $DIRECTORY/var.www.tar.gz /var/www
tar -zcf $DIRECTORY/development.tar.gz /development
tar -zcf $DIRECTORY/home.tar.gz /home

This script runs for about 30 minutes and then gives me the following error:

gzip: stdout: File too large

Are there any other solutions I can use to back up my files with shell scripting, or a way to solve this error? I'm grateful for any help.

+2  A: 

Can the file system you are backing up to support large files?

Specifically, FAT32 has a limit of ~4GB in a single file, and other filesystems have similar limits.

If your backup is running for 30 minutes, the file could easily be reaching that sort of size.
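
A quick way to check, assuming the $DIRECTORY variable from the script above points at the backup target, is a sketch like this:

# show the filesystem type of the backup target (Type column)
df -T "$DIRECTORY"

# optional sanity check: try to create a sparse 5GB file on the target;
# on FAT32 (or any filesystem with a ~4GB limit) this should fail
dd if=/dev/zero of="$DIRECTORY/bigfile.test" bs=1 count=0 seek=5G
rm -f "$DIRECTORY/bigfile.test"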

Adrian
Likewise, do your versions of `tar` and `gzip` support large files?
Rob Kennedy
He mentioned NTFS and ext4 in his response to WhirlWind's comment.
Mike Pelley
A: 

Use a different compression utility, such as compress or bzip2.
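
For example, with GNU tar the -z flag can be swapped for -j to compress with bzip2. This is only a sketch using the paths from the question, and note that it changes the compressor, not the filesystem's file size limit:

# bzip2-compressed archive instead of gzip (-j instead of -z)
tar -jcf $DIRECTORY/var.www.tar.bz2 /var/www

# equivalent form: pipe an uncompressed tar stream through bzip2
tar -cf - /var/www | bzip2 > $DIRECTORY/var.www.tar.bz2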

frankc
+2  A: 

"File too large" is an error message from your libc: the output has exceeded the file size limit of your filesystem.

So this is not a gzip issue.

Options: use another filesystem, or use split:

tar czf - www | split -b 1073741824 - www-backup.tar.

creates the backup in 1 GiB (1073741824-byte) pieces.

Restore it from multiple parts:

cat www-backup.tar.* | gunzip -c | tar xvf -
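
Applied to the script in the question, a sketch (using $DIRECTORY as the backup target and 1 GiB pieces) might look like:

# stream the gzipped tar through split so no single output file exceeds 1 GiB
tar -zcf - /var/www | split -b 1073741824 - "$DIRECTORY/var.www.tar.gz."

# restore by concatenating the pieces and letting tar decompress the stream
cat "$DIRECTORY/var.www.tar.gz."* | tar -zxvf -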
Jürgen Hötzel
He mentions that the filesystems are NTFS and ext4, so I'm not sure why libc would push back. Nonetheless, this sounds right, so +1.
Mike Pelley