tags:
views: 2359
answers: 3

I tar a directory full of JPEG images:

tar cvfz myarchive.tar.gz mydirectory

When I untar the archive:

tar xvfz myarchive.tar.gz

I get this error:

tar: Unexpected EOF in archive

Looking at the output, it fails in the middle of one particular JPEG image.

What am I doing wrong?

A: 

Interesting. I have a few questions which may point out the problem.

1/ Are you untarring on the same platform as you're tarring on? The two machines may be running different versions of tar (e.g., GNU tar vs. an older UNIX tar). If they're different, can you untar on the same box you tarred on?

2/ What happens when you simply gunzip myarchive.tar.gz? Does that work? Maybe your file is being corrupted/truncated. I'm assuming you would notice if the compression generated errors, yes?

Based on the GNU tar source, it will only print that message if find_next_block() returns 0 prematurely, which is usually caused by a truncated archive.
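To separate the two layers the questions above probe, you can check the gzip stream and the tar structure independently. A minimal sketch, using the archive name from the question:

```shell
# gzip -t verifies the integrity of the compressed stream without
# extracting anything; it exits non-zero if the file is corrupted.
gzip -t myarchive.tar.gz && echo "gzip stream OK"

# tar t lists the archive contents; a truncated tar fails partway
# through the listing, which points at the tar layer rather than gzip.
tar tzf myarchive.tar.gz > /dev/null && echo "tar structure OK"
```

If the first check passes but the second fails, the tar stream itself was truncated before compression, matching the find_next_block() behaviour described above.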

paxdiablo
Thanks for your quick answer. 1/ Same machine (for testing purposes): Ubuntu. Eventually, I want to untar on Mac OS X (the error is the same there...). 2/ gunzip works fine. No error when the archive is created...
Cyrille
Okay, remove the offending JPEG temporarily and see what happens. This will tell you whether it's that specific JPEG or tar itself. Also try without the gzip (z) option.
paxdiablo
Ah, you put me on the right track here: the tar archive is truncated. I didn't see any errors in my syslog (the archive is created by a cron job), but something must go wrong. There is plenty of space on the disk (when I create the archive "by hand", it is not truncated)... I'll dig deeper now, thanks for your replies.
Cyrille
Hi there, the tar file is about 8M. I added logging to the cron job and... it solved the problem! My guess is that adding logging slows down the archiving and makes it work... For the record, I have only 243 files in the archive and ulimit -n gives me 1024, so that shouldn't be a problem. Anyway, thank you very much for your help, much appreciated.
Cyrille
@Cyrille, ulimit -n (#files) shouldn't matter since it's unlikely tar would keep them all open at once anyway. I was thinking more of ulimit (ulimit -f is the default) which shows the maximum allowable file *size*. I'd also be uncomfortable with a Heisenbug so I'm willing to work further to get it sorted out (it's hard to imagine that adding logging to the script would affect the tar command at all!) but, if you're happy to let it go, that's fine too. Let me know. Cheers.
paxdiablo
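The two limits discussed in the comments above can be inspected directly. A quick sketch:

```shell
# ulimit -f is the per-process maximum file size (in 512-byte blocks);
# "unlimited" means no cap. A low value here could silently truncate
# a large archive being written.
ulimit -f

# ulimit -n is the open-file-descriptor limit, which tar is unlikely
# to hit since it processes input files one at a time.
ulimit -n
```

Note that cron jobs can run with different limits than an interactive shell, so these are worth checking from inside the cron job itself.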
A: 

Maybe you FTPed the file in ASCII mode instead of binary mode? If not, this might help:

$ gunzip myarchive.tar.gz

And then untar the resulting tar file using

$ tar xvf myarchive.tar

Hope this helps.

Saradhi
+1  A: 

I had a similar problem with truncated tar files being produced by a cron job; redirecting standard out to a file fixed the issue.

According to a colleague, cron creates a pipe and limits the amount of output that can be sent to standard out. I fixed mine by removing -v from my tar command, making it much less verbose and keeping the error output in the same spot as the rest of my cron jobs. If you need the verbose tar output, you'll need to redirect to a file, though.

Jerome
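The workaround above can be sketched as follows, assuming a backup job like the one in the question (paths and log file name are hypothetical):

```shell
# Drop -v so tar produces no listing on stdout, and redirect whatever
# output remains to a log file so cron's pipe never fills up and
# truncates the job mid-write.
#
# Hypothetical crontab entry:
#   0 2 * * * tar czf /backups/myarchive.tar.gz /home/user/mydirectory >> /var/log/backup.log 2>&1
tar czf myarchive.tar.gz mydirectory >> backup.log 2>&1 && echo "archive written"
```

Keeping stderr in the log (2>&1) also preserves any tar errors that would otherwise be lost when the job runs unattended.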