views: 281
answers: 5
Here's one for the bash-fu wizards. No, actually, I'm just kidding; you'll all probably know this except for me...

I'm trying to create a backup shell script. The idea is fairly simple: find files in a certain folder that are older than 7 days, tar/gzip them into another directory, and remove them. The problem is, I'm not sure whether I'll have enough permissions to create a tar/gzip file in the target dir. Is there any (proper) way to check whether the archive has been created successfully, and if so, delete the files? Otherwise, skip that part and don't destroy the customers' data. I hear they are not very fond of that.

Here's what I have so far:

01: #!/bin/bash
02: 
03: ROOTDIR="/data/www"
04: 
05: TAR="${ROOTDIR}/log/svg_out_xml/export_out_ack_$(date +%Y-%m-%d).tar"
06: cd "${ROOTDIR}/exchange/export/out_ack/"
07: find . -mtime +7 -type f -print0 | xargs -0 tar -cf "${TAR}"
08: gzip "${TAR}"
09: find . -mtime +7 -type f -print0 | xargs -0 rm -f

Basically, I'd need to check that everything went fine on lines 7 and 8, and if so, execute line 9.

Additionally, I'd like to make a log file of these operations so I know everything went fine (this is a nightly cron job).

+4  A: 

The easy way out, but with no explicit error message: add -e to the shebang, i.e. #!/bin/sh -e, which will cause the shell to exit if a command fails.
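
For example, a minimal sketch:

#!/bin/bash -e
# From here on, any command that exits with a non-zero status aborts the script.
# Equivalent alternative: keep the plain shebang and put "set -e" near the top.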

Cron should mail you an error message, though, I'd guess.

If you want a full-blown backup scheme, though, I'd suggest using something that has already been made. There are plenty out there, and most work very well.

roe
Excellent suggestion. I always use `-e` in my scripts.
Martin Wickman
Thanks for the tip, very useful! Unfortunately, cron won't send us anything, as we are not the administrators of this system. It would be nice, though.
dr Hannibal Lecter
@dr: It will if you define MAILTO in your crontab.
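A sketch of such a crontab (the address and script path here are made up; 4 AM matches the schedule mentioned below):
MAILTO=you@example.com
# Run the backup nightly at 4 AM; output and errors get mailed to MAILTO.
0 4 * * * /data/www/bin/backup_out_ack.sh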
roe
+3  A: 

GNU tar has --remove-files, which removes files once they've been added to the archive. -v will make it list files as it adds them, and -z will pipe the archive through gzip on the fly.

Your find solution is racy; a file may start matching the criteria between the two invocations, getting deleted without ever being backed up.
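
Putting that together with the original script, a sketch might look like this (GNU tar assumed; the archive name ends in .tar.gz since -z compresses on the fly, and because tar itself deletes only the files it actually archived, the race between two separate find runs disappears):

TAR="${ROOTDIR}/log/svg_out_xml/export_out_ack_$(date +%Y-%m-%d).tar.gz"
cd "${ROOTDIR}/exchange/export/out_ack/" || exit 1
# --null -T - makes tar read the NUL-separated file list from find on stdin;
# -z gzips on the fly, -v lists each file, --remove-files deletes only what got archived.
find . -mtime +7 -type f -print0 \
  | tar --null -T - -czvf "${TAR}" --remove-files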

Ignacio Vazquez-Abrams
Thanks for your answer. It is highly unlikely that a file will start matching the criteria between invocations, since this script will run at 4 AM, when no files should be created in the folder I'm backing up. Of course, there's always that one user...
dr Hannibal Lecter
+2  A: 

In general, the following check can be used for catching errors in bash, as long as the program sets its return value correctly:

if [ "$?" -ne "0" ]; then 
    echo An error has occurred. The script will exit now
    exit 1; 
fi

It could be put into a function if used many times in the script.
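
For instance, a sketch of such a helper (the name check_exit is made up here):

check_exit() {
    if [ "$1" -ne 0 ]; then
        echo "An error has occurred. The script will exit now." >&2
        exit 1
    fi
}

# Usage with the commands from the question; note that after a pipeline,
# $? is the status of the last command (xargs), which is non-zero if tar failed.
find . -mtime +7 -type f -print0 | xargs -0 tar -cf "${TAR}"
check_exit "$?"
gzip "${TAR}"
check_exit "$?"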

Dmitry Yudakov
Thanks, will try it.
dr Hannibal Lecter
+1  A: 

For logging, you can wrap sections of your script in curly braces and redirect the stdout to a log file:

{
    script_command_1
    script_command_2
    script_command_3
} >> /path/to/log_file
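
If you also want errors captured in the log (useful for a nightly cron job), redirecting stderr into the same file is a small extension of the same idea; a sketch, reusing the placeholder names above:

{
    script_command_1
    script_command_2
    script_command_3
} >> /path/to/log_file 2>&1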
Dennis Williamson
dr Hannibal Lecter
@dr: Yes, it will.
Dennis Williamson
+1  A: 
Idelic
+1, Awesome info!
dr Hannibal Lecter
`1>` is implied in `>`
Dennis Williamson
But `1>` is clearer.
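For illustration (some_command is just a placeholder):
some_command > out.log     # stdout to out.log; 1 is the default descriptor for >
some_command 1> out.log    # same thing, with the descriptor written out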
Idelic