views: 1702
answers: 3

I can run commands like vacuumdb, pg_dump, and psql just fine in a script if I preface them like so:

/usr/bin/sudo -u postgres /usr/bin/pg_dump -Fc mydatabase > /opt/postgresql/prevac.gz
/usr/bin/sudo -u postgres /usr/bin/vacuumdb --analyze mydatabase
/usr/bin/sudo -u postgres /usr/bin/pg_dump -Fc mydatabase > /opt/postgresql/postvac.gz
SCHEMA_BACKUP="/opt/postgresql/$(date +%w).db.schema"
sudo -u postgres /usr/bin/pg_dump -C -s mydatabase > $SCHEMA_BACKUP

These run fine at the command line on Red Hat when I'm root (via sudo) and then, as you see in the commands above, do a sudo -u to postgres.

But when I try to kick this off from cron, all the files come out as zero bytes, meaning it didn't run properly, and nothing I can see in the logs gives me a clue.

My /etc/crontab file has this entry at the bottom

00 23 * * * root /etc/db_backup.cron

And yes, /etc/db_backup.cron is chmod ug+x, owned by root, and the top of the file says "#!/bin/bash" (minus doublequotes).

Anyone know what gives?
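Update: to see what cron is actually doing, one approach is to capture all output from the job and dump the environment cron provides. A debugging sketch (not my original script; paths are illustrative):

```shell
#!/bin/bash
# Debugging sketch: change the crontab line to
#   00 23 * * * root /etc/db_backup.cron >> /tmp/db_backup.log 2>&1
# so stdout and stderr are captured, then have the script record its environment.
set -x                            # trace every command into the log
env | sort > /tmp/cron_env.txt    # snapshot of cron's (minimal) environment
echo "PATH seen by cron: $PATH"
```

Comparing /tmp/cron_env.txt against `env` from an interactive root shell usually shows what's missing (typically PATH and any PG* variables).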

+1  A: 

Your environment variables are probably not set in cron.

In your normal session, you probably have defined these variables:

PG_PORT
PG_HOST
PG_DATABASE
PG_USERNAME
PG_PASSWORD

Add an "env" into your script.

Luc M
I need a little more help than that I think. I put 'env' in there before the commands and after the #!/bin/bash and all it did was dump out the environment to the log file (because I'm now dumping the output to a logfile as it runs).
Volomike
The way to set these variables would be 'export PG_PORT=5432' etc.
jhwist
Those aren't exactly the variables used by pg_dump and vacuumdb. You can see the full list here: http://www.postgresql.org/docs/8.4/static/libpq-envars.html
bilygates
@bilygates Nice!! Thank you. I've learned something today :-)
Luc M
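Putting the comments above together, a sketch of exporting the libpq connection variables near the top of the backup script, so pg_dump and vacuumdb launched from cron can find the server (the values here are placeholders, not from the original post):

```shell
# Export libpq's connection variables before calling pg_dump/vacuumdb.
# Values below are placeholders for illustration.
export PGHOST=localhost
export PGPORT=5432
export PGDATABASE=mydatabase
export PGUSER=postgres
# PGPASSWORD also works, but a ~/.pgpass file is safer for unattended jobs.
```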
+4  A: 

Since you seem to have superuser rights anyway, you could put those commands into the crontab of the postgres user like so:

sudo su postgres
crontab -e

and then put the pg_dump/vacuumdb commands there.
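For example, the postgres user's crontab might contain entries like these (times are illustrative; the commands and paths are taken from the question; note a per-user crontab has no user field, unlike /etc/crontab):

```
# m  h  dom mon dow   command
00 22 * * * /usr/bin/vacuumdb --analyze mydatabase
00 23 * * * /usr/bin/pg_dump -Fc mydatabase > /opt/postgresql/prevac.gz
```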

jhwist
This is what I ultimately ended up doing, and it worked fine, but only once I did chown postgres.root /opt/postgresql (the directory the files were going into).
Volomike
''sudo su postgres'' That's crazy! Try ''sudo -u postgres'' instead. Set up a sudo spec; it's much safer.
Clint Pachl
+1  A: 

I have a dynamic bash script that backs up all the databases on the server. It gets a list of all the databases and then vacuums each DB before performing a backup. All logs are written to a file and then that log is emailed to me. This is something you could use if you want.

Copy the code below into a file and add the file to your crontab. I have set up my pg_hba.conf to trust local connections.
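For reference, "trust local connections" in pg_hba.conf might look something like this (file location and exact columns vary by version and install; this is an illustration, not my exact config):

```
# TYPE  DATABASE  USER      ADDRESS        METHOD
local   all       postgres                 trust
host    all       postgres  127.0.0.1/32   trust
```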

#!/bin/bash
logfile="/backup/pgsql.log"
backup_dir="/backup"
touch "$logfile"

# List all databases from "\l" output, skipping the template databases.
databases=$(psql -h localhost -U postgres -q -c "\l" | sed -n 4,/\eof/p | grep -v rows\) | grep -v template0 | grep -v template1 | awk '{print $1}')

echo "Starting backup of databases " >> "$logfile"
for i in $databases; do
        dateinfo=$(date '+%Y-%m-%d %H:%M:%S')
        timeslot=$(date '+%Y%m%d%H%M')
        /usr/bin/vacuumdb -z -h localhost -U postgres "$i" >/dev/null 2>&1
        /usr/bin/pg_dump -U postgres -i -F c -b "$i" -h 127.0.0.1 -f "$backup_dir/$i-database-$timeslot.backup"
        echo "Backup and Vacuum complete on $dateinfo for database: $i " >> "$logfile"
done
echo "Done backup of databases " >> "$logfile"

tail -15 "$logfile" | mailx [email protected]
You can add '--analyze' to the /usr/bin/vacuumdb command to perform an analyze as well.