I have a script that will produce daily rotated backups for mysql, but I can't find anything similar for postgres. I have also discovered that it has an online backup capability, which should come in handy since this is a production site.

Does anyone know of a program/script that will help me, or even a way to do it?

Thanks.

+9  A: 

One way is to use pg_dump to generate a flat SQL dump, which you can gzip or otherwise compress. This is certainly the easiest option, as the results can be piped back into psql to reload a database, and since the dump is plain text, you can look through or edit the data prior to restore if necessary.
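
A minimal sketch of that approach; the database name, user, and output path are placeholder assumptions:

    # dump one database as plain SQL and compress it
    pg_dump -U postgres mydb | gzip > /var/backups/mydb.sql.gz

    # restore by feeding the plain-text dump back into psql
    gunzip -c /var/backups/mydb.sql.gz | psql -U postgres mydb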

The next method is to temporarily shut down your database (or, if your filesystem supports atomic snapshots, that might work in theory without a shutdown) and back up your PostgreSQL data directory.
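
For example (the data directory and backup paths are assumptions; adjust them for your installation):

    # stop the server, archive the whole data directory, start it again
    pg_ctl stop -D /var/lib/pgsql/data
    tar czf /var/backups/pgdata-$(date +%Y%m%d).tar.gz /var/lib/pgsql/data
    pg_ctl start -D /var/lib/pgsql/data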

This page from the PostgreSQL site also explains how to do online backups and point-in-time recovery, which is definitely the most difficult to configure, but also the optimal method. The idea is that you perform a base backup (which you might do every day, couple of days or week) by running some special SQL (pg_start_backup and pg_stop_backup) and make a (filesystem-level) copy of your database directory. The database doesn't go offline during this time, and everything still works as normal. From then on, the database generates a Write Ahead Log (WAL) of any changes, which can then be pushed (automatically, by the database) to wherever you want. To restore, you take the base backup, load it into another database instance, then just replay all the WAL files. This way you can also do point-in-time recovery by not replaying all of the logs.
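
A rough outline of the pieces involved, with placeholder paths (check the PostgreSQL documentation for your version for the exact settings):

    # postgresql.conf: have the server copy each completed WAL segment to an archive
    archive_mode = on
    archive_command = 'cp %p /mnt/wal_archive/%f'

    -- taking a base backup, from psql:
    SELECT pg_start_backup('nightly');
    -- ...copy the data directory at the filesystem level, e.g. with rsync...
    SELECT pg_stop_backup();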

Adam Batkin
Using atomic snapshots doesn't just "might work in theory". It works perfectly fine. But they need to be atomic across *all* filesystems that PostgreSQL has data on, including the pg_xlog directory.
Magnus Hagander
I usually use pg_dump to do my backups, but I was looking more for a complete solution that uses pg_dump to create rotated backups, something like http://sourceforge.net/projects/automysqlbackup/. It would be even better if the script used online backups, but I think it shouldn't be hard to do it myself and put it up on Launchpad or somewhere. Thank you for your answer.
Stavros Korokithakis
For an automated way of doing pg_dump backups with public key/password encryption, rotation, and upload to S3, check out safe: http://github.com/astrails/safe
Vitaly Kushner
A: 

You can also dump your PostgreSQL database using phpPgAdmin or pgAdmin III.

Randell
+2  A: 

Generally the way to do backups is to use pg_dump.

You shouldn't "copy files from postgresql directory, just like in mysql" - because chances are you will not be able to use them (these files are architecture, operating system, and compile-options dependent).

Unless pg_dump proves to be insufficient, it is what you should use. If you ever end up in a situation where pg_dump cannot be used, ask yourself why it can't be used and what you can do to make it usable again :)

When using pg_dump you can choose a plain SQL dump (-F p) or the custom format (-F c). The SQL dump is easier to inspect and modify, but the custom format is much more powerful, and (since 8.4) faster to restore, because pg_restore can load it with many parallel workers instead of sequentially.
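
For example (database name, file name, and job count are assumptions):

    # custom-format dump, compressed by default
    pg_dump -F c -f mydb.dump mydb

    # restore into an existing (empty) database with 4 parallel workers (8.4+)
    pg_restore -j 4 -d mydb mydb.dump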

depesz
+1  A: 

For automated backups of both MySQL and Postgres check out astrails-safe on github (or just "gem install astrails-safe --source=http://gems.github.com"). It uses mysqldump to back up MySQL and pg_dump to back up Postgres. It also knows how to back up plain files with tar, encrypt everything with GnuPG, and upload to S3 or any Unix server over SFTP.

Vitaly Kushner
A: 

Since you specified databaseS (plural), pg_dumpall will be far more useful to you. It dumps all databases and all users/roles to a single SQL file, instead of just one database.
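
A minimal rotated-backup sketch built around it, suitable for a daily cron job (paths and retention period are assumptions):

    #!/bin/sh
    # date-stamped dump of every database plus roles/users
    pg_dumpall -U postgres | gzip > /var/backups/pg/all-$(date +%Y%m%d).sql.gz
    # drop dumps older than a week
    find /var/backups/pg -name 'all-*.sql.gz' -mtime +7 -delete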

chotchki