I am using PHP and MySQL, and my server is Windows Server 2003 with IIS 6.

I am planning to back up my database on an hourly basis. I can already run the cron job; I tested it by writing the date and time into a log file, and it works perfectly.

Now I want to use the cron job to back up my databases. How do I do that? Should I create a PHP script and let it run every hour?

A: 

Windows has a built-in Task Scheduler which you can use to run your backups.
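For instance, a recurring task can be registered from the command line with schtasks (the task name and the path to the backup batch file below are placeholders; the same task can also be created through the Task Scheduler GUI in Control Panel):

schtasks /create /sc hourly /tn "MySQL hourly backup" /tr "C:\backups\mysql-backup.bat"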

Anton Gogolev
Can you please explain in more detail how to use Task Scheduler to back up databases on an hourly basis?
mysqllearner
A: 

Use the mysqldump tool to dump your databases to a file.
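In its simplest form (the user name, database name and file name below are placeholders), a dump and a later restore look like this:

mysqldump -u backup_user -p your_database > your_database.sql
mysql -u backup_user -p your_database < your_database.sql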

Ignacio Vazquez-Abrams
+2  A: 

Use mysqldump. Example (with the options that I usually use):

mysqldump --single-transaction --hex-blob --opt -e --quick --quote-names -r put-your-backup-filename-here put-your-database-name-here
Emil Vikström
Do I need to create a batch file and make it run every hour? I am new to mysqldump, can you elaborate more on this?
mysqllearner
Yes, put it in a .bat file. mysqldump is run from the command line (e.g. a "DOS" window).
Emil Vikström
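A minimal sketch of such a batch file, assuming a default MySQL installation path and placeholder credentials (the %date%/%time% handling is locale-dependent, so adjust the string replacements for your system):

@echo off
rem Paths, user name and password below are placeholders
set MYSQLDUMP="C:\Program Files\MySQL\MySQL Server 5.1\bin\mysqldump.exe"
set BACKUP_DIR=C:\backups

rem Build a file-name-safe timestamp (replace /, : and spaces)
set STAMP=%date:/=-%_%time::=-%
set STAMP=%STAMP: =0%

%MYSQLDUMP% --single-transaction --quote-names -u backup_user -pYourPassword your_database > "%BACKUP_DIR%\your_database_%STAMP%.sql"

Schedule that file with Task Scheduler and you get one timestamped dump per run.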
A: 

I think you can use a cron job to copy the database files directly. I don't think having the cron job run a PHP script that makes the backup is a good choice (it adds complexity for no real benefit). mysqldump is also a good choice, but I can't help you with it.

Pons
It's not a good idea at all to copy the files when the database is running. That can cause corrupt files which in the worst case can't be repaired by MySQL. It's absolutely horrible as a backup solution.
Emil Vikström
A: 

Edit: This script is intended for Unix (or variants).

I wanted to do the same (including sending an email of the backup and archiving my code/pages as well) and wrote something like this:

#!/bin/bash

# Date stamp used in the backup file names
NOW=`date +"%Y-%m-%d"`

# Mail settings (the address is a placeholder)
MAIL_TO="you@example.com"
MAIL_SUBJECT="Hourly backup"
MAIL_MESSAGE="mail-message"

DB_FILE="backup-database-$NOW.sql.gz"
SITE_FILE="website-$NOW.tar.gz"

# Start with an empty message file
> $MAIL_MESSAGE

# Dump the database (credentials are read from .mysql-pwd) and compress it
echo "Database dump:" >> $MAIL_MESSAGE
mysqldump --defaults-extra-file=.mysql-pwd --add-drop-table -C my_database 2>> $MAIL_MESSAGE | gzip > $DB_FILE 2>> $MAIL_MESSAGE

# Archive the website files
echo "Site dump (www and php-include):" >> $MAIL_MESSAGE
tar -zcf $SITE_FILE /path/to/www/ /path/to/php-include/ 2>> $MAIL_MESSAGE

echo >> $MAIL_MESSAGE
echo >> $MAIL_MESSAGE
echo "Done" >> $MAIL_MESSAGE

# Mail the message with both archives attached
mutt -s "$MAIL_SUBJECT" -a $DB_FILE -a $SITE_FILE $MAIL_TO < $MAIL_MESSAGE
Veger
This is cool. May I know if I could use this script? Have you encountered any problems so far?
mysqllearner
Of course you can use it (and modify it for your own purposes); that is why I posted it. I have been running it for more than half a year now (as a weekly backup) and I still get my email every week.
Veger
This is so COOL. Could you please help me with this? I edited your script in TextPad; what file format should I save it as? And how do I set it to run daily/weekly/monthly, etc.? Please help.
mysqllearner
It is a *nix bash script, so save it as backup.sh or something similar. Then let cron run it for you, by using crontab -e to add the command that starts the backup.sh file.
Veger
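For reference, an hourly crontab entry for it could look like this (the path is a placeholder; the line runs the script at minute 0 of every hour):

0 * * * * /path/to/backup.sh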
Oh, I am using Windows. Will there be any problems? Should I save it as backup.sh and use Task Scheduler set to call that file every week?
mysqllearner
Oh sorry, I misread your question. I do not know exactly how to convert this for Windows. The mysqldump command is probably the same. After that I archived the dump (which saves a lot of space) using gzip; it is probably better to split those steps up for Windows. The mutt command is a command-line mailer (not necessary), but to do something similar you would need to find a Windows-based command-line mailer. For Windows the script is best saved as something like backup.bat. Finally, you can add the script to the Task Scheduler.
Veger
Okay, thanks Veger. Looks like I need to study how to use mysqldump in the first place.
mysqllearner
+1  A: 

As others have written, the mysqldump tool is the simplest solution. However, if you have a large database, I'd recommend you do some experimenting, even with the --quick setting. With the old MyISAM engine, table-level locking means queries are effectively processed one at a time. Although this is less of an issue with InnoDB, I'm not sure whether MySQL now supports fully concurrent queries, so your backup may affect transactional processing.

If it does prove to be a problem, then a simple solution would be to configure a slave instance of the database running on the same machine, then either run mysqldump against that, or shut down the slave temporarily while backing up the raw data files.
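For example, if the slave listens on a separate port on the same machine, the dump can be pointed at it, optionally pausing replication around the dump (host, port, user and database names below are placeholders):

mysql -h 127.0.0.1 -P 3307 -u backup_user -p -e "STOP SLAVE SQL_THREAD;"
mysqldump -h 127.0.0.1 -P 3307 -u backup_user -p --single-transaction your_database > slave-backup.sql
mysql -h 127.0.0.1 -P 3307 -u backup_user -p -e "START SLAVE;"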

Alternatively, you could push the mirroring down to the OS level and perform a disk-level snapshot, but you'd need to stop transactions on the database and flush it before creating the snapshot (or breaking the mirror).
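The flush-and-lock step mentioned above is typically done with the following statements, issued from a session that stays open while the snapshot is taken:

FLUSH TABLES WITH READ LOCK;
-- take the file system / volume snapshot here, then release the lock
UNLOCK TABLES;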

C.

symcbean
+1 for creating a (full) backup on a slave. Wish I could give another +1 for mentioning transactions and/or data consistency.
VolkerK