I have a fairly small MySQL database (a Textpattern install) on a server that I do not have SSH access to (I have FTP access only). I need to regularly download the live database to my local dev server on demand; i.e., I would like to run a script and/or have a scheduled job that does it. What are some good ways of doing this?

Some points to note:

  • Live server is running Linux, Apache 2.2, PHP 5.2 and MySQL 4.1
  • Local server is running the same (so using PHP is an option), but the OS is Windows
  • Local server has Ruby on it (so using Ruby is a valid option)
  • The live MySQL db can accept remote connections from different IPs
  • I cannot enable replication on the remote server

Update: I've accepted BlaM's answer; it is beautifully simple. Can't believe I didn't think of that. There was one problem, though: I wanted to automate the process, but the proposed solution prompts the user for a password. Here is a slightly modified version of the mysqldump command that passes in the password:

mysqldump -u USER --password=MYPASSWORD DATABASE_TO_DUMP -h HOST > backup.sql
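One caveat with passing the password this way is that it is visible in the process list and in shell history. If that matters, mysqldump can read the credentials from an option file instead. A sketch (the file name `backup.cnf` is my own choice; the placeholders match the command above):

```shell
# Keep the password off the command line by putting the credentials
# in a MySQL option file (file name is my own choice).
cat > backup.cnf <<'EOF'
[client]
user=USER
password=MYPASSWORD
host=HOST
EOF

# --defaults-extra-file must be the first option given:
#   mysqldump --defaults-extra-file=backup.cnf DATABASE_TO_DUMP > backup.sql
```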

+1  A: 

Is MySQL replication an option? You could even turn it on and off if you didn't want it constantly replicating.

This was a good article on replication.

Mark Biek
A: 

I would create a (Ruby) script to do a SELECT * FROM ... on all the databases on the server and then do a DROP DATABASE ... followed by a series of new INSERTs on the local copy. In outline:

  • SHOW DATABASES lists the databases dynamically.
  • SHOW TABLES lists all the tables in a database.
  • SELECT * FROM ... gives you the rows to replay as INSERTs on the local copy.
  • This assumes the table structure doesn't change; if you want to support table changes as well, add a SHOW CREATE TABLE ... query and a corresponding CREATE TABLE statement for each table in each database.

Once you have the script you can set it up as a scheduled job to run as often as you need.
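A minimal Ruby sketch of the statement-building half of that approach (fetching the rows would need a MySQL client library, which I've left out; `quote` and `insert_statements` are my own helper names):

```ruby
# Build INSERT statements from rows as returned by SELECT * FROM ...
# Values are emitted as quoted strings, which MySQL accepts for
# numeric columns too; NULLs are passed through unquoted.
def quote(value)
  return "NULL" if value.nil?
  escaped = value.to_s.gsub("\\") { "\\\\" }.gsub("'") { "\\'" }
  "'" + escaped + "'"
end

def insert_statements(table, rows)
  rows.map do |row|
    "INSERT INTO #{table} VALUES (#{row.map { |v| quote(v) }.join(', ')});"
  end
end

# Example: one row as fetched from the live server
rows = [[1, "Hello", nil]]
puts insert_statements("textpattern", rows)
# => INSERT INTO textpattern VALUES ('1', 'Hello', NULL);
```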

Anders Sandvig
A: 

@Mark Biek

Is MySQL replication an option? You could even turn it on and off if you didn't want it constantly replicating.

Thanks for the suggestion, but I cannot enable replication on the server. It is a shared server with very little room for maneuver. I've updated the question to note this.

Charles Roper
A: 

Depending on how often you need to copy down live data and how quickly you need to do it, installing phpMyAdmin on both machines might be an option. You can export and import DBs, but you'd have to do it manually. If it's a small DB (and it sounds like it is), and you don't need live data copied over too often, it might work well for what you need.

Brock Boland
+4  A: 

Since you can access your database remotely, you can use mysqldump from your Windows machine to fetch the remote database. From the command line:

cd "into mysql directory"
mysqldump -u USERNAME -p -h YOUR_HOST_IP DATABASE_TO_MIRROR >c:\backup\database.sql

The program will ask you for the database password and then generate a file c:\backup\database.sql that you can import on your Windows machine to load the data.

With a small database that should be fairly fast.
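To keep more than one copy around, the call can be wrapped in a small script that stamps the file name with the date (a sketch; the credentials are placeholders and the dump line itself is shown commented out):

```shell
# Date-stamp each dump so older copies are kept.
STAMP=$(date +%Y%m%d)
OUTFILE="database-$STAMP.sql"

# The actual dump (fill in real credentials and host):
#   mysqldump -u USERNAME -p -h YOUR_HOST_IP DATABASE_TO_MIRROR > "$OUTFILE"
echo "would write $OUTFILE"
```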

BlaM
+1  A: 

Here's what I use. This dumps the database from the live server and loads it straight into the local server:

mysqldump -hlive_server_address -ulive_server_user -plive_server_password --opt --compress live_server_db | mysql -ulocal_server_user -plocal_server_password local_server_db

You can run this from a bat file, and you can even set it up as a scheduled task.

The Disintegrator