I usually use mysqldump to export a database. However, when the database is really large, it seems much faster and less resource-intensive to just gzip the database files directly, without involving the MySQL daemon, and copy that over to the other database server.

e.g.:

tar -czvf {db_name}.sql.tgz /var/lib/mysql/{db_name}

Is this a good method of doing this? What are the (dis)advantages?
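For comparison, the mysqldump route would be something like this (the credentials and database name are placeholders, and --single-transaction only helps for InnoDB tables):

# dump the database as SQL and compress it on the fly
mysqldump -u root -p --single-transaction {db_name} | gzip > {db_name}.sql.gz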

I also read another post here that mentioned:

rsync /var/lib/mysql/ ...

Would it be a good option to just use rsync to keep a backup db in sync with the development db?
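Something along these lines is what I have in mind for the rsync option (the destination host and paths are just for illustration):

# mirror the raw MySQL data directory to the dev box, removing files deleted at the source
rsync -av --delete /var/lib/mysql/ devbox:/var/lib/mysql/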

A: 

Re the first method you mention: I think MySQL's official mysqlhotcopy uses that approach, so it can't be entirely wrong.
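If you want to try it, the basic invocation is something like the following (database name, credentials, and target directory are placeholders; note it only handles MyISAM/ARCHIVE tables):

# copy the table files for one database into a backup directory while holding a read lock
mysqlhotcopy --user=root --password=secret {db_name} /path/to/backup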

Pekka
+1  A: 

I've used rsync just fine for moving the files around and using them on other boxes. I've never done it with MySQL running live, but I've restored from the files in /var/lib/mysql before with no problems. It's a good way to "copy" databases over to your "development" box. I suggest shutting down MySQL, moving the files over, then starting it back up again. That is how I've done it when necessary.
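Sketched out, that sequence is roughly the following (the init script name and destination host depend on your distro and setup):

# stop the server so the files on disk are in a consistent state
/etc/init.d/mysql stop
# copy the raw data directory over to the other box
rsync -av /var/lib/mysql/ devbox:/var/lib/mysql/
# bring the server back up
/etc/init.d/mysql start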

mysqldump gives you nice, neat SQL code though, which is good if you ever need to "tweak" something with sed along the way.
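For example, something like this renames a table prefix on the way into another database (the prefix and database names are made up, and it assumes your credentials are set up in ~/.my.cnf or similar):

# rewrite a table prefix in the dump and load it straight into another database
mysqldump old_db | sed 's/old_prefix_/new_prefix_/g' | mysql new_db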

I'd have no worries about using rsync, though. I use it for many purposes, including pushing code updates out to client machines.

Mech Software