We need to dump existing production data and then import it to a development DB for final rounds of testing.

In pseudo-commands, currently we:

mysqldump ... > ./dbdump.sql
mysql -u __USER__ __DB__ < ./dbdump.sql
mysql ... "sanitize script"
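Filled out, the pipeline above might look something like this sketch. The user names, database names, and the sanitize-script filename are placeholders, and the mysqldump flags are assumptions about a typical InnoDB setup, not anything stated in the question:

```shell
# Dump production data as one consistent snapshot;
# --single-transaction avoids locking InnoDB tables during the dump.
mysqldump -u prod_user -p --single-transaction prod_db > ./dbdump.sql

# Import into the development database (this is the slow step).
mysql -u dev_user -p dev_db < ./dbdump.sql

# Run the sanitize script against the freshly imported data.
mysql -u dev_user -p dev_db < sanitize.sql
```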

The problem is that the middle command, the import, takes over an hour to load 600MB of data. Maybe this is because of all the indexes we have, but it amazes me that this is the only real import method.

Is there a way to get the direct storage data rather than a SQL dump, and then just replace my local SQL data? It seems crazy to dump all the data into SQL commands and then have to execute those commands one by one.

+1  A: 

If you use MyISAM you can just copy the database files. You might have to run a REPAIR TABLE sometimes.
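For MyISAM tables, the file copy might look like the sketch below. The data-directory path, database names, and table name are assumptions for illustration, and the server should be stopped (or the tables flushed and locked) while copying so the files are consistent:

```shell
# Stop the server so the .frm/.MYD/.MYI files are not mid-write.
# (Assumes a systemd-managed MySQL; adjust for your init system.)
sudo systemctl stop mysql

# MyISAM stores each table as plain files under the data directory,
# so copying the database's directory copies its tables.
sudo cp -a /var/lib/mysql/prod_db/. /var/lib/mysql/dev_db/

sudo systemctl start mysql

# If a table looks inconsistent afterwards, repair it:
mysql -u dev_user -p dev_db -e "REPAIR TABLE some_table;"
```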

Another option might be to use LOAD DATA INFILE. According to the MySQL documentation it is up to 20 times faster than INSERT statements; see http://dev.mysql.com/doc/refman/5.0/en/insert-speed.html
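A sketch of that route, one table at a time; the table name, file path, and user/database names are made up for illustration, and the MySQL server user must be able to write and read the export file:

```shell
# On production: have the server export one table as tab-separated text.
mysql -u prod_user -p prod_db \
  -e "SELECT * FROM users INTO OUTFILE '/tmp/users.tsv';"

# Copy /tmp/users.tsv to the development machine, then bulk-load it.
# LOAD DATA INFILE skips parsing a separate INSERT statement per row,
# which is where most of the speedup comes from.
mysql -u dev_user -p dev_db \
  -e "LOAD DATA INFILE '/tmp/users.tsv' INTO TABLE users;"
```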

mitch