I need to move a database with a Django schema from PostgreSQL to MySQL, running on Amazon's RDS. I can re-generate the tables using manage.py, but I'm still looking for a way to migrate all of the row data. Does anyone know a clean way of moving it over? Are there any gotchas to watch out for with Amazon's RDS?

+2  A: 

If you back up your PostgreSQL database and choose the "Insert Commands" option, you will end up with a text file of SQL INSERT statements that, in theory, you can run against a different SQL database, in this case MySQL. In practice it will depend on the data types you are using, since not all PostgreSQL and MySQL data types map directly to each other.
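The same kind of INSERT-based dump can also be produced from the command line with pg_dump, which may be easier to script. This is only a rough sketch: the database name, user and file names are placeholders, and the resulting file will often need hand editing before MySQL accepts it (booleans, sequences and date defaults are the usual trouble spots).

    # Export only the row data as INSERT statements; manage.py will
    # recreate the schema on the MySQL side, so skip the table definitions.
    pg_dump --data-only --inserts --no-owner -U myuser mydb > data.sql

    # After fixing any type/quoting mismatches, load the statements
    # into the target MySQL database.
    mysql -u myuser -p mydb < data.sql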

The gotcha for me with Amazon's RDS is the statement that there will be a four-hour window each week where the database needs to be taken down for patching and maintenance. The hard part with relational databases is making them fault tolerant through replication and clustering. At the moment RDS doesn't solve either of those problems; it looks like it is simply using an EC2 instance to host a MySQL server, which is something that has been possible for a long time without RDS.

Personally, though, I love the direction RDS is heading and look forward to the day I can treat a relational database as a black box and not have to worry about fault tolerance and scalability, the way Amazon's SimpleDB product works now.

sipwiz
+2  A: 

Django also has the dumpdata and loaddata commands in manage.py. The process would be:

  1. Run syncdb against the MySQL database to create the empty tables
  2. Run dumpdata against the existing PostgreSQL database to export the rows as fixtures
  3. Run loaddata to import those fixtures into the new MySQL instance (see the sketch below)
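A minimal sketch of those steps from the shell, assuming a Django version where the database backend is switched by editing settings.py between steps, and using a placeholder fixture name (data.json). The contenttypes exclusion is a common precaution against primary-key clashes on import, not part of the original answer:

    # 1. With settings.py pointing at the MySQL database, create the tables
    python manage.py syncdb

    # 2. Point settings.py at the PostgreSQL database and export the rows
    python manage.py dumpdata --indent=2 --exclude contenttypes > data.json

    # 3. Point settings.py back at MySQL and import the fixture
    python manage.py loaddata data.json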
John Paulett