I have a fairly complex Django project, which makes it hard or impossible to use fixtures for loading data.

What I would like to do is load a database dump from the production database server after all tables have been created by the test runner and before the actual tests start running.

I've tried various "magic" in MyTestCase.setUp(), but with no luck.

Any suggestions would be most welcome. Thanks.

A: 

Fixtures are the best option. Have you tried using ./manage.py dumpdata to create a fixture from your current database? I have not seen that fail on complex models, but I guess it's possible.

Assuming you're using MySQL, you should be able to script this using mysqldump.

Chase Seibert
A: 

You may need to look into defining a custom test runner. There's some info here: http://docs.djangoproject.com/en/dev/topics/testing/#using-different-testing-frameworks

Basically, I think you can copy the default test runner from django.test.simple.run_tests and then modify it to suit your needs.

I've not done this before, but from my understanding that would be the way to customize this.
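The core idea, illustrated outside Django with plain sqlite3 and unittest (the table and dump contents here are made up for illustration): the dump is executed in setUp, after the schema exists and before any test method runs.

```python
import sqlite3
import unittest

# A stand-in for a production dump: INSERT statements only,
# since the test runner has already created the tables.
SQL_DUMP = """
INSERT INTO customer (id, name) VALUES (1, 'Alice');
INSERT INTO customer (id, name) VALUES (2, 'Bob');
"""

class DumpLoadingTestCase(unittest.TestCase):
    def setUp(self):
        # The test runner's equivalent of "create all tables" ...
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute(
            "CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
        # ... then load the dump before the test body runs.
        self.conn.executescript(SQL_DUMP)

    def tearDown(self):
        self.conn.close()

    def test_dump_was_loaded(self):
        count = self.conn.execute(
            "SELECT COUNT(*) FROM customer").fetchone()[0]
        self.assertEqual(count, 2)
```

In a real Django custom test runner you would do the same thing right after the test database's tables are created, using the database connection Django gives you.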

Jeff
+3  A: 

Django supports loading SQL files when doing syncdb, reset, or starting a test runner -- this does exactly what you describe:

http://docs.djangoproject.com/en/dev/howto/initial-data/#providing-initial-sql-data

You need to create an "sql" directory in your app directory, and then put a file named "mymodel.sql" in that directory (where "mymodel" is the lowercased name of the corresponding model, e.g. MyModel).

myproject/
   |--myapp/
       |--sql/
           |--mymodel.sql

You can create this SQL with dump tools for your database.

  • SQLite [1]: echo '.dump' | sqlite3 yourdbname.sqlite > myapp/sql/mymodel.sql
  • MySQL [2]: mysqldump yourdbname > myapp/sql/mymodel.sql
  • PostgreSQL [3]: pg_dump yourdbname > myapp/sql/mymodel.sql

After dumping, you'll need to edit the file to remove everything but the appropriate INSERT statements. In particular, you must remove the transaction-handling, index-creation, and table-creation SQL; otherwise you'll get errors from duplicate CREATE statements when the file is loaded.
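A small script can automate that cleanup. This is just a sketch: it assumes one statement per line (typical for mysqldump output; pg_dump would need its --inserts flag, since its default COPY format isn't line-per-INSERT), and the file names in the comment are placeholders.

```python
import sys

def filter_inserts(dump_text):
    """Keep only the INSERT statements from a SQL dump, dropping
    CREATE TABLE, index, and transaction statements."""
    keep = []
    for line in dump_text.splitlines():
        if line.lstrip().upper().startswith("INSERT INTO"):
            keep.append(line)
    return "\n".join(keep) + "\n"

if __name__ == "__main__":
    # e.g. python filter_inserts.py < full_dump.sql > myapp/sql/mymodel.sql
    sys.stdout.write(filter_inserts(sys.stdin.read()))
```

Eyeball the result afterwards; multi-line INSERTs or dumps with unusual quoting may still need hand editing.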

I use this method for loading really, really big fixtures -- it takes far too long to process the JSON, but a straight SQL import is pretty snappy.

Do be aware that this method loads the SQL on every invocation of syncdb, reset, etc., in addition to loading data for the test runner -- so you won't be able to have different data for different test cases, and you'd have to remove the files before a reset if you didn't want them loaded back onto your production server.

[1] http://www.sqlite.org/sqlite.html

[2] http://dev.mysql.com/doc/refman/5.1/en/mysqldump.html

[3] http://www.postgresql.org/docs/8.1/static/backup.html#BACKUP-DUMP