views: 57
answers: 3

I have a Django application. I have .json fixture files containing test data, with unit tests that use the data to confirm the application is working properly. I also use South to migrate my database.

After doing a few database migrations, my fixtures are out of date, because the database has migrated, adding a new database column, for example, and the fixture data doesn't have that column, since it was captured before the database changed.

What's the best way to move my fixtures forward as I migrate my database?

A: 

What's the best way to move my fixtures forward as I migrate my database?

It's too late.

As you migrate your database, you need to loaddata and dumpdata at each step.

Once it stops working, it's too late.

A possible fallback is to write a short script to load the JSON fixtures into memory, and then "manually" build database objects.

import json

with open("somefile.json", "r") as data:
    for obj in json.load(data):
        if obj['model'] == 'someapp.somemodel':
            SomeNewModel.objects.create(
                field=obj['fields']['element'],
                # ... map the remaining fields here
            )

With something along those lines, you might be able to construct a database using your current schema and legacy fixtures.
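A related trick, when the schema change is as simple as a new column, is to rewrite the fixture JSON itself rather than going through the ORM: load the fixture, fill in a default for the missing field, and dump it back out. A minimal sketch (the model label `someapp.somemodel`, the field name `new_column`, and the default are placeholders for your own values):

```python
import json

def upgrade_fixture(raw_json, model_label, new_field, default):
    """Add a missing field with a default value to every matching fixture object."""
    objects = json.loads(raw_json)
    for obj in objects:
        if obj['model'] == model_label and new_field not in obj['fields']:
            obj['fields'][new_field] = default
    return json.dumps(objects, indent=2)

# A one-object fixture captured before new_column existed:
old = '[{"model": "someapp.somemodel", "pk": 1, "fields": {"element": "x"}}]'
print(upgrade_fixture(old, "someapp.somemodel", "new_column", None))
```

This only covers additive changes with a known default; renamed or transformed columns still need the ORM-based approach above.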

S.Lott
Thanks, but I don't believe it's ever too late. I can always roll back my code and/or my database to a previous state, so I'm sure I can get back on the right track. Could you provide a little more detail about how loaddata/dumpdata would work? I assume there's a migration in the middle, for example. But if my fixture is only for one application, how will South know to apply the migration?
Ned Batchelder
@Ned Batchelder: "I can always roll back my code and/or my database to a previous state," While true, that's too complex. It's really too late. Please read this for loaddata and dumpdata: http://docs.djangoproject.com/en/dev/ref/django-admin/.
S.Lott
It would be a killer feature for South if it were to migrate fixtures, too.
Matthew Schinckel
You CAN write data migrations that load django.core.management and run the loaddata command, BUT you may well run into a world of hurt as schemas change and the fixtures don't. I would suggest a Makefile with a ./manage.py dumpdata foo bar baz.MySpecialModel etc., so that you can easily update your fixtures and commit them to your source control after creating a new migration.
stevejalim
+1  A: 

Why can't you simply create a fresh .json file from your db? This is what I do when I need to create a new fixture.

python manage.py dumpdata <your_app> auth > test_data.json
MovieYoda
Because my test data is a controlled sample, not simply the latest snapshot of my production database.
Ned Batchelder
+2  A: 

Here's the process I used:

  1. Roll back the code to the revision that created the fixture in the first place. For example: svn up -r12345.

  2. Empty the database, then create it with manage.py syncdb --noinput --migrate

  3. Load the fixture with manage.py loaddata my_fixture.json

  4. Roll the code forward to now, with svn up

  5. Migrate the database with manage.py migrate

  6. Dump the data with manage.py dumpdata --indent=2 myapp > my_fixture.json

Note that you need to be careful when choosing the past revision to roll back to. In my case, I had some recent fixes that needed to be in place, so I actually had to pick and choose directories to roll back to specific revisions. Tedious, but better than hand-editing a 9,000-line JSON file.

Also, in step 6, be sure to dump the correct set of applications.

In the future, as I write migrations, I can do these steps again to keep all the fixtures up-to-date.
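Since the sequence repeats every time a migration lands, it can be captured in a small script (the revision number, app name, and fixture path below are placeholders for your own values, and the commands assume the svn/South setup described above):

```shell
#!/bin/sh
set -e

# 1. Roll the code back to the revision that created the fixture.
svn up -r12345

# 2. Rebuild the database at that revision's schema.
./manage.py syncdb --noinput --migrate

# 3. Load the old fixture.
./manage.py loaddata my_fixture.json

# 4. Bring the code back to the present.
svn up

# 5. Apply the newer migrations to the loaded data.
./manage.py migrate

# 6. Re-dump the now-migrated data over the old fixture.
./manage.py dumpdata --indent=2 myapp > my_fixture.json
```

Note that step 2 destroys the working database, so run this against a scratch database, not production.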

Ned Batchelder
I'm at a similar point although I'm just starting out, looking to take a test-first approach. The whole 'empty the database, load the test fixture, migrate, dump the test fixture' process is going to kill the workflow. Hopefully others will suggest (or code ;-) ) a more elegant solution.
dwightgunning