I'm trying to use dumpdata to generate JSON for a database that is large enough that Django takes a long, long time to output it. Is there any way to dump only a subset of the rows; say, 100, for testing?

I'm using MySQL and Django 1.0.

+1  A: 

As far as I know, there is no command line option for dumpdata to limit the number of rows to be exported.

The dumpdata command is implemented in django/django/core/management/commands/dumpdata.py. There, around line 83, you can see:

for model in sort_dependencies(app_list.items()):
    if not model._meta.proxy:
        objects.extend(model._default_manager.using(using).all())

You could create a custom command, e.g. dumplimited.py, in the management/commands directory of one of your own apps, and change the above to something like ...

...
        objects.extend(model._default_manager.using(using).all()[:100])

... so that slicing the queryset adds a LIMIT clause to the SQL query.

That's a bit hacky, but safe, since you can add your own command without touching any of Django's files.

PS. Once this works, you can parameterize the limit by adding an additional option to the command, such as --limit 128 or similar (see the sketch below).
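
For illustration, here is a minimal sketch of such a command, assuming the optparse-based management command API of that Django era. The file name dumplimited.py and the --limit option come from above; the positional app label argument, the default limit of 100, the --format option, and the get_app/get_models helpers are illustrative choices, not anything taken from dumpdata itself. It would live in yourapp/management/commands/dumplimited.py (with empty __init__.py files in management/ and management/commands/):

# yourapp/management/commands/dumplimited.py -- a sketch, not a full
# replacement for dumpdata (no dependency sorting, no multi-app support).
from optparse import make_option

from django.core import serializers
from django.core.management.base import BaseCommand
from django.db.models import get_app, get_models

class Command(BaseCommand):
    help = 'Dump at most --limit rows per model of the given app.'
    args = 'app_label'
    option_list = BaseCommand.option_list + (
        make_option('--limit', dest='limit', type='int', default=100,
                    help='Maximum number of rows to dump per model.'),
        make_option('--format', dest='format', default='json',
                    help='Serialization format (json, xml, ...).'),
    )

    def handle(self, app_label, **options):
        app = get_app(app_label)
        objects = []
        for model in get_models(app):
            # Slicing the queryset adds a LIMIT clause to the SQL query.
            objects.extend(model._default_manager.all()[:options['limit']])
        return serializers.serialize(options['format'], objects, indent=4)

You would then run it with something like manage.py dumplimited appname --limit 100 > sample.json.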

The MYYN
+3  A: 

A third-party Django app, django-test-utils, contains a makefixture command, which is basically a smarter dumpdata. You can specify exact model names with ID ranges to export (and it will follow related objects). Example:

    manage.py makefixture --format=xml --indent=4 proj.appname.modelname[1:101] > test.xml

Béres Botond