One of my API calls can result in updates to a large number of objects (Django models). I'm running into performance issues with this since I'm updating each item individually, saving, and moving on to the next:

for item in Something.objects.filter(x='y'):
    item.a = "something"
    item.save()

Sometimes my filter criterion looks like "where x in ('a','b','c',...)".

It seems the official answer to this is "won't fix". I'm wondering what strategies people are using to improve performance in these scenarios.

+1  A: 

You need to use transactions or build the SQL statement by hand. You could also try SQLAlchemy, which supports some great ORM features such as Unit of Work (also known as application transactions).

Django transactions: http://docs.djangoproject.com/en/dev/topics/db/transactions/?from=olddocs

SQLAlchemy: http://www.sqlalchemy.org/
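The hand-written-SQL strategy can be sketched with the standard-library sqlite3 module (the table and column names here are hypothetical stand-ins for the Django model; in a real project Django manages the connection). The point is that a single UPDATE with an IN clause, run in one transaction, replaces N individual saves:

```python
import sqlite3

# In-memory database standing in for the app's backend (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE something (id INTEGER PRIMARY KEY, x TEXT, a TEXT)")
conn.executemany(
    "INSERT INTO something (x, a) VALUES (?, ?)",
    [("a", ""), ("b", ""), ("z", ""), ("c", "")],
)

# Build one UPDATE covering all target rows, using placeholders to stay
# safe against SQL injection; the context manager commits it as a single
# transaction (or rolls back on error).
targets = ["a", "b", "c"]
placeholders = ", ".join("?" for _ in targets)
with conn:
    conn.execute(
        "UPDATE something SET a = ? WHERE x IN (%s)" % placeholders,
        ["something"] + targets,
    )

rows = conn.execute("SELECT x, a FROM something ORDER BY id").fetchall()
print(rows)
```

One round trip to the database instead of one per row is where the performance win comes from; the same idea applies whether the statement is built by hand or generated by the ORM.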

Loki
+4  A: 

The ticket you linked to is for bulk creation. If you're not relying on an overridden save method or pre/post-save signals to do work on save, QuerySet has an update method which you can use to perform a single UPDATE on the filtered rows:

Something.objects.filter(x__in=['a', 'b', 'c']).update(a='something')
insin
Nice, that looks like it'll cover a lot of cases. Will give that a try.
Parand
Worked great, much thanks.
Parand