When I use extra in a certain way on a Django queryset (call it qs), the result of qs.count() is different from len(qs.all()). To reproduce:
Make an empty Django project and app, then add a trivial model:
class Baz(models.Model):
    pass
Now make a few objects:
>>> Baz(id=1).save()
>>> Baz(id=2).save()
>>> Baz(id=3).save()
>>> Baz(id=4).save()
Using the extra method to select only some of them produces the expected count:
>>> Baz.objects.extra(where=['id > 2']).count()
2
>>> Baz.objects.extra(where=['-id < -2']).count()
2
But add a select clause to the extra and refer to it in the where clause, and the count is suddenly wrong, even though the result of all() is correct:
>>> Baz.objects.extra(select={'negid': '0 - id'}, where=['"negid" < -2']).all()
[<Baz: Baz object>, <Baz: Baz object>] # As expected
>>> Baz.objects.extra(select={'negid': '0 - id'}, where=['"negid" < -2']).count()
0 # Should be 2
I think the problem has to do with django.db.models.sql.query.BaseQuery.get_count(). It checks whether the BaseQuery's select or aggregate_select attributes have been set; if so, it uses a subquery. But django.db.models.sql.query.BaseQuery.add_extra adds only to the BaseQuery's extra attribute, not select or aggregate_select.
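If that diagnosis is right, count() ends up running a flat COUNT(*) query whose where clause references an alias the query never defines. Here is a minimal sqlite3 sketch of a plausible mechanism, assuming the default SQLite backend; the table name app_baz and the exact SQL are illustrative, not necessarily what Django literally emits:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE app_baz (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO app_baz (id) VALUES (?)",
                 [(i,) for i in (1, 2, 3, 4)])

# Roughly what all() runs: the extra select defines the negid alias,
# so the where clause can legitimately reference it.
rows = conn.execute(
    'SELECT (0 - id) AS negid, app_baz.* FROM app_baz WHERE "negid" < -2'
).fetchall()
print(len(rows))  # 2 (ids 3 and 4)

# Roughly what count() runs: no subquery, so the alias is never defined.
# SQLite's double-quoted-string fallback quietly treats the unknown
# "negid" as the string literal 'negid', and a text value never compares
# less than an integer, so every row is filtered out.
count = conn.execute(
    'SELECT COUNT(*) FROM app_baz WHERE "negid" < -2'
).fetchone()[0]
print(count)  # 0
```

That would explain a silent 0 rather than an error: the broken COUNT query is still syntactically valid SQL on SQLite.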
How can I fix the problem? I know I could just use len(qs.all()), but it would be nice to be able to pass the extra'ed queryset to other parts of the code, and those parts may call count() without knowing that it's broken.
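As a stopgap, counting can be routed through Python-side evaluation so downstream code never hits the broken SQL COUNT path. A sketch (safe_count is a hypothetical helper, not a Django API):

```python
def safe_count(qs):
    """Count by iterating the queryset instead of issuing COUNT(*) SQL.

    Works on any iterable, so it behaves identically on querysets and
    plain lists. Note it fetches every row, which is acceptable for
    small result sets but costly for large ones.
    """
    return sum(1 for _ in qs)


# Duck-typed usage; with Django this would be e.g.
# safe_count(Baz.objects.extra(select={'negid': '0 - id'},
#                              where=['"negid" < -2']))
print(safe_count([1, 2, 3]))  # 3
print(safe_count(iter([])))   # 0
```

This only helps code you control, though; third-party code that calls qs.count() directly will still get the wrong answer.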