views: 195
answers: 4
So I've been building Django applications for a while now, and drinking the Kool-Aid and all: only using the ORM and never writing custom SQL.

The main page of the site (the primary interface, where users will spend 80%-90% of their time) was getting slow once you have a large amount of user-specific content (i.e. photos, friends, other data, etc.).

So I popped in the SQL logger (it was pre-installed with Pinax; I just enabled it in the settings), and imagine my surprise when it reported over 500 database queries! With hand-coded SQL I hardly ever ran more than 50 on the most complex pages.

In hindsight it's not altogether surprising, but it seems that this can't be good.

...even if only a dozen or so of the queries take 1ms+

So I'm wondering: how much overhead is there on a round trip to MySQL? Django and MySQL are running on the same server, so there shouldn't be any networking-related overhead.

+1  A: 

There is always overhead in database calls. In your case the overhead is not that bad, because the application and database are on the same machine so there is no network latency, but there is still a significant cost.

When you make a request to the database, it has to prepare to service that request by doing a number of things, including:

  • Allocating resources (memory buffers, temp tables, etc.) to the database server connection/thread that will handle the request.
  • De-serializing the SQL and parameters (this is necessary even on one machine, since it is an inter-process request, unless you are using an embedded database).
  • Checking whether the query exists in the query cache; if not, optimizing it and putting it in the cache.
    • Note also that if your queries are not parameterized (that is, the values are not separated from the SQL), you may get cache misses for statements that should be the same, meaning each request gets analyzed and optimized from scratch (see the sketch below).
  • Processing the query.
  • Preparing and returning the results to the client.

This is just an overview of the kinds of things most database management systems do to process an SQL request. You incur this overhead 500 times, even if each query itself runs relatively quickly. Bottom line: database interactions, even with a local database, are not as cheap as you might expect.
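As a minimal sketch of the parameterization point, using Django's raw cursor API (the app_photo table and its columns are made-up names for illustration):

    from django.db import connection

    def photo_ids_for_user(user_id):
        with connection.cursor() as cursor:
            # Bad: interpolating the value into the SQL string means every
            # distinct user_id produces a different statement text, defeating
            # the query cache (and risking SQL injection):
            #   cursor.execute("SELECT id FROM app_photo WHERE user_id = %d" % user_id)

            # Good: parameterized -- the statement text is identical on every
            # call, and the driver handles quoting/escaping.
            cursor.execute("SELECT id FROM app_photo WHERE user_id = %s", [user_id])
            return [row[0] for row in cursor.fetchall()]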

Tendayi Mawushe
+3  A: 

The overhead of each query is only part of the picture. The actual round-trip time between your Django and MySQL servers is probably very small, since most of your queries are coming back in less than a millisecond. The bigger problem is that the sheer number of queries issued can quickly overwhelm the database. 500 queries for a page is way too much, and even 50 seems like a lot to me. If ten users view complicated pages, you're now up to 5000 queries.

The round-trip time to the database server is more of a factor when the caller is accessing the database over a wide area network, where round trips can easily take between 20 ms and 100 ms.

I would definitely look into using some kind of caching.
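For example, Django's low-level cache API can hold the results of the heavy per-user queries between requests. This is just a sketch: build_dashboard_data is a hypothetical helper standing in for your expensive queries, and the 300-second timeout is arbitrary.

    from django.core.cache import cache

    def get_dashboard_data(user_id):
        key = "dashboard:%d" % user_id
        data = cache.get(key)
        if data is None:
            data = build_dashboard_data(user_id)  # runs the expensive queries
            cache.set(key, data, 300)             # keep the result for five minutes
        return data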

BeWarned
+2  A: 

There are some ways to reduce the query volume.

  1. Use .filter() and .all() to get a bunch of things; pick and choose in the view function (or template via {% if %}). Python can process a batch of rows faster than MySQL.

    "But I could send too much to the template". True, but you'll execute fewer SQL requests. Measure to see which is better.

    This is what you used to do when you wrote SQL. It's not wrong -- it doesn't break the ORM -- but it optimizes the underlying DB work and puts the processing into the view function and the template.

  2. Avoid query navigation in the template. When you do {{ foo.bar.baz.quux }}, SQL is used to get the bar associated with foo, then the baz associated with that bar, then the quux associated with that baz. You may be able to reduce this query business with some careful .filter() and Python processing to assemble a useful tuple in the view function.

    Again, this was something you used to do when you hand-crafted SQL. In this case, you gather larger batches of ORM-managed objects in the view function and do your filtering in Python instead of via a lot of individual ORM requests.

    This doesn't break the ORM. It changes the usage profile from lots of little queries to a few bigger queries; see the sketch after this list.
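Here is a minimal sketch of both ideas under assumed model names (Photo, Album, and their fields are hypothetical): fetch a batch with .filter(), pull the related rows in one extra query, and assemble the tuples in the view so the template never navigates relationships.

    from django.shortcuts import render
    from myapp.models import Photo, Album  # hypothetical app and models

    def dashboard(request):
        photos = list(Photo.objects.filter(owner=request.user))

        # One query for all the albums at once, instead of one query per
        # {{ photo.album }} navigation in the template.
        albums = Album.objects.in_bulk([p.album_id for p in photos])

        # Assemble exactly what the template needs; it just iterates over rows.
        rows = [(p, albums.get(p.album_id)) for p in photos]
        return render(request, "dashboard.html", {"rows": rows})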

S.Lott
+4  A: 

Just because you are using an ORM doesn't mean that you shouldn't do performance tuning.

Like you, I had a home page in one of my applications with low performance. I saw that I was doing hundreds of queries to display that page. I went looking at my code and realized that with some careful use of select_related() my queries would bring back more of the data I needed in one go; I went from hundreds of queries to tens.
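A sketch of what that kind of change looks like (Comment and its fields are hypothetical names):

    # Before: one query for the comments, plus one extra query every time
    # the template touches comment.author or comment.post.
    comments = Comment.objects.filter(post__id=42)

    # After: select_related() follows the foreign keys in a single JOINed
    # query, so the template navigation hits data already in memory.
    comments = Comment.objects.filter(post__id=42).select_related("author", "post")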

You can also run a SQL profiler and see whether there are indices that would help your most common queries; you know, standard database stuff.
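For instance, once EXPLAIN (or the slow query log) points at a hot column, you can declare the index in the model itself. A sketch with hypothetical field names:

    from django.contrib.auth.models import User
    from django.db import models

    class Photo(models.Model):
        # ForeignKey columns are indexed automatically by Django.
        owner = models.ForeignKey(User, on_delete=models.CASCADE)
        # Index a column that shows up in frequent WHERE / ORDER BY clauses.
        uploaded_at = models.DateTimeField(db_index=True)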

Caching is also your friend, I would think. If a lot of a page is not changing, do you need to query the database every single time?
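A sketch of per-view caching for pages like that; vary_on_cookie keeps a separate cached copy per session so users don't see each other's pages (the view name is hypothetical):

    from django.views.decorators.cache import cache_page
    from django.views.decorators.vary import vary_on_cookie

    @cache_page(60 * 5)   # serve the rendered page from cache for five minutes
    @vary_on_cookie       # cache separately per session cookie
    def dashboard(request):
        ...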

If all else fails, remember: the ORM is great, and yes, you should try to use it because it is the Django philosophy; but you are not married to it.

If you really have a use case where studying and tuning the ORM navigation didn't help, and you are sure you could do much better with a hand-written query, use raw SQL for that case (see the sketch below).
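A minimal sketch, assuming made-up table and column names; Manager.raw() runs the hand-written SQL but still hands back model instances:

    # Twenty most recent photos in albums the user owns, in one hand-tuned query.
    photos = Photo.objects.raw(
        "SELECT p.* FROM app_photo p "
        "JOIN app_album a ON a.id = p.album_id "
        "WHERE a.owner_id = %s "
        "ORDER BY p.uploaded_at DESC LIMIT 20",
        [request.user.pk],
    )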

celopes