I have web pages that take 10-20 database queries to get all the required data.

Normally, after a query is sent, the Django thread/process blocks waiting for the results to come back, and only then resumes execution until it reaches the next query.

Is there any way to issue all the queries asynchronously so that they can be processed by the database server(s) in parallel?

I'm using MySQL, but I'd like to hear about solutions for other databases too. For example, I've heard that PostgreSQL has an asynchronous client library - how would I use it in this case?
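
From a quick look at psycopg2's asynchronous mode, I think the flow would be roughly the sketch below: one async connection per query, fire everything off, then wait for the results. This is just my understanding, not tested code - the DSN and the SQL statements are made up, there's no error handling, and older psycopg2 versions spell the keyword async=1 rather than async_=1.

    import psycopg2
    import psycopg2.extras

    DSN = "dbname=mydb user=myuser host=localhost"   # hypothetical connection string

    def start_query(sql, params=None):
        """Open an asynchronous connection and start a query on it without blocking."""
        conn = psycopg2.connect(DSN, async_=1)       # async=1 on older psycopg2
        psycopg2.extras.wait_select(conn)            # wait until the connection is established
        cur = conn.cursor()
        cur.execute(sql, params)                     # returns immediately; the query runs on the server
        return conn, cur

    # Kick off several queries; the server can work on them in parallel.
    pending = [start_query(sql) for sql in (
        "SELECT count(*) FROM auth_user",            # made-up example queries
        "SELECT id, title FROM blog_post LIMIT 10",
    )]

    # Now block on each one in turn and collect the results.
    results = []
    for conn, cur in pending:
        psycopg2.extras.wait_select(conn)            # blocks until this query has finished
        results.append(cur.fetchall())
        conn.close()

I'm not sure how well that plays with the Django ORM, though, which is part of why I'm asking.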

+1  A: 

This very recent blog entry seems to imply that it isn't built into either the Django or Rails frameworks. I think it covers the issue well and is well worth a read, along with the comments.

http://www.eflorenzano.com/blog/post/how-do-we-kick-our-synchronous-addiction/

I think I remember Cal Henderson mentioning this deficiency somewhere in his excellent talk: http://www.youtube.com/watch?v=i6Fr65PFqfk

My naive guess is that you might be able to hack something together with separate Python libraries, but you would lose a lot of the ORM/template lazy evaluation that Django gives you, to the point where you might as well be using another stack. Then again, if you're only optimizing a few views in a large Django project, it might be fine. A rough sketch of that kind of hack is below.
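
Something along these lines, for example - a thread-pool sketch, not a tested recipe. It assumes a recent Python with concurrent.futures, made-up Article/Comment models and a made-up view, and that each worker thread gets (and closes) its own database connection:

    from concurrent.futures import ThreadPoolExecutor

    from django.db import connection
    from django.shortcuts import render

    from myapp.models import Article, Comment   # hypothetical app and models


    def _run(queryset):
        """Force evaluation of a lazy queryset inside a worker thread."""
        try:
            return list(queryset)                # the database round-trip happens here
        finally:
            connection.close()                   # each thread holds its own connection


    def dashboard(request):
        with ThreadPoolExecutor(max_workers=4) as pool:
            articles_f = pool.submit(_run, Article.objects.order_by("-created")[:10])
            comments_f = pool.submit(_run, Comment.objects.order_by("-created")[:10])
            # ...submit the rest of the queries the page needs...

            articles = articles_f.result()       # block until each query finishes
            comments = comments_f.result()

        return render(request, "dashboard.html",
                      {"articles": articles, "comments": comments})

The querysets are built lazily in the view and only evaluated inside the workers, so the actual database round-trips overlap; but you've stepped outside what Django supports out of the box, so measure before committing to it.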

michael