I have a Django application that needs to send a large number of emails to users in various use cases. I don't want to handle this synchronously within the application, for obvious reasons.

Does anyone have recommendations for a message queuing server that integrates well with Python, or that they have used on a Django project? The rest of my stack is Apache, mod_python, MySQL.

A: 

If you already have MySQL installed, you could create a table to use as a "todo list" of sorts.

Your request threads add jobs to the table synchronously, and a batch task removes jobs as they're completed.

That way, there's no need to install and learn more software, and it should work fine as a persistent job store so long as you're not sending lots of email (like >10/sec).
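A minimal sketch of that idea as a Django model (the model and field names here are purely illustrative):

    from django.db import models

    class QueuedEmail(models.Model):
        # one row per message waiting to be sent
        to_address = models.EmailField()
        subject = models.CharField(max_length=255)
        body = models.TextField()
        created = models.DateTimeField(auto_now_add=True)
        sent = models.BooleanField(default=False)

Your views then just insert rows with QueuedEmail.objects.create(...), and the batch task marks rows sent as it works through the table.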

Alabaster Codify
+5  A: 

So far I have found no "nice" solution for this. I have somewhat stricter soft real-time requirements (taking a picture of a cardboard box as it is being labeled), so one of these approaches is probably fast enough for you. I assume emails can wait for a few minutes.

  • A "todo list" in the database processed by a cron job.
  • A "todo list" in the database processed permanently beeing polled by a daemon.
  • Using a custom daemon which gets notified by the webserver via an UDP packet (in Production today). Basically my own Queing system with the IP stack for handling the queue.
  • Using ActiveMQ as a message broker - this didn't work out because of stability issues. Also to me Java Daemons are generally somewhat plump
  • Using Update Triggers in CouchDB. Nice but Update Triggers are not meant to do heavy image processing, so no good fit for my problem.
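The UDP approach needs nothing beyond the standard library; a minimal sketch (the host, port, and process_job callable are assumptions):

    import socket

    WORKER_ADDR = ('127.0.0.1', 9999)  # assumed host/port for the daemon

    def notify_worker(job_id):
        # called from the web process after the job row has been written
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.sendto(str(job_id), WORKER_ADDR)  # fire-and-forget
        sock.close()

    def worker_loop(process_job):
        # the daemon blocks on the socket and handles jobs as they arrive
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(WORKER_ADDR)
        while True:
            data, _ = sock.recvfrom(1024)
            process_job(int(data))

Note that UDP delivery is not guaranteed, which is why the job itself still lives in the database; a lost packet only delays processing until the next notification or sweep.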

So far I haven't tried RabbitMQ or XMPP/ejabberd for handling the problem, but they are on my list of things to try next. RabbitMQ got decent Python connectivity during 2008, and there are tons of XMPP libraries.

But perhaps all you need is a correctly configured mail server on the local machine. That would let you dump mails synchronously into the local mail server and make your whole software stack much simpler.
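If you go that route, Django's built-in mail settings just need to point at the local MTA, e.g.:

    # settings.py -- hand outgoing mail to a local MTA (Postfix, Exim, ...)
    # which does its own queuing and retrying
    EMAIL_HOST = 'localhost'
    EMAIL_PORT = 25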

mdorseif
Which version of ActiveMQ did you try? Where I work, we have written an entire NMS based on it and had issues with the newer version but not the older 4.x release.
fuzzy-waffle
I dimly remember we used 4.x "stable" and beta releases.
mdorseif
BTW: we are now using RabbitMQ in production and are very happy bunnies so far.
mdorseif
A: 

You can also use Twisted for this, but it won't play well with Django in all cases; it's very dependent on the deployment scenario. The most important thing is that every request has to be served by one Python process, so you need Apache compiled in threaded mode.

Vasil
+1  A: 

Just add the emails to a database, and then write another script, run by some task scheduler utility (cron comes to mind), to send the emails.
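A sketch of such a script, assuming a hypothetical QueuedEmail model holds the pending messages:

    # send_queued.py -- crontab entry: * * * * * python /path/to/send_queued.py
    import os
    os.environ['DJANGO_SETTINGS_MODULE'] = 'settings'  # adjust to your project

    from django.core.mail import send_mail
    from myapp.models import QueuedEmail  # hypothetical app and model

    for msg in QueuedEmail.objects.filter(sent=False)[:100]:  # cap each run
        send_mail(msg.subject, msg.body, 'noreply@example.com', [msg.to_address])
        msg.sent = True
        msg.save()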

nosklo
+12  A: 

In your specific case, where it's just an email queue, I would take the easy way out and use django-mailer. As a nice side bonus, there are other pluggable projects that are smart enough to take advantage of django-mailer when they see it in the stack.
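The point of django-mailer is that it keeps the familiar send_mail signature but only queues the message to the database; a separate management command, run from cron, does the actual delivery. Roughly (check the project's docs for the exact names):

    from mailer import send_mail  # drop-in for django.core.mail.send_mail

    # this returns immediately; the message is stored, not sent
    send_mail('Welcome', 'Thanks for signing up.',
              'noreply@example.com', ['user@example.com'])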

As for more general queue solutions, I haven't been able to try any of these yet, but here's a list of ones that look more interesting to me:

  1. pybeanstalk/beanstalkd
  2. Python interface to Gearman (which is probably much more interesting now with the release of the C version of Gearman)
  3. memcacheQ
  4. stomp
Van Gale
For this specific need, avoid the hassles and use django-mailer; it works great. +1
Carl Meyer
I'm now experimenting with http://pypi.python.org/pypi/celery/ and it's looking very very good.
Van Gale
+6  A: 

Stompserver is a good option. It's lightweight, easy to install and easy to use from Django/python.

We have a system using stompserver in production for sending out emails and processing other jobs asynchronously.

Django saves the emails to the database; a model post_save handler in Django sends an event to stompserver, and stompserver passes the event to a consumer process, which does the asynchronous work (sending the email).
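A rough sketch of the producing side using the stomp.py client library; the model, queue name, and connection details are assumptions, and the exact call signatures vary between stomp.py versions:

    import stomp
    from django.db.models.signals import post_save
    from myapp.models import OutgoingEmail  # hypothetical model

    def on_email_saved(sender, instance, **kwargs):
        # tell a consumer there is a new email row to process
        conn = stomp.Connection([('localhost', 61613)])  # stompserver default
        conn.connect()
        conn.send('/queue/emails', str(instance.id))
        conn.disconnect()

    post_save.connect(on_email_saved, sender=OutgoingEmail)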

It scales up quite nicely because you can add consumer processes at runtime - two consumers can send twice as many emails, and the consumers can be on separate machines. One slight complication is that each consumer needs its own named queue, so Django needs to know how many consumers are available and send events to each queue in round-robin fashion (two consumers listening on the same queue would both receive every message, i.e. duplication). If you only want one consumer process, this isn't an issue.
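The round-robin dispatch can be as simple as cycling through the known queue names; a sketch (note that with multiple web processes, each process keeps its own cycle):

    import itertools

    # one named queue per consumer process
    QUEUES = itertools.cycle(['/queue/emails.1', '/queue/emails.2'])

    def pick_queue():
        # each call returns the next queue in round-robin order
        return QUEUES.next()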

We previously had processes which polled the database continuously for jobs but found that it was adding a lot of load to the system, even when nothing needed to be processed.

Gruff
A: 

Celery Rocks!!!!

Sameer
A: 

You might want to have a look at pymq. It's written in Python, talks HTTP with its clients, and offers a host of monitoring and management options for queues.

dhruvbird
A: 

Is there anything wrong with solving this using the mail infrastructure itself? For example, each app server could run its own mail daemon, which queues any locally submitted mail and forwards it to a centralized mail server that does the heavy lifting of delivery.

Sandip Bhattacharya