I'm building a mobile photo sharing site in Python similar to TwitPic and have been exploring various queues to handle the image processing. I've looked into RabbitMQ and ActiveMQ but I'm thinking that there is a better solution for my use case. I'm looking for something a little more lightweight. I'm open to any suggestions.
You could write a daemon that uses Python's built-in multiprocessing module and its Queue class.
All you should have to do is set up a pool of workers, and have them wait on jobs from the Queue. Your main process can dump new jobs into the Queue, and you're good to go.
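Something like the following, where process_image stands in for your actual resizing logic:

```python
import multiprocessing

def process_image(image_path):
    # Placeholder for your real thumbnailing/resizing code.
    print("processing", image_path)

def worker(queue):
    # Each worker blocks on the queue and handles jobs as they arrive.
    while True:
        image_path = queue.get()
        if image_path is None:  # sentinel value: shut down cleanly
            break
        process_image(image_path)

if __name__ == "__main__":
    queue = multiprocessing.Queue()
    pool = [multiprocessing.Process(target=worker, args=(queue,))
            for _ in range(4)]
    for p in pool:
        p.start()

    # The main process dumps new jobs into the queue.
    for path in ["a.jpg", "b.jpg", "c.jpg"]:
        queue.put(path)

    # One sentinel per worker tells them all to exit.
    for _ in pool:
        queue.put(None)
    for p in pool:
        p.join()
```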
Are you considering a single-machine architecture or a cluster of machines? Forwarding the image to an available worker process on the same machine or on a different machine isn't profoundly different, particularly if you use TCP sockets. What gradually makes the problem more complicated is knowing which workers are available, spawning more when necessary and when resources allow, handling a crashed worker gracefully, and so on.
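To illustrate just the forwarding step, here's a rough sketch of pushing raw image bytes to a worker over a TCP socket (the 4-byte length prefix is one common way to frame the message; host and port are whatever your worker listens on):

```python
import socket
import struct

def forward_image(host, port, image_bytes):
    # Connect to the worker and send a length-prefixed payload so the
    # receiver knows exactly how many bytes to read.
    with socket.create_connection((host, port)) as sock:
        sock.sendall(struct.pack(">I", len(image_bytes)))
        sock.sendall(image_bytes)
```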
It could be something as simple as using httplib to push the image to a private server running Apache or Twisted and a collection of CGI applications. When you add another server, round-robin the requests amongst them.
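A minimal sketch of that approach (httplib is named http.client in Python 3; the /cgi-bin/process_image endpoint and the server hostnames are made up):

```python
import http.client  # httplib in Python 2
import itertools

def push_image(host, image_bytes):
    # POST the raw image body to a CGI script on the private server.
    conn = http.client.HTTPConnection(host, 80)
    conn.request("POST", "/cgi-bin/process_image", body=image_bytes,
                 headers={"Content-Type": "image/jpeg"})
    status = conn.getresponse().status
    conn.close()
    return status == 200

# Round-robin uploads across however many servers you have.
servers = itertools.cycle(["img1.internal", "img2.internal"])
push_image(next(servers), open("photo.jpg", "rb").read())
```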
Gearman is good in that it can optionally synchronize multiple jobs executed across multiple workers.
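With the python-gearman client library (an assumption on my part; the 'resize' task name and the do_resize helper are hypothetical), the two sides look roughly like this:

```python
# --- worker.py ---
import gearman  # assumes the python-gearman client package

worker = gearman.GearmanWorker(['localhost:4730'])

def resize_image(gearman_worker, gearman_job):
    # gearman_job.data carries the submitted payload, e.g. an image path.
    return do_resize(gearman_job.data)  # do_resize is a hypothetical helper

worker.register_task('resize', resize_image)
worker.work()  # blocks, waiting for jobs

# --- client.py (a separate process) ---
client = gearman.GearmanClient(['localhost:4730'])
# submit_job blocks until the worker finishes by default, which is the
# synchronization behaviour mentioned above.
result = client.submit_job('resize', '/uploads/photo.jpg')
```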
I've used beanstalkd successfully in a few high-volume applications.
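With the beanstalkc client library (again my assumption; process_image is a placeholder), producing and consuming jobs looks roughly like:

```python
import beanstalkc  # assumes the beanstalkc client library

queue = beanstalkc.Connection(host='localhost', port=11300)

# Producer: enqueue the path of a freshly uploaded image.
queue.put('/uploads/photo.jpg')

# Consumer (typically a separate process): reserve() blocks until a
# job is available.
job = queue.reserve()
process_image(job.body)  # process_image is a placeholder
job.delete()  # remove the job once it has been handled
```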
beanstalkd is better suited to async jobs, while Gearman gives you more flexibility when you'd like to block on job execution.