views: 128
answers: 3
We have a collection of Unix scripts (and/or Python modules) that each perform a long-running task. I would like to provide a web interface for them that does the following:

  • Asks for relevant data to pass into scripts.
  • Allows for starting/stopping/killing them.
  • Allows for monitoring the progress and/or other information provided by the scripts.
  • Possibly some kind of logging (although the scripts already do logging).

I do know how to write a server that does this (e.g. by using Python's built-in HTTP server/JSON), but doing this properly is non-trivial and I do not want to reinvent the wheel.

Are there any existing solutions that allow for maintaining asynchronous server-side tasks?

+1  A: 

Django is great for writing web applications, and the subprocess module (subprocess.Popen and .communicate()) is great for executing shell scripts. You can give it stdin, stdout, and stderr streams for communication if you want.
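A minimal sketch of that subprocess approach (the script path, arguments, and payload here are placeholders you would collect from the web form):

```python
import subprocess

def run_task(script, *args, payload=None):
    """Launch a long-running script and exchange data over its pipes."""
    proc = subprocess.Popen(
        [script, *args],
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
        text=True,
    )
    # communicate() writes the payload to stdin, waits for the
    # process to finish, and returns its captured output streams.
    out, err = proc.communicate(payload)
    return proc.returncode, out, err
```

For the start/stop/kill requirement, keeping the Popen object around gives you proc.terminate() and proc.kill(); for progress monitoring you would read stdout incrementally instead of using communicate().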

extraneon
Hah! I suspected that Django would be one of the first answers; however, while a nice framework that makes some things easier, I would still have to implement most of the needed interface. I was specifically hoping to avoid having to write all the views for monitoring running tasks.
knipknap
Wouldn't that make it more of a question for SuperUser? It sounds like the kind of thing system administrators do.
extraneon
A: 

I would use SGE (Sun Grid Engine), but I think it could be overkill for your needs...

HeMan
+1  A: 

Answering my own question, I recently saw the announcement of Celery 1.0, which seems to do much of what I am looking for.

knipknap