I have an "appliance" (for lack of a better description) running Linux.

Currently I ssh into the box to launch jobs. This isn't friendly enough for my users, so I'm putting together a simple web UI to launch the script. A job runs for anywhere from 10 seconds to several hours. The web UI needs to reflect the status of the job.

I've solved similar problems in the past by running a daemon on the server that watches a spool directory (or db table) for new job requests, spawns a process, monitors the process, and provides info for the web UI in a db table or status file. The web UI then drops job requests into the spool dir (db) and occasionally checks the status file (db). This might be overkill for this task.

For the current task, I am considering spawning the job from the CGI and occasionally checking a status file that the job writes as it progresses or exits.
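Roughly what I'm considering (untested sketch; run_job.sh and the status file path are placeholders):

    # launch.py -- untested sketch called from the CGI handler
    import subprocess

    STATUS_FILE = "/var/tmp/job.status"   # placeholder; the job writes its progress here

    def launch_job():
        # Fire and forget: the CGI returns immediately, the job keeps running
        devnull = open("/dev/null", "w")
        subprocess.Popen(["/usr/local/bin/run_job.sh"],
                         stdout=devnull, stderr=devnull, close_fds=True)

    def read_status():
        # Called on each page load / poll to show progress
        try:
            return open(STATUS_FILE).read()
        except IOError:
            return "no status yet"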

My question: is there a better (simpler/faster-to-write/more robust) way to do this? Are there existing patterns or tools that I should know about?

(Python solutions are ideal.)

Thanks.

A: 

I'm not sure I understood your problem correctly, but I assume you have multiple "jobs" that can run simultaneously and want a web page to show whether each one has completed?

When launching a job, the web page (Python and mod_wsgi, for example) would spawn a Python script that enters the job into, say, an SQLite database and then runs it. Once the job completes, the script updates the entry so the job is marked as complete.

The status page would just show what is in the SQLite database.

What you want to put into the DB, in addition to the job ID and perhaps start/end times, depends on what you want to show on your job status web page.
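A minimal sketch of that flow (the table layout and file location are just examples, not tested):

    import sqlite3, subprocess, time

    DB = "/var/tmp/jobs.db"   # example location

    def run_job(command):
        conn = sqlite3.connect(DB)
        conn.execute("""CREATE TABLE IF NOT EXISTS jobs
                        (id INTEGER PRIMARY KEY, command TEXT,
                         started REAL, finished REAL, returncode INTEGER)""")
        cur = conn.execute("INSERT INTO jobs (command, started) VALUES (?, ?)",
                           (command, time.time()))
        job_id = cur.lastrowid
        conn.commit()

        rc = subprocess.call(command, shell=True)   # blocks until the job completes

        conn.execute("UPDATE jobs SET finished = ?, returncode = ? WHERE id = ?",
                     (time.time(), rc, job_id))
        conn.commit()
        conn.close()

    def job_list():
        # The status page just renders whatever this returns
        conn = sqlite3.connect(DB)
        return conn.execute("SELECT id, command, started, finished, returncode "
                            "FROM jobs ORDER BY id DESC").fetchall()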

On a side note: if the "jobs" are compilations, meet Hudson.

Kimvais
Jobs can't run simultaneously, and they aren't compilations. (The "similar problem" I solved in the past was Hudson-like.) The appliance is controlling external devices. Thanks for the answer, this is essentially what I've done before.
bstpierre
+2  A: 

I do this in a number of projects: a web app (mostly Python/CGI) spawns a separate Python script (using subprocess), which immediately daemonizes itself to do the work (the python-daemon package handles that part). The web app then issues AJAX requests to check on the daemon's progress (I use simple text files for communication; a database would probably be better). One nice touch is to have the daemon email the end user once it finishes, with a link to retrieve the results. That way the user can close their browser for jobs that take hours.
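A rough sketch of the worker side, assuming the python-daemon package; the paths, addresses, and command handling are placeholders:

    # worker.py -- rough sketch; the web app spawns this script and returns immediately
    import daemon, smtplib, subprocess, sys
    from email.mime.text import MIMEText

    def main(job_id, command, user_email):
        with daemon.DaemonContext():                      # detach from the web server process
            status_path = "/var/tmp/job-%s.txt" % job_id  # the AJAX handler polls this file
            open(status_path, "w").write("running\n")
            rc = subprocess.call(command, shell=True)     # do the actual work
            open(status_path, "w").write("done rc=%d\n" % rc)

            # Tell the user it's finished so they don't have to keep the browser open
            msg = MIMEText("Job %s finished with exit code %d." % (job_id, rc))
            msg["Subject"] = "Job %s complete" % job_id
            msg["From"] = "appliance@example.com"
            msg["To"] = user_email
            s = smtplib.SMTP("localhost")
            s.sendmail(msg["From"], [user_email], msg.as_string())
            s.quit()

    if __name__ == "__main__":
        main(sys.argv[1], sys.argv[2], sys.argv[3])

The web app side is just subprocess.Popen([sys.executable, "worker.py", job_id, command, email]) followed by an immediate return; the AJAX status handler only reads the text file.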

Mark
Mark, thanks for the tip on python-daemon. Does this support 2.6 or just 3.x?
bstpierre
I just installed it and ran all the tests successfully under 2.5. If you want a simpler solution (looking back, this is what I'm actually using), here's a stand-alone code snippet: http://code.activestate.com/recipes/278731/
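The core of that recipe is the classic double fork, roughly (pidfile and signal handling omitted):

    import os, sys

    def daemonize():
        if os.fork() > 0:        # first fork: the parent returns control to the caller
            sys.exit(0)
        os.setsid()              # start a new session, drop the controlling terminal
        if os.fork() > 0:        # second fork: the daemon can never reacquire a terminal
            sys.exit(0)
        os.chdir("/")
        os.umask(0)
        devnull = os.open(os.devnull, os.O_RDWR)
        for fd in (0, 1, 2):     # point stdin/stdout/stderr at /dev/null
            os.dup2(devnull, fd)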
Mark
Sweet. python-daemon was news to me too; now that I look at it, I notice I have stuff running that I've done so wrong...
Kimvais