views:

45

answers:

2

Hi everyone,

I've got a project running under mod_perl that shows some information about a host. The page has a text box with a dropdown that lets users ping/nslookup/traceroute the host. The output is streamed into the text box, like a tail -f.

It worked great under CGI. When the user requested a ping, an AJAX call was made to the server, which started the ping with the output going to a temp file. Subsequent AJAX calls would 'tail' the file so that the output was updated until the ping finished. Once the job finished, the temp file was removed.

However, under mod_perl, no matter what I do I can't stop it from creating zombie processes. I've tried everything: double forking, IPC::Run, etc. And in any case, system calls are discouraged under mod_perl.

So my question is, maybe there's a better way to do this? Is there a CPAN module available for creating command line jobs and tailing output that will work under mod_perl? I'm just looking for some suggestions.

I know I could probably create some sort of 'job' daemon that I signal with details and get updates from. It would run the commands and keep track of their status etc. But is there a simpler way?

Thanks in advance.

A: 

See if this answer from brian d foy helps:

http://stackoverflow.com/questions/2711520/how-can-i-run-perl-system-commands-in-the-background/2715086#2715086

DVK
This should be a comment, not an answer.
Ether
Yeah, I've looked at some of the CPAN modules in that post; it put me in a good direction. But I need a module to persist the job status, so that each time the AJAX script is called it can load the status of the job (by an ID or something) from a file/db/whatever. Does anyone know of a CPAN module that does this? I'll keep searching...
Matthew
@Matthew - to be honest, I'd go with the straightforward approach... make a unique ID for the original job (say, a combination of $$ and a timestamp), incorporate it into the filename, and return the ID from the original request to be stored in the page... then each AJAX call sends that ID to the server, and the script reading the log can reconstruct the log file name from it
DVK
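
A minimal sketch of that ID scheme in Perl (the sub names and the /tmp/jobs path are illustrative, not from the actual code):

```perl
use strict;
use warnings;

# Build a unique job ID from the PID and a timestamp, as suggested
# above, and derive the log file name from it.
sub new_job_id {
    return sprintf '%d-%d', $$, time();
}

sub log_path_for {
    my ($job_id) = @_;
    # Accept only the characters we generate, so a client-supplied ID
    # can't smuggle in a path traversal.
    die "bad job id\n" unless $job_id =~ /\A\d+-\d+\z/;
    return "/tmp/jobs/$job_id.log";
}
```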
Yep, I've done something similar to that, I'll post my solution.
Matthew
A: 

Hi everyone, I had a short timeframe on this one and no luck with CPAN, so I'll provide my solution here (I probably re-invented the wheel), since I had to get something done right away.

I'll use ping in this example.

When the user requests a ping, the AJAX script creates a record in a database with the details of the ping (host, interval, count, etc.). The record has an auto-incrementing ID field. It then sends a SIGHUP to a job daemon, which is just a daemonized Perl script.
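
As a rough sketch of that step, assuming DBI with SQLite and a conventional PID file for the daemon (the table layout, column names, and the /var/run/jobd.pid path are all assumptions, not the actual schema from this answer):

```perl
use strict;
use warnings;
use DBI;

# Queue a job row and nudge the daemon with SIGHUP.
my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
                       { RaiseError => 1, AutoCommit => 1 });

$dbh->do(q{
    CREATE TABLE IF NOT EXISTS jobs (
        id      INTEGER PRIMARY KEY AUTOINCREMENT,
        command TEXT,
        host    TEXT,
        status  TEXT,
        pid     INTEGER
    )
});

$dbh->do('INSERT INTO jobs (command, host, status) VALUES (?, ?, ?)',
         undef, 'ping', 'example.com', 'pending');
my $job_id = $dbh->last_insert_id(undef, undef, 'jobs', 'id');

# Wake the daemon; on SIGHUP it re-scans the table for 'pending' rows.
if (open my $fh, '<', '/var/run/jobd.pid') {
    chomp(my $daemon_pid = <$fh> // '');
    kill 'HUP', $daemon_pid if $daemon_pid;
}

print "queued job $job_id\n";
```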

This job daemon receives the SIGHUP, looks for new jobs in the database, and processes each one. When it gets a new job, it forks, writes the child PID and a 'running' status to the DB record, opens stdout/stderr files named after the unique job ID, and uses IPC::Run to direct the command's STDOUT/STDERR to those files.
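
The fork-and-redirect step might look something like this sketch. IPC::Run is used as in the answer, but the $on_start callback (standing in for the DB status update), the sub name, and the /tmp/jobs layout are assumptions:

```perl
use strict;
use warnings;
use IPC::Run qw(run);

# One possible shape for the daemon's per-job fork. The parent records
# the child PID; the child runs the command with its output captured
# into per-job files for the AJAX side to tail.
sub start_job {
    my ($job_id, $on_start, @cmd) = @_;
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid) {
        # Parent (the daemon): write PID and 'running' status via callback.
        $on_start->($job_id, $pid);
        return $pid;
    }
    # Child: stdout/stderr go to files named after the job ID.
    open my $out, '>', "/tmp/jobs/$job_id.out" or die $!;
    open my $err, '>', "/tmp/jobs/$job_id.err" or die $!;
    run \@cmd, '>', $out, '2>', $err;
    exit($? >> 8);
}
```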

The job daemon keeps track of the forked jobs, killing them if they run too long etc.
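
The housekeeping part (reaping children so they never linger as zombies, and killing jobs that run too long) could be sketched like this; the %jobs bookkeeping, the callback, and the timeout value are illustrative:

```perl
use strict;
use warnings;
use POSIX qw(:sys_wait_h);

# One housekeeping pass of the daemon's main loop. %jobs maps
# child PID => { id => ..., started => ... }; $mark_done stands in
# for the DB status update.
our %jobs;
my $MAX_RUNTIME = 60;   # seconds; an assumed limit

sub reap_and_timeout {
    my ($mark_done) = @_;
    # Reap every finished child without blocking.
    while ((my $pid = waitpid(-1, WNOHANG)) > 0) {
        my $job = delete $jobs{$pid} or next;
        $mark_done->($job->{id});
    }
    # TERM anything that has run too long; it is reaped on a later pass.
    for my $pid (keys %jobs) {
        if (time() - $jobs{$pid}{started} > $MAX_RUNTIME) {
            kill 'TERM', $pid;
        }
    }
}
```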

To tail the output, the AJAX script sends the job ID back to the browser. Then, on a JavaScript timer, the AJAX script is called again; it checks the status of the job via the database record and tails the output files.
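
The server-side 'tail' can be as simple as reading from the last byte offset the browser acknowledged and returning the new offset for the next poll. A sketch, with the file layout and names as assumptions:

```perl
use strict;
use warnings;
use Fcntl qw(SEEK_SET);

# Return only the bytes appended since $offset, plus the new offset
# for the browser to send on its next poll.
sub tail_since {
    my ($job_id, $offset) = @_;
    my $path = "/tmp/jobs/$job_id.out";
    open my $fh, '<', $path or return ('', $offset);
    seek $fh, $offset, SEEK_SET;
    local $/;                      # slurp everything after the offset
    my $chunk = <$fh> // '';
    my $new_offset = tell $fh;
    close $fh;
    return ($chunk, $new_offset);
}
```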

When the ping finishes, the job daemon sets the record status to 'done'. The AJAX script checks for this on its regular status checks.

One of the reasons I did it this way is that the AJAX script and the job daemon talk through an authenticated channel (the DB).

Matthew