views: 293
answers: 5

Hi,

I need a binary/script (PHP) that does the following.

Start n processes of X in the background and maintain that number of processes.

An example:

- n = 50
- initially 50 processes are started
- a process exits
- 49 are still running
- so 1 should be started again

Please, this is urgent.

Thanks! Michael

P.S.: I posted the same question on SV, which probably makes me very unpopular. I know, but it's still urgent.

A: 

Could you use a Linux crontab and write the number of currently running processes to a DB or file? If you use a DB, the advantage is that you can use a stored procedure and lock the table while writing the process count.

To run the script in the background, you should add & at the end of the call:

# php -f pro.php &
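
As a rough sketch of that cron idea in PHP (worker.php and the lock-file path are placeholder names, with the lock file standing in for the suggested DB/file), a watchdog like this could be scheduled from crontab every minute:

<?php
// Hypothetical cron-driven watchdog; worker.php and the lock-file path are placeholders.
$target = 50;

// A lock file keeps overlapping cron runs from starting extra workers.
$lock = fopen('/tmp/worker-watchdog.lock', 'c');
if (!$lock || !flock($lock, LOCK_EX | LOCK_NB)) {
    exit;                                       // another run is already active
}

// Count php processes whose command line mentions worker.php.
$out = shell_exec('ps --no-headers -C php -o args=');
$running = 0;
foreach (explode("\n", trim((string) $out)) as $line) {
    if (strpos($line, 'worker.php') !== false) {
        $running++;
    }
}

// Start replacements until the target count is reached again.
for ($i = $running; $i < $target; $i++) {
    shell_exec('php -f worker.php > /dev/null 2>&1 &');  // & backgrounds the worker
}

flock($lock, LOCK_UN);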
andres descalzo
A: 

Pseudocode:

for (i=1; i<=50; i++)
  myprocess &
endfor

while true
  while ( $(ps --no-headers -C myprocess | wc -l) < 50 )
    myprocess &
  endwhile
  sleep 1
endwhile

If you translate this to PHP and fix its flaws, it might just do what you want.
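
A rough PHP translation of that loop might look like this ("myprocess" stays a placeholder command, and a short sleep keeps the outer loop from spinning):

<?php
// Sketch of the pseudocode above in PHP; "myprocess" is a placeholder.
$target = 50;

for ($i = 0; $i < $target; $i++) {
    shell_exec('myprocess > /dev/null 2>&1 &');   // initial batch, backgrounded
}

while (true) {
    $running = (int) trim((string) shell_exec('ps --no-headers -C myprocess | wc -l'));
    while ($running < $target) {
        shell_exec('myprocess > /dev/null 2>&1 &');
        $running++;
    }
    sleep(1);                                     // avoid a busy loop
}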

Dennis Williamson
A: 

I would go in the direction that andres suggested. Just put something like this at the top of your pro.php file...

$this_file   = __FILE__;
$final_count = 50;

// List every process whose command line matches "php -f <this file>".
$processes = `ps auwx | grep "php -f $this_file"`;
$processes = explode("\n", $processes);

// +3 allows for the extra entries the pipeline itself produces
// (the trailing newline from explode plus the shell/grep lines).
if (count($processes) > $final_count + 3) {
    exit;
}
// ... remaining code goes here
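
If the +3 fudge factor looks odd, a quick (hypothetical) way to see where the extra entries come from is to print what the backticks actually return:

$raw = `ps auwx | grep "php -f $this_file"`;
foreach (explode("\n", $raw) as $i => $line) {
    // Typically shows the real workers plus the shell/grep lines from the
    // pipeline itself and one empty string from the trailing newline.
    echo "[$i] $line\n";
}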
Dooltaz
What's the 3 added to the final_count for?
Dennis Williamson
One is for the trailing newline left by the explode. When I tested this, the command also saw itself through the shell, which gave me two additional entries. You can tweak the number until you get exactly 50.
Dooltaz
+1  A: 

Have you tried making a PHP Daemon before?

http://kevin.vanzonneveld.net/techblog/article/create%5Fdaemons%5Fin%5Fphp/
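
Separately from that article, a bare-bones supervisor can also be sketched with the pcntl extension: fork the workers directly and start a replacement whenever pcntl_wait() reports that one exited (worker.php and the php path are placeholders):

<?php
// Minimal pcntl-based supervisor sketch; assumes CLI PHP with pcntl enabled.
$target = 50;

function spawn_worker() {
    $pid = pcntl_fork();
    if ($pid === 0) {
        // Child: replace this process with the worker (adjust the php path).
        pcntl_exec('/usr/bin/php', array('-f', 'worker.php'));
        exit(1);  // only reached if exec fails
    }
    return $pid;
}

for ($i = 0; $i < $target; $i++) {
    spawn_worker();
}

// Each time a child exits, start a replacement so $target stay running.
while (true) {
    $pid = pcntl_wait($status);
    if ($pid > 0) {
        spawn_worker();
    }
}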

WebDevEric
+1  A: 

Here's something in Perl I have in my library (and hey, let's be honest, I'm not going to rig this up in PHP just to give you something working in that language right this moment; I'm just using what I can copy and paste).

#!/usr/bin/perl
use threads;
use Thread::Queue;

my @workers;
my $num_threads = shift;
my $dbname      = shift;
my $queue       = new Thread::Queue;

# Start the worker threads; they block on the queue until work arrives.
for (0..$num_threads-1) {
        $workers[$_] = new threads(\&worker);
        print "TEST!\n";
}

# Feed the remaining command-line arguments (file names) into the queue.
while ($_ = shift @ARGV) {
        $queue->enqueue($_);
}

sub worker {
        while ($file = $queue->dequeue) {
                # Each worker runs the external parser once per dequeued file.
                system('./4parser.pl', $dbname, $file);
        }
}

# An undef per worker tells each thread to stop, then wait for them all.
for (0..$num_threads-1) { $queue->enqueue(undef); }
for (0..$num_threads-1) { $workers[$_]->join; }

Whenever one of those system calls finishes up, it moves on to dequeuing. Oh, and damn if I know why I did 0..$num_threads instead of the normal my $i = 0; $i < ... idiom, but I did it that way that time.
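
If you did want roughly the same worker-pool pattern in PHP, a sketch with pcntl could look like this (it calls the same ./4parser.pl as the Perl example, forks at most $num_workers children, and reaps one before starting the next job):

<?php
// Rough PHP equivalent of the Perl worker pool above, assuming the pcntl extension.
$num_workers = (int) $argv[1];
$dbname      = $argv[2];
$files       = array_slice($argv, 3);

$active = 0;
foreach ($files as $file) {
    if ($active >= $num_workers) {
        pcntl_wait($status);          // pool full: wait for a child to finish
        $active--;
    }
    $pid = pcntl_fork();
    if ($pid === 0) {
        // Child: run the external parser once and exit.
        pcntl_exec('./4parser.pl', array($dbname, $file));
        exit(1);
    }
    $active++;
}

while ($active-- > 0) {
    pcntl_wait($status);              // drain the remaining workers
}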

Autocracy
I would have used fork/exec, and then wait() until one exits. Actually, I would have used Parallel::ForkManager. When I was an OpenMosix cluster admin, some users put it to good use, along with make -j n. These days there's also `xargs -P n` as another way to keep n processes in flight, but those approaches are best for processing a fixed amount of work, not restarting indefinitely.
Peter Cordes