views: 411
answers: 3

Hi all,

Objective: My script downloads a remote file upon form submission. Since the file might be big, I would like to fork off a process and let the user go on with his/her life.

Example of a command:

wget -q --limit-rate=1k --tries=10 "http://helios.gsfc.nasa.gov/image_euv_press.jpg" -O /web/assets/content/image/image_euv_press.jpg

Methods tried:

  1. pcntl forking:

        $pid = pcntl_fork();

        if ($pid == -1) {
            // fork failed
            exit;
        } else if ($pid) {
            // We are the parent process; $pid is the child's PID, right?
            return $pid;
        } else {
            // We are the child process
            exec($command . ' > /dev/null');
            posix_kill(getmypid(), 9);
            return;
        }


I do get the PID, but there is a risk that the forked process becomes a zombie. Since I am using nginx -> php-fpm (tested and confirmed: after running this several times there were a lot of defunct php-fpm processes), I would have to restart the server just to eliminate the zombies. I am guessing this would also leave me open to a PID-exhaustion attack.
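An alternative worth noting (my suggestion, not from the original post): the zombie problem can be sidestepped at the shell level. If the command is backgrounded inside a `( ... )` subshell, the intermediate shell exits immediately, the worker is reparented to init, and php-fpm never has a defunct child to reap. A minimal sketch, with `sleep` standing in for the wget download:

```shell
#!/bin/sh
# Background the worker inside a subshell. The subshell exits right
# away, so the worker is reparented to init and can never become a
# zombie of the process that launched it.
# 'sleep 5' is a stand-in for the real wget command.
pid=$( ( sleep 5 > /dev/null 2>&1 & echo $! ) )
echo "worker pid: $pid"
kill "$pid"   # the captured PID stays valid, so the job can be cancelled later
```

From PHP the same pattern would be `exec('(' . $command . ' > /dev/null 2>&1 & echo $!)', $output);`, with the PID in `$output[0]`.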

  2. Background process:

        exec($command . ' > /dev/null &'); // background process
        $proc = passthru("ps aux | grep '$command'"); // search for the command
        var_dump($proc);
        $pid = (int) next(explode(' ', $proc)); // parse the PID (not implemented)


Question: the background-process method works, but it's not clean. Is there a better way to fork off a process to run the download and get that wget command's PID so I can kill it later?

I have tried echoing $! right after doing the exec just to get the PID, but exec('echo $!') doesn't return anything; I think it's because every exec runs in a different "space".

When I add '> /dev/null 2>/dev/null &' to the end of the command in my terminal, it returns something like [3] 30751, but through PHP's exec there is no way to capture that returned PID.
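For what it's worth, the "different space" guess is right: $! is a per-shell variable holding the PID of that shell's most recent background job, and each exec() call spawns a fresh shell with no job history. A quick demonstration at the shell level:

```shell
#!/bin/sh
# $! is set only in the shell that launched the background job.
sleep 5 &
pid=$!
echo "same shell: $pid"           # the sleep PID
sh -c 'echo "fresh shell: [$!]"'  # a brand-new shell has no $! -> prints []
kill "$pid"
```

That is exactly the situation of a second exec() call: a brand-new shell where $! expands to nothing.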

Thank you.

A: 

Try the following command:

    exec("ps -C $command -o pid=", $pids);
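A caveat worth adding (mine, not part of the original answer): `ps -C` matches on the process *name*, so `$command` here has to be just `wget`, not the full command line with its arguments, and `-o pid=` prints bare PIDs with no header. A shell sketch of the lookup, using `sleep` as the stand-in process name:

```shell
#!/bin/sh
# ps -C matches on the command *name*; -o pid= suppresses the header.
sleep 5 &
expected=$!
found=$(ps -C sleep -o pid=)
echo "PIDs named 'sleep': $found"
kill "$expected"
```

Note this returns every process with that name, so if several downloads run at once you still need a way to tell them apart.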

But I recommend using the Zend Server Job Queue, which exists for exactly this kind of task.

Sagi
thanks! That line works great!
+2  A: 

While not a direct answer, the following link might help you get it done:

As an alternative to PHP's native pcntl functions, consider using Gearman:

Gearman provides a generic application framework to farm out work to other machines or processes that are better suited to do the work. It allows you to do work in parallel, to load balance processing, and to call functions between languages. It can be used in a variety of applications, from high-availability web sites to the transport of database replication events.

Gordon
A: 

Try adding the "echo $!" to the same execution flow as the launched background process.

I.e. something like this:

    shell_exec("$command & echo $!");
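To spell out why this works (a sketch, assuming the command's output is also redirected so the call returns immediately): the backgrounding and the `echo $!` run in the *same* shell invocation, so $! is populated, and sending the command's output to /dev/null stops shell_exec() from blocking on the download. At the shell level PHP ends up running something like this, with `sleep` standing in for wget:

```shell
#!/bin/sh
# What shell_exec("$command > /dev/null 2>&1 & echo $!") boils down to:
pid=$(sh -c 'sleep 5 > /dev/null 2>&1 & echo $!')
echo "captured pid: $pid"
kill "$pid"   # usable later, e.g. to cancel the download
```

On the PHP side that would be `$pid = (int) shell_exec("$command > /dev/null 2>&1 & echo $!");`.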

Paul