Hi all,
Objective: My script downloads a remote file upon form submission. Since the file might be big, I would like to fork off a process and let the user go on with their life.
Example of a command:
wget -q --limit-rate=1k --tries=10 "http://helios.gsfc.nasa.gov/image_euv_press.jpg" -O /web/assets/content/image/image_euv_press.jpg
Method tried:
pcntl forking:
$pid = pcntl_fork();
if ($pid == -1) {
    exit;
} else if ($pid) {
    // We are the parent process; $pid is the child's PID, right?
    return $pid;
} else {
    // We are the child process
    exec($command . ' > /dev/null'); // > /dev/null &
    posix_kill(getmypid(), 9);
    return;
}
I do get the PID, but there is a risk that the forked process becomes a zombie. Since I am using nginx -> php-fpm (tested and confirmed: after running the script several times I see a lot of defunct php-fpm processes), I would have to restart the server just to get rid of the zombies. Could that leave me open to a PID-exhaustion attack? (I am guessing.)
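From what I understand, the children turn into zombies because the parent never waits on them. A minimal sketch of reaping finished children without blocking (assuming the pcntl extension is available under php-fpm; the signal-handler idea is my assumption, not something in my current code):

declare(ticks=1);

// Reap any exited children so they don't linger as <defunct> zombies.
// WNOHANG keeps the wait non-blocking.
pcntl_signal(SIGCHLD, function () {
    while (pcntl_waitpid(-1, $status, WNOHANG) > 0) {
        // child reaped, nothing else to do
    }
});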
Background process:
exec($command . ' > /dev/null &');            // background process
$proc = passthru("ps aux | grep '$command'"); // search for the command
echo var_dump($proc);
$pid = (int) next(explode(' ', $proc));       // parse the PID (not implemented)
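One thing I noticed while writing this up: passthru() echoes the command's output and returns null, so $proc above never holds anything parseable. A sketch of what I imagine the parsing would look like with exec() instead (the 'wget' grep pattern is just an assumption for illustration):

// Capture ps output into an array instead of echoing it.
exec("ps aux | grep 'wget' | grep -v grep", $lines);
foreach ($lines as $line) {
    $cols = preg_split('/\s+/', trim($line));
    $pid  = (int) $cols[1]; // PID is the second column of `ps aux`
}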
Question: The background-process method works, but it's not clean. Is there a better way to fork off a process for the download and get that wget command's PID so I can kill it later?
I have tried echoing $! right after the exec to get the PID, but exec('echo $!') doesn't return anything; I think it's because every exec() call runs in its own shell.

I also added '> /dev/null 2>/dev/null &' to the end of the command; in my terminal that prints something like [3] 30751, but through PHP's exec() there is no way to capture that job-control output.
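If it helps clarify what I'm after: since each exec() spawns its own shell, I assume the only way for $! to survive is to echo it from the same invocation. An untested sketch of that idea:

// Background the command and echo $! from the SAME shell,
// then capture that line via exec()'s output array.
exec($command . ' > /dev/null 2>&1 & echo $!', $output);
$pid = (int) $output[0]; // PID of the backgrounded wget
// later, to stop the download:
// posix_kill($pid, 15); // 15 = SIGTERM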
Thank you.