views: 3236
answers: 9

I'm attempting to do an AJAX call (via jQuery) that will initiate a fairly long process. I'd like the script to simply send a response indicating that the process has started, but jQuery doesn't return the response until the PHP script has finished running.

I've tried this with a "close" header (below), and also with output buffering; neither seems to work. Any guesses? Or is this something I need to do in jQuery?

<?php

echo( "We'll email you as soon as this is done." );
header( "Connection: Close" );

// do some stuff that will take a while

mail( '[email protected]', "okay I'm done", 'Yup, all done.' );

?>
A: 

Did you flush your output buffer with ob_flush(), and it didn't work?

Vinko Vrsalovic
Correct. jQuery still sat there doing nothing until the PHP script had run its course.
Cowboy_X
A: 

You could try to do multithreading.

You could whip up a script that makes a system call (using shell_exec) that invokes the php binary with the script doing your work as its parameter. But I don't think that is the most secure way. Maybe you can tighten things up by chrooting the php process, among other measures.

Alternatively, there's a class at phpclasses that does that: http://www.phpclasses.org/browse/package/3953.html. But I don't know the specifics of the implementation.
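
A minimal sketch of the shell_exec idea, assuming the PHP CLI binary is on the server's PATH and that launcher.php / worker.php are hypothetical file names you would replace with your own:

<?php
// launcher.php (hypothetical name): reply to the browser immediately,
// then hand the long-running work to a separate PHP CLI process.

echo "We'll email you as soon as this is done.";

// escapeshellarg() guards the argument in case it ever comes from user input.
$jobId = escapeshellarg('12345');

// Redirecting output and backgrounding with & lets shell_exec() return
// right away instead of waiting for worker.php to finish.
shell_exec("php worker.php {$jobId} > /dev/null 2>&1 &");

?>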

paan
A: 

Your problem can be solved by doing some parallel programming in PHP. I asked a question about it a few weeks ago here:

http://stackoverflow.com/questions/70855/how-can-one-use-multi-threading-in-php-applications

and got great answers. I liked one in particular very much. The writer made a reference to this tutorial, which can actually solve your problem very well; I have already used it to deal with a similar problem that came up a couple of days ago.
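
The linked discussion isn't reproduced here, but the usual pure-PHP approach to this kind of parallelism is pcntl_fork(). A rough sketch, assuming the pcntl extension is available (it generally is on the CLI, but often not under Apache/mod_php); the worker function and e-mail address below are placeholders:

<?php
// Sketch only: fork, answer the client from the parent, do the slow work
// in the child.

$pid = pcntl_fork();

if ($pid == -1) {
    die('could not fork');
} else if ($pid) {
    // Parent process: respond to the client right away.
    echo "We'll email you as soon as this is done.";
} else {
    // Child process: do the long-running work, then notify.
    do_long_running_work(); // hypothetical worker function
    mail('user@example.com', "okay I'm done", 'Yup, all done.');
    exit(0);
}

?>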

Good luck!!

Steve Obbayi
+2  A: 

Assuming you have a Linux server and root access, try this. It is the simplest solution I have found.

Create a new directory for the following files and give it full permissions. (We can make it more secure later.)

mkdir test
chmod -R 777 test
cd test

Put this in a file called bgping.

echo starting bgping
ping -c 15 www.google.com > dump.txt &
echo ending bgping

Note the &. The ping command will run in the background while the current process moves on to the echo command. It will ping www.google.com 15 times, which will take about 15 seconds.

Make it executable.

chmod 777 bgping

Put this in a file called bgtest.php.

<?php

echo "start bgtest.php\n";
exec('./bgping', $output, $result);
echo "output:".print_r($output,true)."\n";
echo "result:".print_r($result,true)."\n";
echo "end bgtest.php\n";

?>

When you request bgtest.php in your browser, you should get the following response quickly, without waiting about 15 seconds for the ping command to complete.

start bgtest.php
output:Array
(
    [0] => starting bgping
    [1] => ending bgping
)

result:0
end bgtest.php

The ping command should now be running on the server. Instead of the ping command, you could run a PHP script:

php -n -f largejob.php > dump.txt &

Hope this helps!

Liam
A: 

OK, so because of the way jQuery performs the XHR request, even the ob_flush method will not work: you are unable to run a function on each onreadystatechange. jQuery checks the state, then chooses the proper action to take (complete, error, success, timeout). And although I was unable to find a reference, I recall hearing that this does not work with all XHR implementations. A method that I believe should work for you is a cross between ob_flush and forever-frame polling.

<?php
 function wrap($str)
 {
  return "<script>{$str}</script>";
 };

 ob_start(); // begin buffering output
 echo wrap("console.log('test1');");
 ob_flush(); // push current buffer
 flush(); // this flush actually pushed to the browser
 $t = time();
 while($t > (time() - 3)) {} // wait 3 seconds
 echo wrap("console.log('test2');");
?>

<html>
 <body>
  <iframe src="ob.php"></iframe>
 </body>
</html>

And because the scripts are executed inline, as the buffers are flushed, you get execution. To make this useful, change the console.log to a callback method defined in your main script, set up to receive data and act on it. Hope this helps. Cheers, Morgan.

Morgan ARR Allen
A: 

An alternative solution is to add the job to a queue and make a cron script which checks for new jobs and runs them.

I had to do it that way recently to circumvent limits imposed by a shared host: exec() et al. were disabled for PHP run by the web server, but could still be used from a shell script.
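
A minimal sketch of that approach, assuming a flat-file queue and a crontab entry such as: * * * * * php /path/to/worker.php (a real setup would more likely use a database table; all file and script names below are placeholders):

<?php
// enqueue.php (hypothetical name): called by the AJAX request.
// It only records the job and returns immediately.
file_put_contents('/tmp/jobqueue.txt', "job-data\n", FILE_APPEND | LOCK_EX);
echo "We'll email you as soon as this is done.";
?>

<?php
// worker.php (hypothetical name): run every minute by cron.
// Not race-safe; a sketch only.
$queue = '/tmp/jobqueue.txt';
if (!file_exists($queue)) {
    exit;
}
$jobs = file($queue, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
file_put_contents($queue, ''); // clear the queue
foreach ($jobs as $job) {
    // do the long-running work for $job here, then email the user
}
?>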

Ole J. Helgesen
+8  A: 

The following page contains instructions on how to close the connection without ending the PHP script: http://php.net/manual/en/features.connection-handling.php

Supposedly it requires a bit more than sending a close header.

Joeri Sebrechts
yup, this did the trick: http://www.php.net/manual/en/features.connection-handling.php#71172
Cowboy_X
+3  A: 

It's necessary to send these 2 headers:

Connection: close
Content-Length: n (n = the size of the output, in bytes)

Since you need to know the size of your output, you'll need to buffer it, then flush it to the browser:

// buffer all upcoming output
ob_start();
echo "We'll email you as soon as this is done.";

// get the size of the output
$size = ob_get_length();

// send headers to tell the browser to close the connection
header("Content-Length: $size");
header('Connection: close');

// flush all output
ob_end_flush();
ob_flush();
flush();

/******** background process starts here ********/
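
Depending on your PHP configuration, you may also need to call ignore_user_abort() and set_time_limit() before flushing, so the script keeps running once the browser has its response (a common addition to this pattern; check whether your setup requires it):

// Optional: keep PHP running after the client connection is closed,
// and remove the execution time limit for the background work.
ignore_user_abort(true);
set_time_limit(0);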

For more details, visit http://www.zulius.com/how-to/close-browser-connection-continue-execution

Cornelius