I have created an API that is used on several sites. The client sites call the server with either file_get_contents or curl, accessing a URL of ours like "http://www.myserver.com/api/record.php?some=thing". The client makes the call and then waits for the API to respond before processing the rest of the page. However, some of the time the client doesn't need any output from the API at all.

I'm wondering if there's a way for the server to output something so that the client doesn't wait while the server process finishes its work. Something like a 'page done' signal, so the client stops waiting for the page to load even though the server script keeps running for a second while it inserts database entries.

I've read that the process control functions are for CLI use only and should not be used in a web server environment, so spawning a new process is out of the question.

Any suggestions? I know making the call asynchronously with JavaScript would work, but we've found PHP accesses to be much more reliable than JS.

Thanks so much!
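For reference, the blocking client-side call described above looks roughly like this (the URL is the one from the question; the 5-second timeout is an illustrative assumption):

```php
<?php
// Build a stream context with an explicit read timeout (illustrative value).
$ctx = stream_context_create(['http' => ['timeout' => 5]]);

// Execution blocks here until the API responds, errors out, or times out;
// only then does the rest of the client page get processed.
$result = @file_get_contents('http://www.myserver.com/api/record.php?some=thing', false, $ctx);
```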

A: 

You can do it like this:

ob_start();

// create and echo any output to be sent to the browser

// get the size of the output
$size = ob_get_length();

// send headers to tell the browser to close the connection
header("Content-Length: $size");
header('Connection: close');

// flush the buffer and send everything to the client
ob_end_flush();
flush();

// do any post-processing here
timdev
A quick test shows that neither the browser nor Apache will close the connection and stop loading based on these headers; they keep waiting until the script finishes, so presumably curl and file_get_contents will also wait until the connection is actually closed.
mvds
P.S. The downvote was not mine; I think the idea could/should work.
mvds
It was mine. `Connection: close` only tells the client that the connection will be closed after the response is sent (the alternative being keep-alive, which keeps the connection open in case the client wants to send another request).
Artefacto
I could have sworn this worked (due to the explicit content-length header), but you're right, it doesn't.
timdev
Darn, I was looking for an easy solution like this. I'm gonna try the others and report back in a couple days. Thanks for everyone's input!
Shane N
A: 

edit: thinking again, you'd get no benefit at all from making an async request. Just make a normal request with fsockopen and close the socket after fwrite()ing the headers.

You can just make a HEAD request to signal that you don't want the body of the response – this will save a few round trips.

In principle, you could make an async request. After you write the headers, you'd have to make sure they were actually sent. At that point, you could abort the connection and rely on ignore_user_abort() server-side. (I'm not sure this will work with all web servers, though; it's possible the request would never reach PHP after such an early user abort.)
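A rough sketch of the fsockopen approach under those assumptions. The function names and the example host are mine, not part of the original answer, and the API script must call ignore_user_abort(true) to survive the dropped connection:

```php
<?php
// Build the raw HEAD request described above (HEAD asks the server to omit
// the response body). Helper name is illustrative.
function build_head_request(string $url): string {
    $p = parse_url($url);
    $path = ($p['path'] ?? '/') . (isset($p['query']) ? '?' . $p['query'] : '');
    return "HEAD $path HTTP/1.1\r\n"
         . "Host: {$p['host']}\r\n"
         . "Connection: close\r\n\r\n";
}

// Fire and forget: open the socket, write the headers, then drop the
// connection without reading anything back.
function fire_and_forget(string $url): void {
    $p  = parse_url($url);
    $fp = @fsockopen($p['host'], $p['port'] ?? 80, $errno, $errstr, 1);
    if ($fp === false) {
        return; // could not connect; nothing to clean up
    }
    fwrite($fp, build_head_request($url));
    fclose($fp); // abort immediately; the server keeps running if it ignores aborts
}

// Usage (URL from the question):
// fire_and_forget('http://www.myserver.com/api/record.php?some=thing');
```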

Artefacto
A: 

The connection between your script and your API is the tricky thing: you must terminate it for curl or file_get_contents to return. (You may look into the asynchronous curl_multi_* functions, or do your own socket handling as Artefacto suggested.)
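A hedged sketch of the curl_multi_* option (URL from the question; the timeout value is illustrative): instead of blocking on the request, the client drives the transfer in small non-blocking steps while it produces the rest of the page.

```php
<?php
// Start the API request without blocking on it.
$mh = curl_multi_init();
$ch = curl_init('http://www.myserver.com/api/record.php?some=thing');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // keep the API response off the page
curl_setopt($ch, CURLOPT_TIMEOUT, 2);           // illustrative safety timeout
curl_multi_add_handle($mh, $ch);

$running = 0;
do {
    curl_multi_exec($mh, $running); // non-blocking: advances the transfer one step
    // ... render the next chunk of the page here ...
    curl_multi_select($mh, 0.1);    // wait briefly for socket activity
} while ($running > 0);

curl_multi_remove_handle($mh, $ch);
curl_multi_close($mh);
```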

One option would be to put the backend handling in a backgrounded process, like so:

$desc = array(); // empty fd descriptor spec: no pipes tied to the child
$proc = proc_open("/bin/sleep 60 &", $desc, $pipes);

where sleep stands in as a proof of concept: it lives happily on after the connection is finished. The downside is that if the child process has any of your file descriptors connected, it will die with the parent, so you must pass the data in through the command line or through $env (the 5th argument to proc_open).
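A self-contained sketch of passing the data through the environment, assuming a POSIX /bin/sh. The shell printf here stands in for the real worker script, and PAYLOAD is a hypothetical variable name:

```php
<?php
// Data the backgrounded worker should receive, passed via the environment
// (5th argument to proc_open) rather than via inherited file descriptors.
$out = tempnam(sys_get_temp_dir(), 'api');
$env = ['PAYLOAD' => json_encode(['some' => 'thing'])];

// The trailing "&" plus redirected output detaches the child, so the shell
// (and with it this request) can finish while the worker keeps running.
$cmd  = sprintf('printf %%s "$PAYLOAD" > %s 2>/dev/null &', escapeshellarg($out));
$proc = proc_open($cmd, [], $pipes, null, $env);
proc_close($proc); // returns as soon as the shell has backgrounded the job

usleep(300000);               // demo only: a real parent would just exit here
echo file_get_contents($out); // prints {"some":"thing"}
```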

mvds