views: 49 · answers: 2
I'm trying to make a sort of PHP bot. The idea is to have two PHP files, named a.php and b.php. a.php does something, sleeps 30 seconds, then calls b.php; b.php ends the HTTP request, does some processing, then calls a.php, which ends its HTTP request, and so on.

The only problem now is how to end the HTTP request, which is made using cURL. I've tried the code below:

<?php
ob_end_clean();
header("Connection: close");
ignore_user_abort(true); // keep running after the client disconnects
ob_start();
echo 'Text the user will see';
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush();     // Will not work
flush();            // unless both are called!

// At this point, the browser should have closed its connection to the web server

// Do processing here
echo 'Text user will never see';

The slight problem is that it doesn't work: I actually see "Text user will never see". I've tried cron jobs and the like, but my host doesn't allow them. I can't raise the script timeout limit either, so my only option is these repeating PHP scripts. So how would I end the HTTP request?

+1  A: 

Based on the new understanding of your problem: you are creating a system that checks a remote URL every 30 seconds to monitor a fragment of content. For this I recommend cron, which can be either server-based (http://en.wikipedia.org/wiki/Cron) or web-based if your host does not permit it (http://www.webbasedcron.com/, for example).
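Either way, the cron target can collapse into a single PHP endpoint that scrapes and logs the number. A minimal sketch, where the URL, the regex, and the log path are placeholders rather than details from the question:

```php
<?php
// Hypothetical cron target: fetch the page, pull out the first number,
// append it to a log. URL, pattern, and log file are illustrative only.
function extract_number(string $html): ?string
{
    // first run of digits; tighten the pattern for the real page
    return preg_match('/\d+/', $html, $m) ? $m[0] : null;
}

$html = @file_get_contents('http://example.com/page'); // or use cURL
if ($html !== false && ($n = extract_number($html)) !== null) {
    file_put_contents('numbers.log', date('c') . " $n\n", FILE_APPEND);
}
```

A web-based cron service then only needs the endpoint's URL and a schedule; no long-running script is involved.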

esryl
Exit closes the script. What I'm trying to do is end the connection between the two scripts, so one script doesn't have to wait for the other to finish, or I'd be creating an infinite loop. I'm not trying to close the script, just end its output and let the browser know to close its connection while the script continues processing. I can't use an infinite while/goto loop either; PHP scripts have a 30-second timeout limit set by my host.
LostInTheCode
I came across a similar problem using Google App Engine, which only has a 30-second window. Transitioning from script to script will probably be counted as the same request. Have you thought about using a meta refresh (http://en.wikipedia.org/wiki/Meta_refresh) to navigate between scripts?
esryl
Well, that requires some sort of client to be running, reloading the page every now and then. I'm trying to create an automated system that logs a number contained on a site every 30 seconds, entirely from my server.
LostInTheCode
So you're scraping a number (and storing it?) every 30 seconds. You need a cron job, either on your server or via a web-based service.
esryl
That's exactly what I need. A web based cron service. Thanks for the help.
LostInTheCode
A: 

PHP scripts in this case run in the context of a web server request, so you can't stop talking to the web connection and then continue doing stuff, which is what I think you're attempting to do with the connection close.

The reason you're seeing the output at the end is that PHP performs an implicit flush when a script ends (see ob_implicit_flush in the manual); the connection to the browser is only closed by ending the PHP script.
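For completeness: when PHP runs under PHP-FPM there is a supported way to finish the response and keep working, fastcgi_finish_request(); elsewhere, the header/flush trick is the usual fallback. A hedged sketch (the helper name is mine, and the fallback is not guaranteed to work on every server stack, since proxies or output compression can keep the connection open):

```php
<?php
// Finish the client's response early, then keep processing.
// fastcgi_finish_request() only exists under PHP-FPM; the fallback
// promises an exact Content-Length so a well-behaved client stops reading.
function finish_request_early(string $body): void
{
    if (function_exists('fastcgi_finish_request')) {
        echo $body;
        fastcgi_finish_request();   // flushes and closes the connection
        return;
    }
    ignore_user_abort(true);        // keep running if the client leaves
    ob_start();
    echo $body;
    header('Connection: close');
    header('Content-Length: ' . ob_get_length());
    ob_end_flush();
    flush();
}

finish_request_early("Text the user will see\n");
// ...long-running processing continues here, invisible to the client...
```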

Ways around this:

You might be able to use set_time_limit to extend the execution limit. DO NOT USE ZERO. It's tempting to say "take all the time you need" in a post-processing script, but that way lies madness and bitter sysadmins; remember, too, that you're still running against cURL's timeout stopwatch (though you can extend that with an option). set_time_limit(5) gives you five more seconds from the moment it is called, so calling it periodically will let you finish your post-processing while - if you're careful - still protecting you from infinite loops. Infinite loops with no execution limit, in the context of Apache requests, are also likely to make you unpopular with your sysadmin.
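The periodic-extension idea above can be sketched like this; the loop body is a stand-in for real work, not code from the question:

```php
<?php
// Extend the limit in small steps so a runaway loop still gets killed.
set_time_limit(5);                  // five seconds from this call onward

$processed = 0;
foreach ([1, 2, 3] as $item) {      // stand-in for real work items
    $processed += $item;            // stand-in for real per-item processing
    set_time_limit(5);              // grant five more seconds per item
}
echo $processed, "\n";
```

Each call resets the clock relative to "now", so a single stuck iteration still hits the five-second ceiling.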

It might be possible to build a shell script in your application, save it to disk, execute it in the background, and have it delete itself afterwards. That way it runs outside the web-request context, and if the script still exists when you next do the request, you know the other processing is still happening. Be really careful about jobs that might take longer than the gap between executions, as that way lies sorrow and more bitter sysadmins. This course of action would get you thrown off my hosting environment if you did it without talking to me about it first, though, as it's a terrible hack with a myriad of possible security issues.
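A sketch of the self-deleting background worker described above; the paths and the worker body are illustrative, and it inherits all the caveats (and security worries) just mentioned:

```php
<?php
// Write a one-shot worker script, launch it detached from the request,
// and let it remove itself when done. The worker body is a placeholder.
$worker = sys_get_temp_dir() . '/worker_' . uniqid() . '.php';
file_put_contents($worker, <<<'PHP'
<?php
// ...do the slow post-processing here...
unlink(__FILE__);   // self-delete when finished
PHP);

// nohup + & detaches the child from the web request; output is discarded.
exec('nohup php ' . escapeshellarg($worker) . ' > /dev/null 2>&1 &');

// If this file still exists on the next request, the worker is still busy.
echo file_exists($worker) ? "worker queued\n" : "worker already finished\n";
```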

But you appear to be trying to run a regular batch process on a system whose owners don't want you to do that - or they'd have given you access to cron - so your best and most reliable option is to find a host that actually supports what you're trying to do.

Aquarion
Great set of ideas. I'll be sure to look into all of them.
LostInTheCode