views: 353
answers: 5

FastCGI servers, for example, impose an execution time limit on PHP scripts that cannot be altered with set_time_limit() from within PHP. I believe IIS does this too.

I wrote an import script for a PHP application that works well under mod_php but fails under FastCGI (mod_fcgid) because the script is killed after a certain number of seconds. I don't yet know of a way of detecting what your time limit is in this case, and haven't decided how I'm going to get around it. Doing it in small chunks with redirects seems like one kludge, but how?

What techniques would you use when coding a long-running task such as an import or export task, where an individual PHP script may be terminated by the server after a certain number of seconds?

Please assume you're creating a portable script, so you don't necessarily know whether PHP will eventually be run under mod_php, FastCGI or IIS or whether a maximum execution time is enforced at the server level. That probably also rules out shell-scripts, etc.

+4  A: 

Use the PHP command line interface which is not subject to script time limits imposed by web servers. If you need to automate execution of your script, you can schedule it with cron.
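As an illustration, a crontab entry like the following would run the import nightly via the CLI (the paths and schedule here are assumptions; adjust them for your server):

```
# m h dom mon dow  command
# Run the import every night at 02:30 via the PHP CLI,
# which is not subject to web-server execution limits.
30 2 * * * /usr/bin/php /var/www/app/import.php >> /var/log/import.log 2>&1
```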

Asaph
+1 for using a cron. Long running tasks shouldn't be run using a web script
webdestroya
I don't understand how this can push back the time limit declared in php.ini. Could you be more specific? Thanks
Benoit
@Benoit I was referring to time limits imposed by the server (Apache, IIS, etc) or the server module (FastCGI/fcgid). When running from the command line, you wouldn't have those particular limits, and the only limits would be ones that PHP itself has control over. The downside of running from command line is that it's not so portable - you'd have to set it up differently for Windows servers, or alter it according to the path to your PHP executable, etc. Still, it's one option.
thomasrutter
A: 

Could you spawn faux processes with exec() and the PHP command line?

I'm not sure how you'd measure when to spawn a new one, but might be worth looking into.
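A minimal sketch of that idea: use exec() to launch a detached PHP CLI worker so the web request returns immediately. The script name "import_worker.php", the argument names, and the bare `php` binary path are all assumptions; adjust them for your environment.

```php
<?php
// Build a shell command that runs a PHP CLI script in the background.
// Redirecting output and appending "&" lets exec() return without
// waiting for the worker to finish.
function buildBackgroundCommand(string $script, array $args = []): string
{
    $cmd = 'php ' . escapeshellarg($script);
    foreach ($args as $key => $value) {
        $cmd .= ' ' . escapeshellarg("--$key=$value");
    }
    return $cmd . ' > /dev/null 2>&1 &';
}

// Fire and forget: the worker keeps running after this request ends.
exec(buildBackgroundCommand('import_worker.php', ['batch' => 42]));
```

Note that the worker itself runs under the CLI, so the FastCGI/IIS time limits described in the question do not apply to it.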

alex
+1  A: 

EDIT ----

I just re-read your question. On unix/linux platforms, you can spawn background processes from PHP. The background process itself can be an instance of the PHP CLI, a shell script, or an executable such as ffmpeg. It's described here: Running a Background Process from PHP on Linux

On unix/linux, cron jobs are an alternative option. You can schedule a cron job to start at a specific time, but configuring the job and passing parameters to it is awkward, and you cannot schedule jobs from within PHP itself. Scheduling a job with this wget command (as suggested by some people) is futile:

wget http://website.com/time-consuming-script.php?parameter=whatever

The command should be:

php time-consuming-script.php parameter=whatever

On Windows, you have neither the background process option nor a cron service. However, there are workarounds for starting background processes (the psexec utility), and there are the Task Scheduler service and the at command.
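For example, a scheduled task could be created with something like the following (the task name, script path, and schedule are illustrative, not from the original answer):

```
schtasks /create /tn "NightlyImport" /tr "php C:\app\import.php" /sc daily /st 02:30
```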

Salman A
A: 

What you're really talking about is job queuing: the practice of running PHP code asynchronously from the front-end request. There are two primary ways of doing it in PHP. One is to use a program called Gearman; the other is to use the Zend Server Job Queue, which I personally am more familiar with. I have a blog post on how you can do it called Do you Queue. I have found the implementation I describe there immensely easy to use.
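For the Gearman route, a rough sketch looks like this. It assumes the PECL gearman extension is installed and a gearmand server is running; the "import" function name and payload are illustrative, and this is not the implementation from the blog post mentioned above.

```php
<?php
// Client side (runs inside the web request): submit the job and return
// immediately, leaving the heavy work to a separate worker process.
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
$client->doBackground('import', json_encode(['file' => 'data.csv']));

// Worker side (started from the CLI, so no web-server time limit applies):
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('import', function (GearmanJob $job) {
    $params = json_decode($job->workload(), true);
    // ... perform the long-running import here ...
});
while ($worker->work());
```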

What you might also want to try is to set max_execution_time to 0 prior to executing your logic.
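Concretely, that would be something like the following. Bear in mind the caveat from the question: this only lifts PHP's own limit, not one enforced outside PHP by mod_fcgid or IIS.

```php
<?php
// Lift PHP's own execution limit before starting the long-running work.
set_time_limit(0);             // 0 means no PHP-level time limit
ignore_user_abort(true);       // keep running if the client disconnects
```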

Kevin Schroeder
A: 

Doing it in small chunks with redirects seems like one kludge, but how?

That's exactly how I handled a full forum database backup (phpBB) when the built-in export mechanism started hitting the max_execution_time limit.

I did it one table at a time, and for the big tables in chunks of 5000 rows. (It turned out that the limiting factor in the whole process wasn't the execution time on the export, but the file size that phpMyAdmin could handle on the import.)

After each chunk of exporting, I returned a page with a meta refresh tag in the header, redirecting the script back to itself with the next block's table number and start row in the query string.

<?php if (!$all_done) {
    // Send the browser back to this script with the next table index
    $new_url = $_SERVER['PHP_SELF'] . '?tablecount=' . $count;
    if (!$tabledone && $start_row !== '' && $start_row !== null) {
        // Resume the current table from where the last chunk stopped
        $new_url .= '&startrow=' . $start_row;
    } else {
        // Moving on to a new table: start from the first row
        $new_url .= '&startrow=0';
    }
    echo '<meta http-equiv="refresh" content="0.5;url=' . $new_url . '" />';
} ?>

The counters were so I could iterate through an array of table names that I'd retrieved with SHOW TABLES.
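The chunk selection that goes with those counters can be sketched as a simple LIMIT query builder. The table names and chunk size here are illustrative; the real script would feed the result into the dump file before emitting the refresh tag.

```php
<?php
// Build the SELECT for one chunk of one table, given the table list
// retrieved with SHOW TABLES and the counters from the query string.
function chunkQuery(array $tables, int $count, int $start_row, int $size = 5000): string
{
    return sprintf(
        'SELECT * FROM `%s` LIMIT %d, %d',
        $tables[$count],
        $start_row,
        $size
    );
}

// e.g. the second chunk (rows 5000-9999) of the first table:
echo chunkQuery(['phpbb_posts', 'phpbb_users'], 0, 5000);
```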

Before I had the wits to cull the gigantic word-match table (which phpBB can rebuild by itself) from the export, this back-up script would take over half an hour to complete.

Ed Daniel