I am using a PHP script for some backend tasks (not web), and I have found that sometimes curl_exec terminates the script without outputting any error. I want my script to run in a loop forever. Any idea what causes this?
Make sure that you have error reporting enabled; put this at the top of your PHP file:
ini_set('display_errors', true);
error_reporting(E_ALL);
You can also extend the script's execution time:
ini_set('max_execution_time', 50000);
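As a minimal sketch (the log file path is a placeholder, and the error-log lines are an optional addition rather than part of the advice above), the top of a CLI backend script could combine these settings:
// Minimal sketch of a CLI script header; the log path below is a placeholder.
ini_set('display_errors', true);       // print errors to the terminal when run from the CLI
error_reporting(E_ALL);                // report every error level, including notices
ini_set('max_execution_time', 50000);  // allow a long run

// Optional: also write errors to a file so they survive after the terminal output scrolls away.
ini_set('log_errors', true);
ini_set('error_log', '/var/log/my-backend-script.log');  // placeholder path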
That could easily be a timeout problem.
You can set the curl timeout (for fetching a website) e.g. to 60 seconds via
curl_setopt($ch, CURLOPT_TIMEOUT, 60);
Then you need to set the PHP timeout to something higher than 60 seconds, e.g.
set_time_limit(90);
This limit is measured in real time on Windows but in CPU time on Unix, so on Unix you typically need a much smaller value (time spent waiting on the network does not count). It is important to call set_time_limit() inside your loop: each call resets the timer, whereas a single limit set outside the loop applies to the whole run and would never be enough for an endless loop.
do {
    set_time_limit(90);
    // curl stuff
} while (true);
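Putting the two timeouts together, a rough sketch of the loop might look like the following (the URL, sleep interval, and processing are placeholders, not part of the original answer):
// Rough sketch of the 60s curl / 90s PHP timeout pattern; URL and handling are placeholders.
$ch = curl_init('https://example.com/endpoint');  // placeholder endpoint
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // return the body instead of printing it
curl_setopt($ch, CURLOPT_TIMEOUT, 60);            // curl gives up after 60 seconds

do {
    set_time_limit(90);                // reset PHP's limit each pass: 60s for curl plus headroom
    $body = curl_exec($ch);

    if ($body === false) {
        // Log the failure and keep looping instead of dying silently.
        error_log('curl failed: ' . curl_error($ch));
    } else {
        // ... process $body ...
    }

    sleep(5);                          // placeholder pause so the remote host is not hammered
} while (true);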
In your PHP code, you can set the CURLOPT_VERBOSE option to trace the issue:
curl_setopt($curl, CURLOPT_VERBOSE, TRUE);
This then logs to STDERR, or to a file handle passed via CURLOPT_STDERR (it expects an open stream, not a path string):
curl_setopt($curl, CURLOPT_STDERR, fopen('./path/to/file.log', 'w'));
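Putting that together, a minimal sketch of verbose tracing to a file could look like this (the endpoint and log path are placeholders):
// Minimal sketch of routing curl's verbose trace to a file; endpoint and path are placeholders.
$curl = curl_init('https://example.com/endpoint');

$trace = fopen('./curl-trace.log', 'a');          // CURLOPT_STDERR wants an open stream, not a path string
if ($trace === false) {
    die('could not open trace log');
}

curl_setopt($curl, CURLOPT_VERBOSE, true);
curl_setopt($curl, CURLOPT_STDERR, $trace);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);

curl_exec($curl);

curl_close($curl);
fclose($trace);                                   // closing the handle flushes the trace to disk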
From the command line, you can use the following switches:
* --verbose to report more info to the command line
* --trace <file> or --trace-ascii <file> to trace to a file
* --trace-time to prepend time stamps to verbose/trace output
If you want the process to start and run indefinitely, you need to call:
set_time_limit(0);
Otherwise, as soon as your script has run for 30 seconds (or whatever max_execution_time specifies in php.ini), the script will exit with an error. This interrupts anything the script might be doing, e.g. waiting for curl_exec() to return.
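As a short sketch (the loop body and pause are placeholders), this differs from the per-iteration approach above in that the limit is removed once at startup:
// Short sketch: remove the limit once at startup instead of resetting it every iteration.
set_time_limit(0);   // 0 disables PHP's execution time limit entirely

while (true) {
    // ... curl_exec() and processing here; still check its return value so a
    //     failure is logged instead of silently ending the workflow ...
    sleep(10);       // placeholder pause between iterations
}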
A caution about using PHP to write a daemon (which is essentially what you seem to be doing) can be found here.