views: 35
answers: 1
I have a script that runs on the command line, called by a crontab. In the last five attempts to run this script, it has died partway through an echo statement - the cron output shows only part of the intended echo output, and nothing after that point is executed. It is a long-running script, run through php-cli, that performs file management tasks.

Is there anything that might cause a script to die during an echo statement, without generating any other output, or a way to troubleshoot or catch potential errors during echo?

I am not sure what code I can post that will help, as this is a rather comprehensive script involving a few libraries. The echo statements are fairly simple - echo('Checking file...') might appear in the log as "Che", with no further output.

A: 

First, I would enable error logging. Second, since it's PHP, it may be that the echoed string is built from a function call, and that function exits with a fatal error (common with PHP I/O operations), so the whole script just dies. Turn on error logging and see which line causes the headaches.

Without seeing any code, this is my best shot.
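A minimal sketch of what I mean, assuming PHP 7+ and a standalone CLI script (the log path and the shutdown handler are illustrative, not from your code):

    <?php
    // Report and log everything; in CLI, fatals can otherwise vanish silently.
    error_reporting(E_ALL);
    ini_set('display_errors', '1');               // print errors to the console
    ini_set('log_errors', '1');
    ini_set('error_log', '/tmp/cron-script.log'); // placeholder path

    // Fatal errors bypass try/catch, but a shutdown handler still runs,
    // so the last error can be written out before the process exits.
    register_shutdown_function(function () {
        $err = error_get_last();
        if ($err !== null && in_array($err['type'], [E_ERROR, E_PARSE, E_CORE_ERROR, E_COMPILE_ERROR], true)) {
            error_log(sprintf('FATAL: %s in %s:%d', $err['message'], $err['file'], $err['line']));
        }
    });

    echo 'Checking file...';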

methode
I do have error logging enabled, E_ALL, but I don't get any error messages from PHP. I also have stdout and stderr piped to a logfile, but both just show the partial echo statement.
Wige
And what does the line where the script dies look like?
methode
echo "Checking zip file..."; - gives the output 'Che'
Wige
Then it's an I/O failure or insufficient memory; at least, those are my best guesses. Would you try surrounding your I/O operations and resource-heavy operations in try/catch statements, and maybe even setting your own error handler (see the sketch below)?
methode
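A rough sketch of that suggestion, assuming PHP 7+ (the checkZipFile function, the paths, and the handler are placeholders for illustration, not taken from the original script):

    <?php
    // Convert recoverable errors (warnings, notices) into exceptions
    // so the surrounding try/catch actually sees them.
    set_error_handler(function ($severity, $message, $file, $line) {
        throw new ErrorException($message, 0, $severity, $file, $line);
    });

    function checkZipFile(string $path): void    // placeholder for the real check
    {
        echo "Checking zip file...";
        // If memory is the suspect, logging usage before heavy steps can help.
        error_log('mem before open: ' . memory_get_usage(true));
        $handle = fopen($path, 'rb');            // I/O call that might fail
        if ($handle === false) {
            throw new RuntimeException("Could not open $path");
        }
        fclose($handle);
        echo " done\n";
    }

    try {
        checkZipFile('/path/to/archive.zip');    // placeholder path
    } catch (Throwable $e) {
        error_log('Caught: ' . $e->getMessage());
    }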
I do; in addition to the function-specific try/catch structures, I also have a try/catch around the entire logic of the script, and it does not get triggered when this problem occurs.
Wige