I was working on a fairly long script that seemed to stop running after about 20 minutes. After days of trying to figure out why, I decided to make a very simple script to see how long it would run without any complex code to confuse me. I found that the same thing was happening with this simple infinite loop: at some point between 15 and 25 minutes of running, it simply stopped.

There is no error output at all. The bottom of the browser simply says "Done" and nothing else is processed. I've been over every single thing I could think of: set_time_limit, session.gc_maxlifetime in php.ini, as well as memory_limit and max_execution_time. The point at which the script stops is never consistent. Sometimes it stops at 15 minutes, sometimes 22, sometimes 17... but it's always long enough to be a PITA to test.

Please, any help would be greatly appreciated. It is hosted on a 1and1 server. I contacted them and they couldn't help me, but I have a feeling they just didn't know enough.
At some point your browser times out and stops loading the page. If you want to test this, open up the command line and run the script there. It should run indefinitely.
Do you have access to the error and access logs from the server? They are likely to give you more information on what is happening in this case.
Also - can you reproduce the problem locally? That might give you some more options as far as debugging goes.
Have you considered just running the script from the command line, e.g.:
php script.php
and having the script flush out a message every so often to show that it's still running:
<?php
// Loop forever, printing a heartbeat after each unit of work
while (true) {
    doWork();
    echo "still alive...\n";
    flush();   // push the output out immediately
}
I think what you need to find out is the exact time at which it stops (record an initial timestamp and keep dumping out the current time minus that initial time). Something on the server side is stopping the script. Also, consider doing an ini_get to make sure the execution time really is 0. If you want, set the time limit to 30 and then, on EVERY iteration of your loop, set it to 30 again. Every time you call set_time_limit the counter resets, so this might let you bypass the actual limit. If this still isn't working, there is something on 1and1's servers that might be killing the script.
Also, did you try ignore_user_abort()?
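Something along these lines, as a rough sketch (doWork() stands in for whatever your script actually does, and the 30-second limit is just an arbitrary value to keep resetting):

<?php
ignore_user_abort(true);   // keep going even if the browser disconnects
echo "max_execution_time = " . ini_get('max_execution_time') . "\n";

$start = time();
while (true) {
    set_time_limit(30);    // reset the execution-time counter on every iteration
    doWork();
    echo "elapsed: " . (time() - $start) . "s\n";
    flush();
}

If the elapsed time it prints right before dying is always about the same, that points to a hard limit; if it varies, something is probably killing the process from outside.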
I appreciate everyone's comments. Especially James Hartig's; you were very helpful and sent me on the right path. I still don't know what the problem was. I got it to run on the server over SSH, just by using the exec() command along with ignore_user_abort(). But it would still time out. So I had to break it into small pieces that each run for only about 2 minutes, and use session variables/arrays to store where I left off. I'm glad to be done with this fairly simple project now, and am supremely pissed at 1and1. Oh well...
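For anyone who runs into the same thing, the chunking looked roughly like this (a sketch only; loadItems(), processItem() and the two-minute budget are stand-ins for my actual code):

<?php
session_start();

// Pick up wherever the previous request left off
$offset = isset($_SESSION['offset']) ? $_SESSION['offset'] : 0;
$items  = loadItems();        // placeholder for building the work list
$budget = 120;                // roughly two minutes per request
$start  = time();

for ($i = $offset; $i < count($items); $i++) {
    if (time() - $start > $budget) {
        break;                // out of time, remember where we stopped
    }
    processItem($items[$i]);  // placeholder for the real work
}

$_SESSION['offset'] = $i;

if ($i >= count($items)) {
    unset($_SESSION['offset']);
    echo "All done.";
} else {
    echo "Stopped at item $i, reload to continue.";
}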
This is probably an old post that I am bumping, but the year isn't included in the comment dates.
I use 1and1 as a host and I have a php script that takes a while to run and is constantly flushing out some data to the browser. When I run it locally (within MAMP) it runs fine (it completes the script but it doesn't flush?) but on 1and1's server it stops after only about 3 minutes. Between every flush my variables are reset and reused so I don't think I am running out of memory. Anybody have any new ideas about this?
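The flushing part of my script looks roughly like this (simplified; getNextRow() stands in for my real data source):

<?php
while ($row = getNextRow()) {
    echo "processed row " . $row['id'] . "<br>\n";
    @ob_flush();   // flush PHP's output buffer if one is active
    flush();       // then push the output out to the browser
}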
Thanks
I think this is caused by some process monitor killing off "zombie processes" in order to free up resources for other users.
Run the exec with "2>&1" appended so that everything, including stderr, gets logged.
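In PHP that looks something like this (script.sh is the wrapper from my output below; run.log is just an example log file):

<?php
// Redirect stderr into stdout so messages from the shell itself are captured too
exec('sh script.sh 2>&1', $output, $exitCode);
file_put_contents('run.log', implode("\n", $output) . "\n", FILE_APPEND);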
In my output I managed to catch this:
...
script.sh: line 4: 15932 Killed php5-cli -d max_execution_time=0 -d memory_limit=128M myscript.php
So something (an external force, not PHP itself) is killing my process!
I use IdWebSpace which is excellent BTW but I think most shared hosting providers impose this resource/process control mechanism just to be sane.