views: 630 | answers: 8

I was working on a fairly long script that seemed to stop running after about 20 minutes. After days of trying to figure out why, I decided to write a very simple script to see how long it would run without any complex code to confuse me. I found that the same thing was happening with this simple infinite loop: at some point between 15 and 25 minutes of running, it simply stopped.

There is no error output at all. The bottom of the browser simply says "Done" and nothing else is processed. I've been over every single possible thing I could think of: set_time_limit, session.gc_maxlifetime in the php.ini, as well as memory_limit and max_execution_time.

The point at which the script stops is never consistent. Sometimes it will stop at 15 minutes, sometimes 22 minutes, sometimes 17... but it's always long enough to be a PITA to test.

Please, any help would be greatly appreciated. It is hosted on a 1and1 server. I contacted them and they couldn't help me... but I have a feeling they just didn't know enough.

+2  A: 

At some point your browser times out and stops loading the page. If you want to test, open up the command line and run the code in there. The script should run indefinitely.

James Hartig
That's what I was thinking! Isn't there any way to prevent the browser from doing that? And color me a noob, but my 1and1 hosting package is the cheapest one, which doesn't support SSH... and as far as I know, that's the only way to access PHP via the command line, isn't it?
RobHardgood
Yes, that is basically the only way. Can I ask why you want it to load indefinitely? Most browsers (besides mobile) should be fine as long as you are constantly sending them data.
James Hartig
Well my infinite loop was just for testing. The script I'm actually trying to fix needs to run for about 30 or 40 minutes, but it keeps stopping after 20. I was trying to figure out why with this one. It does echo a short bit of text every few seconds. Is that not the kind of data the browser needs to keep the session open?
RobHardgood
It's hard to tell. The information here is very vague. Your problem might still be with your timeout limit in PHP; are you sure that it is still running after it times out? It could be that your hosting company is preventing you from running a script over 30 seconds. Some hosts do this to prevent CPU/server lockups. Read through http://us.php.net/manual/en/features.connection-handling.php and consider writing to a file when the user closes the connection and making sure that PHP is still running.
James Hartig
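
For reference, a minimal sketch of the file-logging idea in that comment (the log name, interval and doWork() are placeholders, not from the thread):

<?php

// Sketch only: keep running after the browser gives up, and record progress
// to a file so you can tell whether PHP itself is still alive.
ignore_user_abort(true);   // don't stop when the client disconnects
set_time_limit(0);         // no PHP-side execution limit

$log = fopen('progress.log', 'a');   // placeholder file name

while (true) {
  doWork();                          // placeholder for the real job
  fwrite($log, date('H:i:s') . ' still running, aborted=' . connection_aborted() . "\n");
  fflush($log);
  sleep(5);
}
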
Yeah, I've been through all the timeout limits... I contacted 1and1 and they told me that they don't impose any time limits like that. I don't think it is still running after it times out. That's the problem... my original script added data to a database, and once it stopped, no more data showed up in the database. (I eliminated MySQL as the problem already, btw.) What you said about writing to a file and making sure PHP is still running... what exactly would that do? And... how would I do it? (pardon my noobishness here...) Thanks
RobHardgood
"my 1and1 hosting package is the cheapest which doesn't support SSH" ... why the _@#$%_ do people pay for crap like this? You realize you can get a VPS for as little as $10/mo ? Heck, I'll give you a shell account on one of my boxes for $5/mo.
hobodave
Oh, I just read your link and I think I understand what you mean now... Write the output of the script to a file so I can still see if it's running, and tell it to ignore an abort?
RobHardgood
hobodave, because it's only $4 a month with a free domain... It has pretty much everything else I need though :-p
RobHardgood
What you can do is easy: call ignore_user_abort() and you should still see your MySQL database being populated :)
James Hartig
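
A minimal version of that suggestion, for reference (the DSN, credentials, table and query are made-up placeholders):

<?php

// Sketch only: with ignore_user_abort, rows should keep appearing in the
// database even after the browser shows "Done".
ignore_user_abort(true);
set_time_limit(0);

$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');  // placeholder DSN

for ($i = 0; $i < 1000; $i++) {
  $pdo->exec("INSERT INTO progress_log (note) VALUES ('row $i')");   // placeholder table
  sleep(1);   // if rows keep arriving, the script survived the disconnect
}
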
Thanks so much, James! This looks quite promising. It'll take a while to test, but I'll post back here later if it works. :D
RobHardgood
Gah! It didn't work! So it's just stopping itself. On the server. For no reason! wtf -_-
RobHardgood
Why don't you download Apache and set up PHP on your own machine for debugging?
Chad Okere
A: 

Do you have access to the error and access logs from the server? They are likely to give you more information on what is happening in this case.

Also - can you reproduce the problem locally? That might give you some more options as far as debugging goes.

objectified
Nah, I don't have access to the logs. That's one thing that really makes me mad. I made a custom PHP error log CSV file, but no useful PHP errors :-/ I haven't tested it locally since it takes so long to find out. I figure if it does run properly, then it's some obscure server setting I have no idea about, or if it does stop, then it's some obscure browser problem I have no idea about. Either way it doesn't help much. I'm about to try James's advice above, though; that seems useful. Any other ideas in the meantime?
RobHardgood
Well, running it locally would allow you to examine the server logs. Does it occur in only one particular browser? I'm suspecting it doesn't, and I have the feeling your issue is related to memory usage, or maybe something causing PHP to segfault (which you should be able to see in the server logs).
objectified
What you can do is use ini_set in your script to: 1) turn log_errors on, and 2) set error_log to some file in that dir, like error_log = ./php.log. Then run your script, pull up FTP (which I'm assuming you have), and check out the file.
James Hartig
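
A short sketch of that logging setup (the log path is just an example):

<?php

// Sketch only: turn on error logging from inside the script, since the
// host's own logs aren't accessible.
ini_set('log_errors', '1');
ini_set('error_log', './php.log');   // example path, as in the comment above

error_log('script started at ' . date('c'));   // quick check that logging works

// ... rest of the long-running script ...
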
+1  A: 

Have you considered just running the script from the command line, e.g.:

php script.php

and have the script flush out a message every so often to show that it's still running:

<?php

// doWork() is a placeholder for whatever the real script does each iteration.
while (true) {
  doWork();
  echo "still alive...\n";   // newline so each message lands on its own line
  flush();                   // push the output out immediately
}
AndrewMurphy
I'm trying this now... Thanks
RobHardgood
Yeah, this didn't work either
RobHardgood
Try bumping up your max execution time using ini_set: ini_set('max_execution_time', 300); // 5 min
AndrewMurphy
A: 

I think what you need to find out is the exact time at which it stops (you can record an initial time and keep dumping out the current time minus the initial one). There is something on the server side that is stopping the script. Also, consider doing an ini_get to make sure the execution time is actually 0. If you want, set the time limit to 30 and then, in EVERY loop iteration, set it to 30 again. Every time you call set_time_limit, the counter resets, and this might allow you to bypass the actual limits. If this still isn't working, there is something on 1and1's servers that might be killing the script.

Also, did you try the ignore_user_abort?
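
For reference, a rough sketch of that approach (doWork() and the loop body are placeholders):

<?php

// Sketch only: check what the limit really is, then log elapsed time and
// reset the execution limit on every pass.
var_dump(ini_get('max_execution_time'));   // should print "0" if truly unlimited

$start = time();

while (true) {
  set_time_limit(30);                      // restarts the 30-second counter each loop
  doWork();                                // placeholder for the real job
  echo 'elapsed: ' . (time() - $start) . "s\n";
  flush();
}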

James Hartig
A: 

I appreciate everyone's comments. Especially James Hartig's; you were very helpful and sent me on the right path. I still don't know what the problem was. I got it to run on the server without using SSH, just by using the exec() command as well as ignore_user_abort(). But it would still time out. So I just had to break it into small pieces that run for only about 2 minutes each, and use session variables/arrays to store where I left off. I'm glad to be done with this fairly simple project now, and am supremely pissed at 1and1. Oh well...
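
For anyone landing here later, a rough sketch of that chunking idea (the session key, time window and processItem() are made up for illustration):

<?php

// Sketch only: do ~2 minutes of work per request and remember where we
// stopped in the session so the next request can pick up from there.
session_start();
ignore_user_abort(true);

$offset   = isset($_SESSION['offset']) ? $_SESSION['offset'] : 0;  // hypothetical key
$deadline = time() + 110;                 // stay safely under the ~2-minute window

while (time() < $deadline) {
  processItem($offset);                   // placeholder for one unit of work
  $offset++;
}

$_SESSION['offset'] = $offset;            // resume point for the next request
echo "processed up to item $offset";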

RobHardgood
A: 

This is probably an old post that I'm bumping, but the year isn't included in the comment dates.

I use 1and1 as a host, and I have a PHP script that takes a while to run and constantly flushes out some data to the browser. When I run it locally (within MAMP) it runs fine (the script completes, but it doesn't flush?), but on 1and1's server it stops after only about 3 minutes. Between every flush my variables are reset and reused, so I don't think I am running out of memory. Anybody have any new ideas about this?

Thanks

Devin Crossman
A: 

I think this is caused by some process monitor killing off "zombie processes" in order to allow resources for other users.

Run the exec using "2>&1" to log anything including stderr.
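
For example (script and log names are placeholders):

<?php

// Redirect stderr into the same log as stdout so a "Killed" message is captured.
exec('php myscript.php >> run.log 2>&1', $output, $status);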

In my output I managed to catch this:

...
script.sh: line 4: 15932 Killed                  php5-cli -d max_execution_time=0 -d memory_limit=128M myscript.php

So something (an external force, not PHP itself) is killing my process!

I use IdWebSpace, which is excellent BTW, but I think most shared hosting providers impose this kind of resource/process control just to keep things sane.

Hendy Irawan
A: 

I have the same problem. Using 1and1 as well. What a coincidence! ...

m1sfit