Hello,

I have a site where I download data from another site using cURL and then generate an image from it, which is stored on the server and displayed on other websites.
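
For context, each item's update looks roughly like this (the URL, image size, and paths are placeholders):

    $ch = curl_init('http://example.com/data?id=123');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $data = curl_exec($ch);
    curl_close($ch);

    // render the fetched text onto a PNG using GD
    $img = imagecreatetruecolor(200, 50);
    $white = imagecolorallocate($img, 255, 255, 255);
    imagestring($img, 5, 10, 10, $data, $white);
    imagepng($img, '/path/to/images/123.png');
    imagedestroy($img);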

I've got everything working perfectly, except for the cron job.

Whenever I run this script it stops after a while (more than 30 seconds, more like 2-3 minutes; it's hosted on GoDaddy), which is still not enough. I tried multi-threaded cURL, but it still takes more time than I'm given.

So far my workaround has been updating manually by specifying a limit and offset and reloading about 20 pages in the browser every day (I've got about 12,000 items to update). This is very annoying, as you might imagine.

The only solution I can think of is setting up several cron jobs, running at 00:01, 00:02, and so on, each handling about 500 items. But I'm not sure I can run that many cron jobs on GoDaddy.
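
Roughly what I have in mind, with each cron entry handling one slice (update.php, the batch size, and the fetch_items()/generate_image() helpers are placeholders):

    // update.php -- invoked as: php update.php <offset>
    $offset = isset($argv[1]) ? (int)$argv[1] : 0;
    $limit  = 500;
    foreach (fetch_items($offset, $limit) as $item) { // hypothetical data access
        generate_image($item);                        // hypothetical: cURL fetch + GD render
    }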

Can anyone give me some advice on this?

Thanks

A: 

You should set the time limit to 0; the set_time_limit function would help.

Code snippet:

set_time_limit(0);

I am not sure if you are allowed to do that on GoDaddy; you'll have to check. If they are running PHP in safe mode you won't have access to it, and that might be the case.
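
If you want to check from the script itself, something along these lines should tell you (just a sketch; the fallback is up to you):

    if (ini_get('safe_mode')) {
        // set_time_limit() will be silently ignored here
    } else {
        set_time_limit(0); // no execution time limit
    }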

The documentation states:

This function has no effect when PHP is running in safe mode. There is no workaround other than turning off safe mode or changing the time limit in the php.ini.

You seem to be safe, though. From GoDaddy's FAQ:

Do you run PHP in safe mode on your Linux hosting servers? No, we do not run PHP in safe mode on our Linux hosting servers. For more information on safe mode, please visit www.php.net.

RageZ
I am running on Linux shared hosting; I'll try to set the time limit.
Smaug
Doesn't work; it stops after around 400-500 seconds again.
Smaug
A: 
  1. Add logging to your job definition in crontab, redirecting stderr as well so errors are captured. For example:

    1 0 * * *  php /home/user/myscript.php > /var/log/myscript.log 2>&1
    

    This will help you see whether there is a simple explanation in the output.

  2. I also always recommend putting this at the beginning of the script to be sure you see all errors:

    error_reporting(E_ALL);
    ini_set('display_errors', 'On');
    
  3. Put exhaustive logging into your script; it always helps. I would also recommend Zend_Log or a similar library (PEAR_Log, for instance). Also consider structuring your code in a way that gives you more logging opportunities (abstract advice, but still). A bare-bones alternative is sketched below.
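
Even without a logging library, plain error_log() calls give you a trail; a minimal sketch (the log path is an assumption):

    $log = '/home/user/logs/update.log';
    error_log(date('c') . " starting batch\n", 3, $log);  // type 3 = append to the given file
    // ... do the work ...
    error_log(date('c') . " finished batch\n", 3, $log);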
altern
You can't set up a crontab or use most PEAR packages on GoDaddy shared hosting. Also, GoDaddy shared hosting sets PHP error reporting to E_ALL by default.
mattbasta
@mattbasta: Can't you even take specific files from PEAR and put them into a 'classes' folder? I've never used GoDaddy, but I assume it's an average hosting provider with fairly common rules, so I posted suggestions that might help in the average case. I don't know why the answer got voted down.
altern
A: 

If you are using GoDaddy shared hosting, see How to setup Cron Job in GoDaddy shared hosting.

NeoCambell
A: 

You could use a tool like WebCron:

http://barebonescms.com/documentation/webcron/

Basically, it fakes 'cron' on hosts that don't provide access to a task scheduler. All the code resides on your server, so there is no need for a third party. In your case, you could use GoDaddy's interface to run WebCron once every minute. I'd do it that way because it makes changing hosts easier later on.

For your needs, it sounds like you want to break the overall task into smaller tasks that each take less time, and then run the script multiple times. WebCron makes building a state engine fairly easy: load the state, run for 'x' seconds, save the state. Include a state that checks whether the process should start again (e.g. when the next day rolls around), so it doesn't run constantly.
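
A rough sketch of that load/run/save cycle (the state file name, the 50-second budget, and the fetch_items()/generate_image() helpers are assumptions, not part of WebCron):

    $stateFile = dirname(__FILE__) . '/cron_state.json';
    $state = file_exists($stateFile)
        ? json_decode(file_get_contents($stateFile), true)
        : array('offset' => 0, 'last_run' => 0);

    // only begin a new pass once a day
    if ($state['offset'] === 0 && time() - $state['last_run'] < 86400) {
        exit;
    }

    $start = time();
    while (time() - $start < 50) {                    // stay well under the host's limit
        $items = fetch_items($state['offset'], 100);  // hypothetical data access
        if (!$items) {                                // full pass finished
            $state['offset']   = 0;
            $state['last_run'] = time();
            break;
        }
        foreach ($items as $item) {
            generate_image($item);                    // hypothetical: cURL fetch + GD render
        }
        $state['offset'] += count($items);
    }

    file_put_contents($stateFile, json_encode($state));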

WebCron