I'm using Ubuntu on the server and PuTTY to access it. I want to create cron jobs for my PHP site. How can I do this?

+1  A: 

I'm assuming you want to back up your PHP site? Edit the crontab using:

crontab -e

This will start up an instance of vi in which you can edit the crontab; press i for insert mode. You then add a line giving the schedule on which the entry should run and the command to run at that time, e.g.:

30 10 * * * tar -zcvf ./myphpsite.tar.gz /var/www/phpsite

So the command above will tar and gzip your PHP site in /var/www/phpsite at 10:30 AM every day. Press Esc, then save and quit vi with :wq
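One caveat: cron runs jobs from the user's home directory, not from wherever you edited the crontab, so the relative path ./myphpsite.tar.gz will end up there. A slightly more robust sketch using absolute paths and a log file (the backup directory below is only an example):

30 10 * * * /bin/tar -zcf /home/youruser/backups/myphpsite.tar.gz /var/www/phpsite >> /home/youruser/backups/backup.log 2>&1

You can check the installed entries afterwards with crontab -l.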

See this for further reference:

http://www.adminschoice.com/docs/crontab.htm

Jon
+2  A: 

If you mean that you want your PHP site to perform some regular tasks, there are two possible approaches.

1) You use cron to request a certain page regularly. You can do this with a text-based browser, e.g. lynx. You call your script like this:

* * * * * /usr/bin/lynx -dump http://yourhost.com/cron.php > /dev/null

(This will call it every minute; that way you can build your own schedule inside your application.)
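Incidentally, wget can do the same job as lynx here (and, per the benchmark further down, faster); a sketch, assuming wget lives at /usr/bin/wget:

* * * * * /usr/bin/wget -q -O /dev/null http://yourhost.com/cron.php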

2) You call your script with the command-line PHP interpreter:

* * * * * /usr/bin/php /path/to/cron.php > /dev/null

Generally, solution two is better, but it requires shell access to the box. The cron job in solution one can be triggered from a different host, which helps if you cannot install cron jobs on the web host itself.
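While debugging, it also helps to capture the script's output somewhere instead of discarding it; a minimal sketch (the log path is only an example):

* * * * * /usr/bin/php /path/to/cron.php >> /tmp/cron-php.log 2>&1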

Also beware of a common pitfall when using the command-line version of PHP: on Debian (and potentially other systems) there may be separate php.ini files for CGI, CLI and mod_php. If you have customized your configuration, make sure the command-line PHP is using the correct one. You can check this with:

/usr/bin/php -i | less
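To jump straight to the relevant lines, you can filter the output; php -i reports the loaded ini under "Loaded Configuration File":

/usr/bin/php -i | grep "Configuration File"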

In response to the comment by dimo I ran some benchmarks. I called a simple local PHP script (which just echoes "test") 1000 times with lynx, wget and php-cli:

kbsilver:temp kbeyer$ time . wget.sh

real    0m14.223s
user    0m2.906s
sys     0m6.335s

(Command: wget -O /dev/null "localhost/test.php" 2> /dev/null)

kbsilver:temp kbeyer$ time . lynx.sh

real    0m26.511s
user    0m5.789s
sys     0m9.467s

(Command: lynx -dump "localhost/test.php" > /dev/null)

kbsilver:temp kbeyer$ time . php_cli.sh

real    0m54.617s
user    0m28.704s
sys     0m18.403s

(Command: /opt/local/bin/php /www/htdocs/test.php > /dev/null)
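For context, the .sh scripts above presumably just loop over the command shown; the exact scripts aren't included, but a minimal sketch of wget.sh would be:

# call the test page 1000 times, discarding output and errors
i=0
while [ $i -lt 1000 ]; do
    wget -O /dev/null "localhost/test.php" 2> /dev/null
    i=$((i+1))
done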

The server is lighttpd running PHP over FastCGI with APC, on Mac OS X.

It turns out that wget is indeed the fastest tool for the job.

The result for php-cli is not that surprising, as the other methods talk to an already-running PHP process with an opcode cache, while php-cli has to start up and compile the script on every call.

So the only real advantage of using php-cli is security: you can put the script outside the docroot, so it is not reachable from outside.

(This test is obviously not 100% rigorous, but the differences are clear enough in my opinion.)

kbeyer
I imagine wget is a better tool for this than lynx -dump
dimo414