I want to set up an automated backup via PHP so that, over HTTP/S, I can "POST" a zip file request to another server and send over a large .zip file. Basically, I want to back up an entire site (and its database) and have a cron job periodically transmit the file over HTTP/S, something like

wget "http://www.thissite.com/cron_backup.php?dest=www.othersite.com&file=backup.zip"

The appropriate authentication/security can be added afterwards....

I prefer HTTP/S because this other site has limited use of FTP and is on a Windows box, so I figure the surest way to communicate with it is via HTTP/S. The other end would have a corresponding PHP script that would store the file.
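For reference, I imagine that receiving script could be as simple as something like the sketch below (the field name and target path are placeholders, and php.ini on that box would need upload_max_filesize/post_max_size larger than the backup):

    <?php
    // store_backup.php -- hypothetical receiving script; names/paths are placeholders.
    // php.ini on this box must allow uploads at least as large as the backup
    // (upload_max_filesize and post_max_size).
    $targetDir = 'C:\\backups';
    if (!isset($_FILES['backup']) || $_FILES['backup']['error'] !== UPLOAD_ERR_OK) {
        header('HTTP/1.1 400 Bad Request');
        exit('upload failed');
    }
    $dest = $targetDir . '\\' . basename($_FILES['backup']['name']);
    if (!move_uploaded_file($_FILES['backup']['tmp_name'], $dest)) {
        header('HTTP/1.1 500 Internal Server Error');
        exit('could not store file');
    }
    echo 'stored ' . $dest;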

This process needs to be completely programmatic (i.e. Flash uploaders will not work, as those need a browser; this script will run from a shell session).

Are there any generalized PHP libraries or functions that help with this sort of thing? I'm aware of the PHP script timeout issues, but I can typically alter php.ini to minimize this.

A: 

I'd personally not use wget, and instead just run the script from the shell directly for this.

Your PHP script would be called from cron like this: /usr/local/bin/php /your/script/location.php args here if you want

This way you don't have to worry about yet another program (wget) to handle things. If your settings are the same at each run, just put them in a config file or directly into the PHP script.
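For example, the script can read those arguments from $argv when run from the CLI (a rough sketch; the argument names and defaults are just illustrative):

    <?php
    // cron_backup.php -- rough sketch of reading CLI arguments.
    // Called as: /usr/local/bin/php /your/script/location.php www.othersite.com backup.zip
    $dest = isset($argv[1]) ? $argv[1] : 'www.othersite.com'; // defaults could also
    $file = isset($argv[2]) ? $argv[2] : 'backup.zip';        // live in a config file
    echo "Sending $file to $dest\n";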

The timeout can be handled by this; it makes the PHP script run for an unlimited amount of time:

set_time_limit(0);

Not sure what libraries you're using, but look into cURL to do the POST; it should work fine.
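Something along these lines should do it (a rough sketch; the URL and field name are placeholders, and CURLFile needs PHP 5.5+, older versions used the '@/path/to/file' syntax instead):

    <?php
    // Rough sketch of POSTing the zip with cURL (URL and field name are placeholders).
    set_time_limit(0); // let the transfer run as long as it needs

    $backup = '/path/to/backup.zip';
    $ch = curl_init('https://www.othersite.com/store_backup.php');
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    // CURLFile requires PHP 5.5+; older versions used '@' . $backup instead
    curl_setopt($ch, CURLOPT_POSTFIELDS, array(
        'backup' => new CURLFile($backup, 'application/zip', 'backup.zip'),
    ));

    $response = curl_exec($ch);
    if ($response === false) {
        fwrite(STDERR, 'Upload failed: ' . curl_error($ch) . "\n");
        exit(1);
    }
    curl_close($ch);
    echo $response, "\n";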

I think the biggest issues that would come up would be more server-related and less PHP/script-related, i.e. make sure you have the bandwidth for it, and that your PHP script CAN connect to an outside server.

Viper_Sb
A: 

If it's at all possible I'd stay away from doing large transfers over HTTP. FTP is far from ideal too, but for very different reasons.

Yes, it is possible to do this via FTP, HTTP and HTTPS using cURL, but this does not really solve any of the problems. HTTP is optimized around sending relatively small files in relatively short periods of time; when you stray away from that you end up undermining a lot of the optimization that's applied to webservers (e.g. if you've got a setting for MaxRequestsPerChild you could be artificially extending the life of processes which should have stopped, and there's the interaction between LimitRequest* settings and max_file_size, not to mention various timeouts and the other limit settings in Apache).
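Just to illustrate how many knobs are involved (these are the relevant Apache and php.ini directives; the values are purely illustrative, not recommendations):

    # Apache (httpd.conf) -- illustrative only
    MaxRequestsPerChild 1000
    LimitRequestBody    67108864   # 64MB cap on request bodies
    Timeout             600

    ; php.ini on the receiving side -- illustrative only
    upload_max_filesize = 64M
    post_max_size       = 64M
    max_execution_time  = 0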

A far more sensible solution is to use rsync over ssh for content/code backups and the appropriate database replication method for the DBMS you are using, e.g. MySQL replication.

symcbean
I suspected as much, and while I agree in principle about avoiding HTTP for large file transfers when you have control over both ends of the environment, rsync over ssh is not applicable here because of the Windows box receiving the file. Configuration-wise I have limited control over an internal Windows corporate server, yet I still need to get a large 30MB file over there. It's likely that IIS connection limits will come into play, but I have also seen this done on other sites, and on a fast connection it's typically not too bad.