views: 60

answers: 2
Does anyone know of a script or program that can be used for backing up multiple websites?

Ideally, I would like to have it set up on a server where the backups will be stored.

I would like to be able to add the website login info, and have it connect and create a zip file or similar that would then be sent back to the remote server to be saved as a backup, etc.

It would also need to be able to be set up as a cron job so it backs up at least once a day.

I can find PC-to-server backup tools that are similar, but no server-to-server remote backup scripts.

It would be heavily used, and it needs a GUI so the less techy can use it too.

Does anyone know of anything similar to what we need?

TIA

A: 
  • HTTrack website mirroring utility.
  • Wget and scripts.
  • rsync and FTP login (or SFTP for security).
  • Git can be used for backup and has security features and networking ability.
  • 7-Zip can be called from the command line to create a ZIP file.

In any case, you will need to implement either secure FTP (SSH-secured) or a password-protected upload form. If you feel clever, you might use WebDAV.
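To make that combination more concrete, here is a minimal Python sketch of the zip-and-push approach (archive the site's files, then copy the archive to the backup server over SSH). The site path, backup host, SSH user, and remote directory are placeholders, not values from any real setup.

```python
#!/usr/bin/env python3
"""Rough sketch: zip a site's files and push the archive to a backup server
over SSH. All hostnames, paths, and the SSH user below are placeholders."""

import datetime
import shutil
import subprocess

SITE_ROOT = "/var/www/example-site"        # directory to back up (placeholder)
BACKUP_HOST = "backup.example.com"         # backup server (placeholder)
BACKUP_USER = "backupuser"                 # SSH user on the backup server (placeholder)
REMOTE_DIR = "/srv/backups/example-site"   # destination directory (placeholder)


def main() -> None:
    # Name the archive after today's date so daily cron runs don't overwrite each other.
    stamp = datetime.date.today().isoformat()
    archive = shutil.make_archive(f"/tmp/example-site-{stamp}", "zip", SITE_ROOT)

    # Copy the archive over SSH; key-based auth avoids storing passwords in the script.
    subprocess.run(
        ["scp", archive, f"{BACKUP_USER}@{BACKUP_HOST}:{REMOTE_DIR}/"],
        check=True,
    )


if __name__ == "__main__":
    main()
```

Run from a daily cron entry, something along these lines would also cover the once-a-day requirement; swapping scp for rsync over SSH would give incremental transfers instead of a full archive each time.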

BobMcGee
Thanks! We are trying to avoid having to put something together ourselves... Something tried and tested that may only need to be modified would be better, if it's out there lol
Indigo
Any service you can find will need significant customization for you to use it in your specific case. The above options all take a lot of the heavy lifting out of the work, and can be easily scripted and automated.
BobMcGee
A: 

Here's what I would do:

  1. Put a backup generator script on each website (outputting a ZIP)
  2. Protect its access with a .htpasswd file
  3. On the backup server, make a cron script download all the backups and store them (see the sketch below)
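As a rough sketch of what step 3 might look like (the URLs, credentials, and storage path below are invented for illustration), a small Python script run from the backup server's cron could fetch each site's ZIP through its .htpasswd-protected URL:

```python
#!/usr/bin/env python3
"""Rough sketch of step 3: a cron script on the backup server that pulls each
site's backup ZIP over HTTP basic auth. URLs and credentials are placeholders."""

import datetime
import pathlib
import urllib.request

# Per-site backup URLs (protected by .htpasswd) and their credentials; all placeholders.
SITES = {
    "site-one": ("https://site-one.example.com/backup.php", "backup", "secret1"),
    "site-two": ("https://site-two.example.com/backup.php", "backup", "secret2"),
}

STORAGE = pathlib.Path("/srv/backups")  # where the archives are kept (placeholder)


def fetch(name: str, url: str, user: str, password: str) -> None:
    # Basic-auth handler matching the .htpasswd protection on the generator script.
    mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    mgr.add_password(None, url, user, password)
    opener = urllib.request.build_opener(urllib.request.HTTPBasicAuthHandler(mgr))

    # Store each download under a per-site folder, named by date.
    stamp = datetime.date.today().isoformat()
    dest = STORAGE / name / f"{stamp}.zip"
    dest.parent.mkdir(parents=True, exist_ok=True)

    with opener.open(url, timeout=300) as resp, open(dest, "wb") as out:
        out.write(resp.read())


def main() -> None:
    for name, (url, user, password) in SITES.items():
        fetch(name, url, user, password)


if __name__ == "__main__":
    main()
```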
Daan