views:

73

answers:

4

Hi guys, is there any backup software that can take periodic backups of online website folders and store them offline on a local system? It needs to be robust, and it would be nice if there's something free that can do the job :)


Thanks for the links. I have FTP access and it's my own website, a document-sharing site with user uploads. I would like to maintain a periodic backup, on my local system, of the files uploaded to the website, and I just want to automate this process. My local system is Windows-based, though.

+1  A: 

I think you need to post that question at Server Fault.

Dave Arkell
A: 

If you are referring to a website that you will access from your browser (rather than as the administrator of the site), you should check out WGet; a minimal example follows below. And if you need to use WGet from a Windows system, check out Cygwin.
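For example, a single command like this mirrors a site into the current directory (a sketch; www.example.com is a placeholder for your own URL):

wget -m http://www.example.com/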

nik
A: 

If you don't have shell access at the site, you can use wget:

#!/bin/bash
# Timestamped backup directory name in UTC, e.g. 20240101T120000Z
BCKDIR=$(date -u +"%Y%m%dT%H%M%SZ")
wget -m -np -P "$BCKDIR" http://www.example.com/path/to/dir

wget options:

  • -m - Mirror everything, follow links
  • -np - Don't access parent directories (avoids downloading the whole site)
  • -P - Store files below $BCKDIR
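Since you mentioned in your follow-up that you have FTP access, wget can also mirror over FTP directly. A sketch, assuming placeholder details (ftp.example.com, myuser, mypass, and the remote path are all stand-ins for your own):

#!/bin/bash
# Mirror a remote FTP directory into a timestamped local folder
BCKDIR=$(date -u +"%Y%m%dT%H%M%SZ")
wget -m -np -P "$BCKDIR" --ftp-user=myuser --ftp-password=mypass ftp://ftp.example.com/path/to/dir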

If you have shell access, you can use rsync. One way to do it is to have this loop running in a screen(1) session, with automatic login handled by ssh-agent (a usage sketch follows the script):

#!/bin/bash
while :; do
    # Timestamped backup directory name in UTC, e.g. 20240101T120000Z
    BCKDIR=$(date -u +"%Y%m%dT%H%M%SZ")
    rsync -az user@hostname:/path/to/dir "$BCKDIR"
    sleep 86400 # Sleep 24 hours
done
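For example, assuming the loop above is saved as backup.sh (the filename is my choice), it could be started detached like this:

eval $(ssh-agent)                # start the agent for this shell
ssh-add ~/.ssh/id_rsa            # load your key so rsync can log in unattended
screen -dmS backup ./backup.sh   # run the loop in a detached screen session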

Not sure what OS you're using, but this should run fine under *NIX. And for MS Windows, there's Cygwin.

sunny256
A: 

If you have access to the webserver, a cron job that emails or FTPs out the archive would do the job; a sketch follows.
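A minimal sketch, assuming the uploads live under /var/www/uploads and that uuencode (from sharutils) and a mail command are available on the server (all of these are placeholders for your actual setup):

#!/bin/bash
# archive-and-mail.sh: tar up the uploads and mail the archive out
ARCHIVE=/tmp/uploads-$(date -u +%Y%m%d).tar.gz
tar czf "$ARCHIVE" /var/www/uploads
# uuencode is the classic way to attach a binary file to mail
uuencode "$ARCHIVE" "$(basename "$ARCHIVE")" | mail -s "Site backup" you@example.com
rm -f "$ARCHIVE"

And a crontab entry to run it every day at 03:00:

0 3 * * * /path/to/archive-and-mail.sh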

sybreon