Our website relies on images from one of our manufacturers. The image directories are massive, and getting them via FTP is an all-day job. Now that we've downloaded the entire directory, we'd like to be able to periodically download files and directories that are new or have been changed since the last time we downloaded them. We're thinking about writing a script that checks the modification date of files and only downloads the latest versions.

Since this can't be the first time this problem has been encountered or solved, I thought I'd post this and see if anyone knows of existing solutions that can be applied here. An existing solution would need to be compatible with FreeBSD and/or LAMP.
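To illustrate the approach described above, curl's time-conditional download does exactly this per-file check. A minimal sketch, assuming placeholder credentials, hostname, and paths (username:password, ftp.example.com, images/product.jpg are not from the thread):

    #!/bin/sh
    # Sketch only: credentials, host, and paths are placeholders.
    # --time-cond (-z) with a filename asks the server for the file only if
    # it is newer than the local copy's modification time (MDTM over FTP).
    curl --user username:password \
         -z local/product.jpg \
         -o local/product.jpg \
         "ftp://ftp.example.com/images/product.jpg"

A real script would wrap this in a loop over a directory listing; the answers below point to tools that handle the recursion for you.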

+3  A: 

Is there any reason you can't use rsync?

peejaybee
+1: I would definitely go with rsync for that kind of job: it's fast, incremental, reliable, and works over an encrypted channel, which is always nice.
Pascal MARTIN
I only have FTP access to the server, and I don't believe rsync allows synchronization over FTP. Is synchronization over FTP possible with rsync?
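For reference, rsync only helps when the server side offers rsync or SSH access, which is not the case here. If it were, the invocation would be a one-liner along these lines (hostname and paths are placeholder assumptions):

    # Sketch only: host and paths are placeholders; requires SSH or an
    # rsync daemon on the remote side, not plain FTP.
    rsync -avz user@ftp.example.com:/images/ /usr/local/www/images/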
+2  A: 

with wput

dmityugov
In my case it is wget rather than wput, but that led me to the solution: wget --mirror ftp://username:password@hostname/path/ -t 100
Oh, indeed. I read it too late and too fast. It is also possible to use curl for that, although it is not as simple as with wget: http://curl.haxx.se/mail/archive-2005-11/0082.html
dmityugov
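Since the goal is to pick up changes periodically, the mirror command can be wrapped in a small script and run from cron. A sketch, with the URL, credentials, local path, log file, and schedule all as placeholder assumptions:

    #!/bin/sh
    # Sketch only: URL, credentials, and local path are placeholders.
    # --mirror turns on recursion and timestamping, so only new or changed
    # files are fetched on each run; -t 100 retries flaky transfers.
    wget --mirror -t 100 -P /usr/local/www/images \
        "ftp://username:password@ftp.example.com/images/" \
        >> /var/log/image-mirror.log 2>&1

    # Example crontab entry (run nightly at 02:00):
    # 0 2 * * * /usr/local/bin/mirror-images.sh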