Hi, I am moving a bunch of sites to a new server, and to make sure I don't miss anything, I want to be able to give a program a list of sites and have it download every page and image on each one. Is there any software that can do this? I may also use it to download a static copy of some WordPress sites, so I can just upload static files (some of my WP sites never get updated, so it's hardly worth setting up new databases, etc.).
A:
You'll probably get lots of opinions. Here is one: http://www.httrack.com/
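As a sketch, HTTrack also has a command-line interface; a minimal invocation might look like the following (the URL, output directory, and scan rule are placeholders for your own sites):

    httrack "http://www.example.com/" -O ./mirror "+*.example.com/*" -v

This mirrors the site into ./mirror, with the "+..." scan rule restricting the crawl to URLs under that domain; the WinHTTrack GUI exposes the same options if you prefer a graphical tool.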
leonm
2009-10-24 12:10:35
A:
wget is your tool.
On Unix/Linux systems it may already be installed; for Windows, download it from http://gnuwin32.sourceforge.net/packages/wget.htm.
It is a command-line tool with a number of options for controlling how it crawls the target website. Use "wget --help" to list all available options.
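As a sketch, a typical invocation for mirroring a site as a browsable static copy might be (example.com is a placeholder; exact option names can vary between wget versions, e.g. older builds use --html-extension instead of --adjust-extension):

    wget --mirror --page-requisites --convert-links --adjust-extension --no-parent http://www.example.com/

Here --mirror turns on recursive downloading with timestamping, --page-requisites fetches the images/CSS/JS needed to render each page, --convert-links rewrites links so the copy works when served locally, --adjust-extension saves dynamically generated pages (e.g. WordPress URLs) with an .html suffix, and --no-parent keeps the crawl from wandering above the starting directory. That combination is well suited to your WordPress use case, since the result is a set of static files you can upload without setting up a database.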
Adrien Plisson
2009-10-24 12:13:55