views:

52

answers:

3

I haven't done this in 3 or 4 years, but a client wants to downgrade their dynamic website into static HTML.

Are there any free tools out there to crawl a domain and generate working HTML files to make this quick and painless?

Edit: it is a ColdFusion website, if that matters.

+1  A: 

It's been a long time since I used it, but WebZIP was quite good.

It is not free, but at $35.00, I don't think your client will go broke.

A quick Google search for offline browsers turned up this and this, which look good.

Oded
+1  A: 

Try using httrack (or webhttrack/winhttrack, if you want a GUI) to spider the web site. It's free, fast, and reliable. It's also much more powerful than primitive downloaders like wget; httrack is designed for mirroring web sites.
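For reference, a minimal httrack invocation might look something like this (the URL and output directory are placeholders; adjust the filter to the client's actual domain):

```shell
# Mirror example.com into ./mirror, following only links on that domain.
# https://example.com/ and ./mirror are placeholders, not the OP's site.
httrack "https://example.com/" -O ./mirror "+example.com/*" -v
```

The `-O` flag sets the output path, and the `+example.com/*` filter keeps the spider from wandering off to external sites.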

Be aware that converting a dynamic page to static will cost you a lot of functionality. It's also not always possible: a dynamic site can generate an infinite number of distinct static pages.

Borealid
I wouldn't call `wget` primitive.
strager
@strager: Ok then, "relatively primitive". It's got a much more restricted feature-set when it comes to mirroring sites.
Borealid
@Borealid I'm not sure whether it can do everything httrack does, but don't underestimate `wget --mirror`! It can do a *lot* of things.
Pekka
+2  A: 

Getleft is a nice Windows client that can do this. It is very configurable and reliable.

`wget` can, too, with the `--mirror` option.
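A typical `wget` mirroring command looks like this (the URL is a placeholder for the client's site):

```shell
# Download a static copy of the site, rewriting links so the pages
# work when opened locally and adding .html extensions where needed.
wget --mirror --convert-links --adjust-extension --page-requisites \
     --no-parent https://example.com/
```

`--page-requisites` pulls in the CSS, images, and scripts each page needs, and `--no-parent` keeps the crawl from climbing above the starting directory.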

Pekka