views: 151
answers: 2
I am looking for a command-line tool, or a library (preferably in Perl), to download an HTML page together with all of its components: external CSS, external JavaScript, images, Flash or other objects, etc.

I have not found a tool that does this. I could download the HTML page and parse it to find all the external links, but I'd rather not reinvent the wheel if an existing tool already does that.
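For the do-it-yourself route mentioned above, here is a minimal sketch of the "parse the HTML for external links" step, using only the Python standard library (the asker wanted Perl; `HTML::Parser` or `Mojo::DOM` would play the same role there). The URL and sample markup are made up for illustration:

```python
# Sketch: collect the URLs of a page's external assets (CSS, scripts,
# images, embedded objects) so they can be fetched separately.
# Assumes the HTML has already been downloaded; base_url is hypothetical.
from html.parser import HTMLParser
from urllib.parse import urljoin

class AssetCollector(HTMLParser):
    """Record absolute URLs of stylesheet, script, image, and object tags."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "stylesheet" and "href" in attrs:
            self.assets.append(urljoin(self.base_url, attrs["href"]))
        elif tag in ("script", "img", "embed") and "src" in attrs:
            self.assets.append(urljoin(self.base_url, attrs["src"]))
        elif tag == "object" and "data" in attrs:
            self.assets.append(urljoin(self.base_url, attrs["data"]))

html = """<html><head>
<link rel="stylesheet" href="/style.css">
<script src="app.js"></script>
</head><body><img src="logo.png"></body></html>"""

collector = AssetCollector("http://example.com/page.html")
collector.feed(html)
print(collector.assets)
```

Relative links are resolved against the page URL with `urljoin`, so the collected list can be fed straight to a downloader. Note this only sees assets present in the static markup; anything injected by JavaScript is invisible to it.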

+7  A: 

wget may serve your needs, although I do not know how well it handles CSS.

McWafflestix
I don't know how I missed the -r option!
Julien
Based on the discussion here: http://www.perlmonks.org/?node_id=596482, `wget --page-requisites` goes most of the way there. Julien, you may be able to contact the asker on that thread to see what he settled on.
Anonymous
A: 

wget will handle the straightforward cases easily, but it won't handle AJAX/XUL and similarly script-driven content, and for good reason: those resources are only discovered by executing the page's JavaScript, which wget does not do.