
http://examples.oreilly.com/9780735615366/

I actually want to have all these files on my disk.

As you can see, there are many folders, each with different types of files.

And you cannot download "the-folder" directly... only individual files.

Is there any way to automate the process?

I will need regular expressions on the URLs to arrange them in a folder-like hierarchy.

What do I use... some scripting language like Python?

+4  A: 

Take a look at the wget tool. It can do exactly what you want.

Vlad Lazarenko
A: 

Try wget. It's a simple command-line utility that can do exactly that.

ruslik
A: 

A cheating answer is to use FTP:

ftp://examples.oreilly.com/pub/examples/9780735615366/

That is the FTP address for the example you gave.

Basiclife
If I use FTP, I am still not able to download whole folders, only single files.
basic
OK, FTP helped. All I did was: wget -m ftp://examples.oreilly.com/pub/examples/9780735615366/
basic
FTP requires an FTP client - Windows comes with one built in.
Basiclife
Open "My Computer" and copy/paste the FTP address into the address bar. It should open up looking like a normal Windows folder - you can copy/paste any directories you like.
Basiclife
+1  A: 

wget (a GNU command-line tool) will do this for you. The documentation for what you want to do is here: http://www.gnu.org/software/wget/manual/html_node/Recursive-Retrieval-Options.html
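A minimal sketch of those recursive-retrieval options, using the FTP address from the earlier answer. This is just one reasonable flag combination, not the only one; the command is built as a string first so you can inspect it before running the actual download.

```shell
# Flag meanings (see the wget manual's recursive-retrieval options):
#   -r            recurse into linked pages / FTP directories
#   -np           never ascend to the parent directory
#   -nH           don't create a local directory named after the host
#   --cut-dirs=2  drop the leading /pub/examples path components locally
URL="ftp://examples.oreilly.com/pub/examples/9780735615366/"
CMD="wget -r -np -nH --cut-dirs=2 $URL"

# Print the command instead of running it, so nothing is downloaded here.
echo "$CMD"
# Uncomment the next line to actually perform the download:
# $CMD
```

With these flags the files land under a local folder named 9780735615366/, mirroring the remote hierarchy, which is the "folder-like" arrangement the question asked for without any extra regex post-processing.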

Toucan