I'm trying to download images (.jpg) from a web folder using wget. I want to download only images that have a certain string in the file name. This works fine

wget -r -nd -A .jpg http://www.examplewebsite.com/folder/

but I'd like to include a string, e.g. "john". I tried

wget -r -nd -A .jpg '*john*' http://www.examplewebsite.com/folder/

with no success. Any ideas how to proceed? Could this be implemented in a shell script (bash)? I'm using Mac OS X 10.6.1.

A: 

My answer depends on whether you can get a directory listing from the webserver. If you can, you can save that HTML listing, extract the links, filter the list of links for ones that match your criterion, and then pass that filtered list to curl or wget.
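
Here is a minimal sketch of that listing approach, assuming the server returns a simple HTML index page at the folder URL; the URL and the "john" filter come from the question, and the links are assumed to be relative filenames (an absolute or URL-encoded href would need extra handling):

# Save the directory listing (URL taken from the question).
url="http://www.examplewebsite.com/folder/"
curl -s "$url" -o listing.html

# Pull out href targets, keep only .jpg links whose names contain "john",
# and download each one.
grep -o 'href="[^"]*john[^"]*\.jpg"' listing.html \
  | sed 's/^href="//; s/"$//' \
  | while read -r f; do
      wget "${url}${f}"
    done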

If you can't get a listing that way, you need to generate the list some other way, which will probably be manual. (I'm assuming the only access you have to this server is over HTTP.)
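
However you end up building the list, wget can read the URLs from a file, so the manual case reduces to one line (the filename urls.txt is just an example):

wget -i urls.txt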
