I need to write a line in my script to download a directory (containing about 10 images) from a URL like abc.com/Image/images/. I am trying the wget command as below in the script:

wget -e robots=off -r -l1 --no-parent -A.gif http://abc.com/Image/images/

or

wget -A "*.gif" http://abc.com/Image/images/

but it gives this error:

HTTP request sent, awaiting response... 403 Forbidden
11:25:12 ERROR 403: Forbidden.
Removing abc.com/Images/images/index.html since it should be rejected.
unlink: No such file or directory

I am already using wget to download a single file from the same URL and it works fine, but when I try to download the directory it fails.

Can anyone help me quickly with this?

+1  A: 

I suppose you can't list the directory on the webserver. wget discovers the contents of the directory you want to download from its web listing. If listing is not allowed (403 Forbidden), then wget can't find out which files exist and can't download them. Of course, I may be wrong.
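A quick way to check this is to request the directory URL and a single file and compare the status codes (the host and paths here are just the placeholders from the question):

curl -I http://abc.com/Image/images/          # directory listing: expect "403 Forbidden"
curl -I http://abc.com/Image/images/abc.gif   # single known file: should return "200 OK"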

Shade
But if I specify the image name in the URL, e.g. http://abc.com/Image/images/abc.gif, then I am able to download that image, and the same works for the other images in the folder. It is only when I try to get the whole folder that it gives the error.
ha22109
Indeed. You see, when you give a direct URL as a parameter, wget has all the information needed to download that file. But a directory name does not give information about its contents. So, when you pass wget a directory name, it requests that directory (as if it were a file). When one requests a directory from a webserver, the webserver tries to serve a listing of that directory if an index.html (or other index file) is not present. Since that is not allowed on this webserver, wget doesn't get a listing and doesn't know what to download.
Shade
A: 

This is not possible.

wget cannot know which files exist on the remote server.

Instead, if the server has directory browsing enabled, or if the images are linked from some other page, you can crawl that page.
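For example, if the images happen to be linked from a gallery page (gallery.html here is only a guess, not a confirmed path), a recursive wget limited to one level and to GIFs might work:

wget -e robots=off -r -l1 --no-parent -A '*.gif' http://abc.com/Image/gallery.html
# follows the links on that page one level deep and keeps only the .gif files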

SLaks
+1  A: 

If you know the names (you have to, since the remote dir is not "open" and its contents can't be listed), consider putting them in a file and looping over it with a for or while loop (in bash, PowerShell, or whatever you have; see the sketch after the curl example below). If the names follow a pattern, consider using curl instead; with it you can do things like

curl http://asdf.com/what/ever/image/img[00-99].gif -o img#1.gif

to download images with names img00.gif, img01.gif and so on.
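If you go the list-of-names route instead, a minimal bash sketch could look like this (filenames.txt is a hypothetical file with one image name per line):

while read -r name; do
    wget "http://abc.com/Image/images/$name"   # fetch each image named in the list
done < filenames.txt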

ShinTakezou