views: 639

answers: 4

I have a huge file with lots of links to files of various types to download. Each line is one download command like:

wget 'URL1'

wget 'URL2'

...

and there are thousands of those.

Unfortunately, some URLs look really ugly, for example: http://www.cepa.org.gh/archives/research-working-papers/WTO4%20(1)-charles.doc It opens OK in a browser, but confuses wget.

I'm getting an error:

./tasks001.sh: line 35: syntax error near unexpected token `1'

./tasks001.sh: line 35: `wget 'http://www.cepa.org.gh/archives/research-working-papers/WTO4%20(1)-charles.doc''

I've tried both URL and 'URL' ways of specifying what to download.

Is there a way to make a script like that run unattended? I'm OK if it just skips the files it couldn't download.

+1  A: 

I think you need to use double quotes (") rather than single quotes (') around the URL.

If that still doesn't work, try escaping the paren characters ( and ) with a backslash: \( and \)
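For example (a rough sketch using the URL from the question), either of these forms should keep the shell from tripping over the parentheses:

wget "http://www.cepa.org.gh/archives/research-working-papers/WTO4%20(1)-charles.doc"

wget http://www.cepa.org.gh/archives/research-working-papers/WTO4%20\(1\)-charles.doc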

Which shell are you using? Bash? zsh?

scraimer
Thank you, looks like it works. :)
A: 

This doesn't exactly answer your question but:

Both of the following commands work directly in a bash shell:

wget "http://www.cepa.org.gh/archives/research-working-papers/WTO4%20(1)-charles.doc"

and

wget 'http://www.cepa.org.gh/archives/research-working-papers/WTO4%20(1)-charles.doc'

Can you check to see if either of those work for you?

SCdF
A: 

What seems to be happening is that your shell is interpreting the ( characters itself. I would try using double quotes (") instead of single quotes (') around your URL.

If you wish to suppress errors, you can append > /dev/null under Unix to redirect standard output, or 2> /dev/null to redirect standard error. Under other operating systems it may be something else.
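For instance, a single line of the script might look like this (just a sketch; combining the double quotes with both redirections is one way to keep the run quiet):

wget "http://www.cepa.org.gh/archives/research-working-papers/WTO4%20(1)-charles.doc" > /dev/null 2> /dev/null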

Brad Smith
+2  A: 

Do not (ab)use the shell.

Save your URLs to some file (let's say my_urls.lst) and do:

wget -i my_urls.lst

Wget will handle quoting etc. on its own.
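If the existing script really only contains lines of the form wget 'URL' (tasks001.sh is the file name taken from the error message in the question), one rough way to build that list might be:

sed -e "s/^wget '//" -e "s/'$//" tasks001.sh > my_urls.lst

wget -i my_urls.lst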

ADEpt