views: 1648

answers: 6

I want to download a lot of URLs in a script, but I do not want to save the ones that lead to HTTP errors.

As far as I can tell from the man pages, neither of them provides such functionality. Does anyone know of another downloader that does?

A: 

If you're still looking after a while, hacking up such a downloader in your language of choice shouldn't be too difficult.

William Keller
+3  A: 

I think the -f option to curl does what you want.
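For example (the URL here is just a placeholder), something along these lines should make curl exit with a non-zero status on server errors (HTTP 400 and above):

# -f / --fail: exit non-zero on server errors; -s: silent; -O: keep the remote file name
curl -f -s -O "http://example.com/file.txt" || echo "download failed" >&2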

Thomas
No, it's only related to the verbosity of errors. Once used, errors are not reported (but the files are still saved as usual). Thanks anyway.
kiwi
+1  A: 

I voted up the curl -f answer above, but I should point out that it isn't failsafe. I think this is a perfect opportunity for learning Perl or Ruby (or extending your skills) by writing your own download program.
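Whatever language you choose, the core of such a program is the same: fetch, check the HTTP status, and keep the file only on success. A rough shell sketch of that idea (the URL and output file name are placeholders):

# fetch quietly, write the body to file.txt, and capture the numeric HTTP status
status=$(curl -s -o file.txt -w "%{http_code}" "http://example.com/file.txt")
# treat 400 and above (and 000, i.e. no response at all) as failure and discard the file
if [ "$status" -ge 400 ] || [ "$status" -eq 0 ]; then
    rm -f file.txt
    echo "request failed with status $status" >&2
fi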

jtimberman
A: 

I'm convinced; I'll write my own script.

kiwi
+4  A: 

To do this with wget:

On Unix you can do:

wget -O /dev/null example.com

while on Windows the equivalent is:

wget -O NUL example.com
Tristan Havelick
A: 

A one-liner I just set up for this very purpose:

(works only with a single file, might be useful for others)

A=$$; ( wget -q "http://foo.com/pipo.txt" -O "$A.d" && mv "$A.d" pipo.txt ) || { rm -f "$A.d"; echo "Removing temp file"; }

This will attempt to download the file from the remote host. If there is no error, the temporary file is renamed to pipo.txt; in all other cases, the temporary file is removed.
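Since the question is about downloading many URLs, the same temp-file-then-rename pattern can be wrapped in a loop. A rough sketch, assuming a hypothetical urls.txt with one URL per line, where each URL ends in the desired file name:

while read -r url; do
    name=$(basename "$url")      # final file name, taken from the URL
    tmp="$$.$name.part"          # per-process temporary name
    if wget -q "$url" -O "$tmp"; then
        mv "$tmp" "$name"        # success: keep the file under its real name
    else
        rm -f "$tmp"             # error: discard whatever was written
    fi
done < urls.txt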

Oct