views: 117
answers: 1

I want to download thousands of files from the web and save them locally. What is the most efficient way? It is important that failures time out within 10 seconds.

Is there a better way to copy one stream into another? Maybe a smaller buffer, say 1024 bytes at a time, would be more efficient for large files?

Dim w_req As System.Net.HttpWebRequest = CType(System.Net.WebRequest.Create("http://blah.blah.blah/blah.html"), System.Net.HttpWebRequest)
w_req.Timeout = 10000 ' milliseconds; applies to GetResponse
Dim w_res As System.Net.HttpWebResponse = CType(w_req.GetResponse(), System.Net.HttpWebResponse)
Dim br As New System.IO.BinaryReader(w_res.GetResponseStream())
Dim fs As New System.IO.FileStream(LocalFileName, IO.FileMode.CreateNew, IO.FileAccess.Write, IO.FileShare.None)
' The response stream is not seekable, so BaseStream.Length would throw;
' use the Content-Length reported by the server instead.
Dim b() As Byte = br.ReadBytes(CInt(w_res.ContentLength))
fs.Write(b, 0, b.Length)
fs.Close()
br.Close()
w_res.Close()
A: 

If you want to download multiple files, use multiple threads. That way, waiting for one server's response will not slow down your application.
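For example, one way to do this is to queue one work item per URL on the thread pool, which also caps concurrency so thousands of downloads do not mean thousands of threads. This is only a minimal sketch: the URL list is made up, and the body of the worker is assumed to be the saving code from the question.

```vbnet
Imports System.Net
Imports System.Threading

Module Downloader
    ' Hypothetical URL list; substitute your own thousands of files.
    Private ReadOnly urls As String() = {"http://blah.blah.blah/a.html", "http://blah.blah.blah/b.html"}
    Private remaining As Integer
    Private ReadOnly allDone As New ManualResetEvent(False)

    Sub Main()
        remaining = urls.Length
        For Each url As String In urls
            ' Each download runs on a thread-pool thread
            ThreadPool.QueueUserWorkItem(AddressOf DownloadWorker, url)
        Next
        allDone.WaitOne() ' block until every download has finished or failed
    End Sub

    Private Sub DownloadWorker(ByVal state As Object)
        Dim url As String = CStr(state)
        Try
            Dim req As HttpWebRequest = CType(WebRequest.Create(url), HttpWebRequest)
            req.Timeout = 10000 ' fail the request after 10 seconds
            Using res As HttpWebResponse = CType(req.GetResponse(), HttpWebResponse)
                ' ... save res.GetResponseStream() to disk as in the question ...
            End Using
        Catch ex As WebException
            ' timeouts and HTTP errors land here; log and continue with the rest
        Finally
            If Interlocked.Decrement(remaining) = 0 Then allDone.Set()
        End Try
    End Sub
End Module
```

A timed-out request surfaces as a WebException on its own worker, so one slow server never blocks the other downloads.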


The stream handling looks fine to me, but I'm not an expert in these matters. If you fetch very large files, it might reduce memory consumption to do it in chunks. In .NET 4.0, you would be able to use Stream.CopyTo.
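For reference, the .NET 4.0 chunked copy would look roughly like this (w_res and LocalFileName are borrowed from the question's code; the 4096-byte buffer size is an arbitrary choice):

```vbnet
Using rs As System.IO.Stream = w_res.GetResponseStream()
    Using fs As New System.IO.FileStream(LocalFileName, IO.FileMode.CreateNew, IO.FileAccess.Write)
        ' Copies 4 KB at a time, so the whole file is never held in memory
        rs.CopyTo(fs, 4096)
    End Using
End Using
```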

Related question: Best way to copy between two Stream instances - C#

Heinzi
...but bear in mind, writing correct, robust threaded software is more often than not non-trivial...
Mitch Wheat
Of course. Let's assume that I already know how to do the threads. (Which I probably do wrong, too, but never mind that.) Is the streaming done right?
Eyal
@Eyal: Updated my answer.
Heinzi