Edit: This is done in the .NET Compact Framework; I don't have access to WebClient, so it has to be done with HttpWebRequest.
I am creating a download manager application that supports concurrent downloads (more than one download at once), can report the percentage completed, and can resume downloads.
This means I am reading some bytes into a buffer and then writing the buffer to disk. I just wanted to check what the recommended algorithm/procedure for this is.
This is what I have thus far for the main download method:
private void StartDownload()
{
    HttpWebRequest webReq = null;
    HttpWebResponse webRes = null;
    Stream fileBytes = null;
    FileStream saveStream = null;
    try
    {
        webReq = (HttpWebRequest)WebRequest.Create(_url);
        webReq.Headers.Add(HttpRequestHeader.Cookie, "somedata");
        webRes = (HttpWebResponse)webReq.GetResponse();

        byte[] buffer = new byte[4096];
        long bytesRead = 0;
        long contentLength = webRes.ContentLength;

        // If a partial file is already on disk, treat its length as the
        // number of bytes already downloaded so the download can resume.
        if (File.Exists(_filePath))
        {
            bytesRead = new FileInfo(_filePath).Length;
        }

        fileBytes = webRes.GetResponseStream();
        fileBytes.Seek(bytesRead, SeekOrigin.Begin);
        saveStream = new FileStream(_filePath, FileMode.Append, FileAccess.Write);

        // Copy the response to disk in buffer-sized chunks.
        while (bytesRead < contentLength)
        {
            int read = fileBytes.Read(buffer, 0, buffer.Length);
            if (read == 0)
            {
                break; // the response ended early
            }
            saveStream.Write(buffer, 0, read);
            bytesRead += read;
        }

        //set download status to complete
        //_parent
    }
    catch
    {
        // ThreadState is a flags enum, so test the AbortRequested bit
        // instead of comparing for equality.
        if ((Thread.CurrentThread.ThreadState & ThreadState.AbortRequested) == 0)
        {
            //Set status to error.
        }
    }
    finally
    {
        // Close() also disposes these, but guard against nulls in case the
        // request failed before the streams were created.
        if (saveStream != null) saveStream.Close();
        if (fileBytes != null) fileBytes.Close();
        if (webRes != null) webRes.Close();
        saveStream = null;
        fileBytes = null;
        webRes = null;
        webReq = null;
    }
}
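For context, the plan is for each download to run StartDownload on its own thread so more than one can run at once (that's why the catch block checks for AbortRequested). Roughly something like this, where DownloadJob and _thread are just placeholder names and not my real class:

using System.Threading;

// Rough sketch only: one worker thread per download so several can run
// concurrently. DownloadJob and _thread are placeholder names.
public class DownloadJob
{
    private Thread _thread;

    public void Start()
    {
        // StartDownload is the method shown above.
        _thread = new Thread(new ThreadStart(StartDownload));
        _thread.Start();
    }

    public void Cancel()
    {
        if (_thread != null)
        {
            // Aborting lands in the AbortRequested check in the catch block.
            _thread.Abort();
        }
    }

    private void StartDownload()
    {
        // ...the method shown above...
    }
}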
Should I be using a larger buffer? Should I be writing the buffer to file this often (every 4 KB)? Should there be some thread sleeping in there so that not all of the CPU is used? Reporting a progress change every 4 KB seems excessive, so I was planning to report it every 64 KB downloaded, something like the sketch below.
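This is roughly what I had in mind for the read loop, where ReportProgress is just a placeholder for however the UI actually gets notified:

// Rough sketch: same read loop as above, but only report progress
// after every 64 KB rather than after every 4 KB read.
const long ProgressInterval = 64 * 1024;
long lastReported = bytesRead;

while (bytesRead < contentLength)
{
    int read = fileBytes.Read(buffer, 0, buffer.Length);
    if (read == 0)
    {
        break;
    }
    saveStream.Write(buffer, 0, read);
    bytesRead += read;

    if (bytesRead - lastReported >= ProgressInterval || bytesRead >= contentLength)
    {
        lastReported = bytesRead;
        int percent = (int)(bytesRead * 100 / contentLength);
        ReportProgress(percent); // placeholder for the real notification
    }
}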
I'm looking for some general tips, or anything that is wrong with my code so far.