Just looking at various ways of FTPing a file in C#. I noticed that some examples do:

streamReader.ReadToEnd()

then convert to bytes, then send the file in one go.

while others do something like:

while (contentLength != 0)
    stream.write(buff, 0, contentLength);
    contentLength = fileStream.Read(buff, 0, buffLength);

Is sending the file 2048 bytes at a time intended for larger files, while the other method is fine for files in the 10-50K range?
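
For reference, the all-at-once version looks roughly like this (uses System.Net, System.IO, and System.Text; the URI and file names here are placeholders, not from any particular example):

    // All-at-once: the entire file is read into memory before anything is sent,
    // so the byte[] grows with the size of the file.
    var request = (FtpWebRequest)WebRequest.Create("ftp://example.com/remote.txt");
    request.Method = WebRequestMethods.Ftp.UploadFile;

    byte[] fileContents;
    using (var streamReader = new StreamReader("local.txt"))
        fileContents = Encoding.UTF8.GetBytes(streamReader.ReadToEnd());

    using (var requestStream = request.GetRequestStream())
        requestStream.Write(fileContents, 0, fileContents.Length);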

+1  A: 

It really depends on the size of the files the system you are designing will deal with. If you have very large files, the whole file has to be held in the buffer, and the server's memory will get used up quickly if you don't "chunk" the file into pieces when streaming.

Sheff
+1  A: 

If you can only read or write the file all at once, you have to allocate as much space as is required for the entire file. That can be cumbersome, especially when you don't know how big the file is going to be in advance. It's also bad for slower connections, because you won't be able to use any of the file until the whole thing is finished, which is obviously terrible for applications like streaming movies or audio. Buffering is a good general strategy to handle this sort of case.
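
A minimal sketch of that strategy, assuming source and destination are already-open streams (the 2 KB buffer size is arbitrary):

    // Fixed-size buffer: memory use stays constant no matter how large the file is,
    // and the receiver starts getting data immediately.
    var buffer = new byte[2048];
    int bytesRead;
    while ((bytesRead = source.Read(buffer, 0, buffer.Length)) > 0)
    {
        destination.Write(buffer, 0, bytesRead);
    }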

John Feminella
A: 

If you have huge files, it's better not to strain memory with one big read; instead, break the file down into pieces.

John G
A: 

The second snippet, which uses a smaller 2 KB buffer, is almost always going to be better in terms of memory, and the difference in time will probably be insignificant:

http://en.wikipedia.org/wiki/Space-time_tradeoff

Also, your code uses a while loop with two lines indented, but the second line is not actually part of the loop since it's not enclosed in braces.
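
A corrected version of that loop, priming the first read and adding braces so both statements are inside the body, would be roughly:

    int contentLength = fileStream.Read(buff, 0, buffLength);  // prime the first read
    while (contentLength != 0)
    {
        stream.Write(buff, 0, contentLength);                  // send what was just read
        contentLength = fileStream.Read(buff, 0, buffLength);  // refill the buffer
    }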

Kristopher Ives