views:

1433

answers:

3

I have written a program that establishes a network connection with a remote computer using TcpClient. I am using it to transfer files in 100 KB chunks to a remote .NET application, which in turn writes them to the hard drive. All file transfers work fine except ZIP files: curiously, the reassembled file is always 98 KB. Is there some dark secret to ZIP files that prevents them from being handled in this manner? Again, all other file transfers work fine: image, xls, txt, chm, exe, etc.

Confused

A: 

It might be that you are overwriting the existing file (instead of appending to it) with each chunk received? In that case the file's final size will be at most the size of one chunk.

But without any code, it's difficult to tell the reason for the problem.
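To illustrate that guess (the method name and file path here are hypothetical, not the poster's actual code): opening the output file with `FileMode.Create` on every chunk truncates it each time, so only the last chunk survives, whereas `FileMode.Append` accumulates them.

```csharp
using System.IO;

// Sketch only: "chunk" is one block received from the network.
void WriteChunk(string path, byte[] chunk, int count)
{
    // Bug variant: FileMode.Create truncates the file on every call,
    // so the finished file is never larger than a single chunk.
    // using (var fs = new FileStream(path, FileMode.Create)) ...

    // Correct: append each received chunk to the end of the file.
    using (var fs = new FileStream(path, FileMode.Append, FileAccess.Write))
    {
        fs.Write(chunk, 0, count);
    }
}
```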

M4N
+2  A: 

Well, you haven't shown any code so it's kinda tricky to say exactly what's wrong.

The usual mistake is to assume that Stream.Read reads all the data you ask it to instead of realising that it might read less, but that the amount it actually read is the return value.

In other words, the code shouldn't be:

byte[] buffer = new byte[input.Length];
// WRONG: Read may return fewer bytes than requested; the return value
// is ignored here, so the rest of the buffer is never filled.
input.Read(buffer, 0, buffer.Length);
output.Write(buffer, 0, buffer.Length);

but something like:

byte[] buffer = new byte[32 * 1024];
int bytesRead;
// Read returns the number of bytes actually read; 0 signals end of stream.
while ((bytesRead = input.Read(buffer, 0, buffer.Length)) > 0)
{
    output.Write(buffer, 0, bytesRead);
}

But that's just a guess. If you could post some code, we'd have a better chance of figuring it out.

Jon Skeet
+1  A: 

The actual code would be helpful.

Are you using BinaryReader / BinaryWriter?

(i.e. byte-based rather than text-based — a ZIP file is pure binary data, and pushing it through a text-based reader or writer will corrupt it.)

You could try using a hex file compare (e.g. Beyond Compare) to compare the original and copy and see if that gives you any clues.
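For reference, a hedged sketch of moving a file over the TcpClient connection with BinaryWriter / BinaryReader (the method names and the length-prefix protocol are assumptions, not the poster's actual code):

```csharp
using System.IO;
using System.Net.Sockets;

// Sender: length-prefix the file so the receiver knows how much to expect.
void SendFile(NetworkStream net, string path)
{
    byte[] data = File.ReadAllBytes(path);
    var writer = new BinaryWriter(net);
    writer.Write(data.Length);   // 4-byte length prefix
    writer.Write(data);          // raw bytes, no text encoding involved
    writer.Flush();
}

// Receiver: ReadBytes loops internally until it has the requested count
// (unlike a single Stream.Read call, which may return fewer bytes).
void ReceiveFile(NetworkStream net, string path)
{
    var reader = new BinaryReader(net);
    int length = reader.ReadInt32();
    byte[] data = reader.ReadBytes(length);
    File.WriteAllBytes(path, data);
}
```

A hex compare of the original and the copy would then show immediately whether the copy is a truncated prefix (an appending/partial-read bug) or differs throughout (an encoding bug).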

nzpcmad