Hello guys,
I am developing a client-server application in C#: the server sends a byte array containing a bitmap to the client, the client displays it on screen and replies with an "OK", then the server sends the next image, and so on.
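Roughly, the send/ack cycle on the server looks like this (simplified sketch; the listener setup and the code that turns the Bitmap into imageData are left out):

    // Server-side sketch (needs using System; and using System.Net.Sockets;).
    // "handler" is the already-connected Socket, imageData is the byte[] with the bitmap.
    private void SendImage(Socket handler, byte[] imageData)
    {
        handler.BeginSend(imageData, 0, imageData.Length, SocketFlags.None,
                          SendCallback, handler);
    }

    private void SendCallback(IAsyncResult ar)
    {
        Socket handler = (Socket)ar.AsyncState;
        handler.EndSend(ar);

        // Block until the client's "OK" comes back, then move on to the next image.
        byte[] ack = new byte[8];
        handler.Receive(ack);
        // ... SendImage(handler, nextImageData);
    }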
The length of the image buffer varies, usually between 60 KB and 90 KB, but the size doesn't seem to matter. If I run the client and the server on the same machine over localhost, everything works fine: the server calls BeginSend, the client calls EndReceive, and the whole buffer arrives in one piece.
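And the receive side on the client is basically this (again simplified; DisplayImage is just a stand-in for my code that turns the bytes into a Bitmap on screen, and the buffer size is arbitrary):

    // Client-side sketch (needs using System; using System.Net.Sockets; and using System.Text;).
    private byte[] receiveBuffer = new byte[200 * 1024];   // comfortably bigger than one image

    private void StartReceive(Socket client)
    {
        client.BeginReceive(receiveBuffer, 0, receiveBuffer.Length, SocketFlags.None,
                            data_received, client);
    }

    private void data_received(IAsyncResult ar)
    {
        Socket client = (Socket)ar.AsyncState;
        int bytesRead = client.EndReceive(ar);   // over Wi-Fi this is 1460 on the first call

        // I assumed bytesRead would always be the complete image:
        DisplayImage(receiveBuffer, bytesRead);
        client.Send(Encoding.ASCII.GetBytes("OK"));

        StartReceive(client);                    // wait for the next image
    }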
However, I am now testing this on a wireless network, and this is what happens:
- The server sends the image.
- The client's data_received callback is called, but there are only 1460 bytes to read (that's the typical TCP segment payload, the MTU minus headers - why? shouldn't that kind of thing only matter for UDP?)
- The client's data_received callback is called again, this time with the rest of the buffer (whether that's 1,000 bytes or 100 KB)...
It's always like this: a first packet of 1460 bytes arrives, and then a second packet contains the rest.
I can work around this by concatenating the byte arrays from the two callbacks, but that doesn't seem right, and I'm not even sure why it's happening. Is it some restriction imposed by the network? And why doesn't C# wait for the whole buffer to arrive before calling my callback? I mean, it's TCP, so I shouldn't have to worry about this, right?
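By "concatenating the byte arrays" I mean something like this, where every chunk gets appended to a MemoryStream until the whole image has arrived (the expectedLength part is what feels wrong: the client would have to be told the total size somehow, e.g. by the server sending it before the image data itself):

    // Workaround sketch, reusing receiveBuffer and DisplayImage from above.
    // Needs using System.IO; on top of the namespaces already mentioned.
    private MemoryStream imageStream = new MemoryStream();
    private int expectedLength;   // assumption: the client learns this somehow

    private void data_received(IAsyncResult ar)
    {
        Socket client = (Socket)ar.AsyncState;
        int bytesRead = client.EndReceive(ar);

        imageStream.Write(receiveBuffer, 0, bytesRead);   // join this chunk with the previous ones

        if (imageStream.Length < expectedLength)
        {
            // Not complete yet: keep receiving into the same buffer.
            client.BeginReceive(receiveBuffer, 0, receiveBuffer.Length,
                                SocketFlags.None, data_received, client);
            return;
        }

        DisplayImage(imageStream.ToArray(), (int)imageStream.Length);
        imageStream.SetLength(0);                         // reset for the next image
        client.Send(Encoding.ASCII.GetBytes("OK"));
        client.BeginReceive(receiveBuffer, 0, receiveBuffer.Length,
                            SocketFlags.None, data_received, client);
    }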
Anyway, any help would be great!
Cheers