I am building a C# application, using the server-client model, where the server is sending an image (100kb) to the client through a socket every 50ms...

I was using TCP, but besides the overhead of this protocol, sometimes the client ended up with more than one image on the socket, and I still haven't thought of a clever mechanism to split the bytes of each image (actually, I just need the most recent one).

I tried using UDP, but came to the conclusion that I can't send 100 KB datagrams, only 64 KB ones. And even so, I shouldn't use more than 1500 bytes; otherwise the packet would be fragmented along the network and the chances of losing parts of it would be greater.

So now I'm a bit confused. Should I continue using TCP and put some escape bytes at the end of each image so the client can separate them? Or should I use UDP, send datagrams of 1500 bytes, and come up with a mechanism for ordering and recovery?

The key goal here is transmitting the images very fast. I don't mind losing some on the way as long as the client keeps receiving newer ones.

Or should I use another protocol? Thanks in advance!

+13  A: 

You should consider using Real-time Transport Protocol (aka RTP).

The underlying IP protocol used by RTP is UDP, but it has additional layering to indicate time stamps, sequence order, etc.
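For illustration, the fixed RTP header (RFC 3550) packed by hand looks roughly like the sketch below; treat it as a sketch only, since in practice an RTP library would do this for you, and the method name is made up:

    // Sketch of packing the 12-byte RTP fixed header (RFC 3550) by hand.
    static byte[] BuildRtpHeader(ushort sequenceNumber, uint timestamp, uint ssrc,
                                 byte payloadType, bool lastPacketOfFrame)
    {
        var header = new byte[12];
        header[0] = 0x80;  // version 2, no padding, no extension, zero CSRCs
        header[1] = (byte)((lastPacketOfFrame ? 0x80 : 0x00) | (payloadType & 0x7F)); // marker + payload type
        header[2] = (byte)(sequenceNumber >> 8);   // sequence number (network byte order)
        header[3] = (byte)sequenceNumber;
        header[4] = (byte)(timestamp >> 24);       // timestamp
        header[5] = (byte)(timestamp >> 16);
        header[6] = (byte)(timestamp >> 8);
        header[7] = (byte)timestamp;
        header[8] = (byte)(ssrc >> 24);            // synchronization source identifier
        header[9] = (byte)(ssrc >> 16);
        header[10] = (byte)(ssrc >> 8);
        header[11] = (byte)ssrc;
        return header;
    }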

RTP is the main media transfer protocol used by VoIP and video-over-IP systems. I'd be quite surprised if you can't find existing C# implementations of the protocol.

Also, if your image files are in JPEG format you should be able to produce an RTP/MJPEG stream. There are quite a few video viewers that already have native support for receiving and displaying such a stream, since some IP webcams output in that format.

Alnitak
A: 

If the latest is more important than every picture, UDP should be your first choice.

But if you're dealing with frames larger than 64 KB, you'll have to do some form of re-framing yourself. Don't be too concerned about fragmentation: either you deal with it or the lower layer will, and you only want complete pictures anyway.

What you will want is some form of encapsulation with timestamps/sequences.
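As a rough illustration of that kind of encapsulation on the sending side, here's a sketch with a made-up header layout (4-byte frame id, 2-byte chunk index, 2-byte chunk count); it needs System, System.Net and System.Net.Sockets:

    // Sketch: split one image into UDP datagrams, each prefixed with
    // frameId (4 bytes) | chunkIndex (2 bytes) | chunkCount (2 bytes).
    static void SendFrame(UdpClient udp, IPEndPoint target, byte[] image, uint frameId)
    {
        const int ChunkSize = 1400;  // keep each datagram under a typical 1500-byte MTU
        int chunkCount = (image.Length + ChunkSize - 1) / ChunkSize;

        for (int i = 0; i < chunkCount; i++)
        {
            int offset = i * ChunkSize;
            int size = Math.Min(ChunkSize, image.Length - offset);

            var packet = new byte[8 + size];
            BitConverter.GetBytes(frameId).CopyTo(packet, 0);
            BitConverter.GetBytes((ushort)i).CopyTo(packet, 4);
            BitConverter.GetBytes((ushort)chunkCount).CopyTo(packet, 6);
            Buffer.BlockCopy(image, offset, packet, 8, size);

            udp.Send(packet, packet.Length, target);
        }
    }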

Simeon Pilgrim
A: 

I'd recommend using UDP if:

  • Your application can cope with an image or small burst of images not getting through,
  • You can squeeze your images into 65535 bytes.

If you're implementing a video conferencing application then it's worth noting that the majority use UDP.

Otherwise you should use TCP and implement an approach to delimit the images. One suggestion in that regard is to take a look at the RTP protocol; it's specifically designed for carrying real-time data such as VoIP and video.

Edit: I've looked around quite a few times in the past for a .NET RTP library and, apart from wrappers for non-.NET libraries or half-completed ones, I did not have much success. I just had another quick look and this one may be of use: ConferenceXP looks a bit more promising.

sipwiz
+2  A: 

First of all, your network might not be able to handle this no matter what you do, but I would go with UDP. You could try splitting up the images into smaller bits, and only display each image if you get all the parts before the next image has arrived.
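A sketch of what that could look like on the receiving end, assuming the same made-up header of frame id, chunk index and chunk count as elsewhere on this page; anything from an older frame is simply dropped, and DisplayImage stands in for whatever your client does with a finished frame:

    // Sketch: keep chunks of the newest frame only; display it once complete.
    // Assumes each datagram starts with frameId (4 bytes), chunkIndex (2), chunkCount (2).
    uint currentFrameId = 0;
    byte[][] chunks = null;
    int receivedChunks = 0;

    void OnDatagramReceived(byte[] packet)
    {
        uint frameId = BitConverter.ToUInt32(packet, 0);
        ushort index = BitConverter.ToUInt16(packet, 4);
        ushort count = BitConverter.ToUInt16(packet, 6);

        if (chunks == null || frameId > currentFrameId)
        {
            // A newer frame has started arriving: discard any partial older frame.
            currentFrameId = frameId;
            chunks = new byte[count][];
            receivedChunks = 0;
        }
        else if (frameId < currentFrameId)
        {
            return;  // late chunk from an older frame, ignore it
        }

        if (index < chunks.Length && chunks[index] == null)
        {
            chunks[index] = new byte[packet.Length - 8];
            Buffer.BlockCopy(packet, 8, chunks[index], 0, packet.Length - 8);
            receivedChunks++;
        }

        if (receivedChunks == chunks.Length)
        {
            // All parts arrived before the next frame started: rebuild and display it.
            var image = new System.IO.MemoryStream();
            foreach (var chunk in chunks) image.Write(chunk, 0, chunk.Length);
            DisplayImage(image.ToArray());  // hypothetical display call
            chunks = null;
        }
    }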

Also, you could use RTP as others have mentioned, or try UDT. It's a fairly lightweight reliable layer on top of UDP. It should be faster than TCP.

Zifre
+1  A: 

The other answers cover good options re: UDP or a 'real' protocol like RTP.

However, if you want to stick with TCP, just build yourself a simple 'message' structure to cover your needs. The simplest? Length-prefixing: first send the length of the image as 4 bytes, then send the image itself. Easy enough to write the client and server for.
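Roughly, a sketch of both ends over a NetworkStream (the 4-byte prefix here uses BitConverter, so both sides share the same byte order; needs System.IO and System.Net.Sockets):

    // Sender: write a 4-byte length prefix, then the image bytes.
    static void SendImage(NetworkStream stream, byte[] image)
    {
        stream.Write(BitConverter.GetBytes(image.Length), 0, 4);
        stream.Write(image, 0, image.Length);
    }

    // Receiver: read exactly 4 bytes for the length, then exactly that many bytes.
    static byte[] ReceiveImage(NetworkStream stream)
    {
        int length = BitConverter.ToInt32(ReadExactly(stream, 4), 0);
        return ReadExactly(stream, length);
    }

    // Read() can return fewer bytes than asked for, so loop until we have them all.
    static byte[] ReadExactly(NetworkStream stream, int count)
    {
        var buffer = new byte[count];
        int read = 0;
        while (read < count)
        {
            int n = stream.Read(buffer, read, count - read);
            if (n == 0) throw new IOException("Connection closed mid-image.");
            read += n;
        }
        return buffer;
    }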

Jonathan
I agree that is how you would frame this data over TCP, but it defeats the 'newest is best' idea: delaying new images because of lost old ones is exactly what isn't wanted.
Simeon Pilgrim