I have a short radio link with a data source attached, needing a throughput of 1280 Kbps over IPv6 using UDP with a stop-and-wait protocol. There are no other clients or noticeable noise sources in the area. How on earth can I calculate the best packet size to minimise overhead?

UPDATE

I thought it would be a good idea to show my working so far: IPv6 has a 40-byte header, so counting the ACK response, that's 80 bytes (640 bits) of overhead per packet. To meet the throughput requirement, 1280K/p packets need to be sent per second, where p is the packet payload size in bits.

So by my reckoning the total overhead per second is (1280K/p) * 640 bits, and throwing that into Wolfram gives a function with no minimum: it just decreases monotonically in p, so under this model there's no 'optimal' value, only "as big as possible".

I did a lot more maths trying to shoehorn bit-error-rate calculations in there, but came up against the same thing: if there's no minimum, how do I choose the optimal value?
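To make that concrete, here's a rough sketch of the kind of calculation I mean, with a bit error rate folded in (Python; the BER value and the 1500-byte size cap are made-up placeholders, not figures from my setup):

    # Sketch: expected on-air cost for stop-and-wait over a noisy link.
    # Assumptions (placeholders, not from the question): independent bit
    # errors at rate BER, corrupted frames are detected and retransmitted,
    # and ACKs are never lost.
    HEADER_BITS = 2 * 40 * 8   # IPv6 header on the data packet plus the ACK
    BER = 1e-5                 # made-up bit error rate

    def expected_cost(payload_bits):
        """Expected bits transmitted per payload bit delivered."""
        frame = payload_bits + HEADER_BITS
        p_ok = (1.0 - BER) ** frame           # whole frame must survive
        return frame / (payload_bits * p_ok)  # geometric retries cost 1/p_ok

    # Brute-force search over payload sizes up to a 1500-byte cap.
    best = min(range(8, 1500 * 8, 8), key=expected_cost)
    print(f"optimum ~ {best} bits ({best // 8} bytes) of payload, "
          f"costing {expected_cost(best):.4f} bits sent per payload bit")

The idea being that the per-packet header cost shrinks as p grows while the retransmission term 1/p_ok grows with frame size, and those two effects pull in opposite directions.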

A: 

Your best bet is to use a simulation framework for networks. This is a hard problem, and doesn't have an easy answer.

NS2 or SimPy can help you build a discrete-event simulation to find the optimal conditions, provided you can characterise your packet-loss model.
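For instance, here is a minimal SimPy sketch of a stop-and-wait sender, assuming a fixed link rate, a fixed ACK turnaround delay and a flat per-frame loss probability (all the numbers are placeholders):

    import random
    import simpy

    LINK_BPS = 1_280_000   # placeholder link rate
    HEADER_BITS = 640      # IPv6 headers on the data packet and the ACK
    LOSS_PROB = 0.02       # placeholder per-frame loss probability
    ACK_DELAY = 0.005      # placeholder propagation + ACK turnaround, seconds

    def sender(env, payload_bits, stats):
        """Stop-and-wait: serialise a frame, wait for the ACK, retry on loss."""
        while True:
            delivered = False
            while not delivered:
                frame = payload_bits + HEADER_BITS
                yield env.timeout(frame / LINK_BPS)  # serialisation delay
                yield env.timeout(ACK_DELAY)         # wait for ACK/timeout
                stats['sent_bits'] += frame
                delivered = random.random() > LOSS_PROB
            stats['good_bits'] += payload_bits

    env = simpy.Environment()
    stats = {'sent_bits': 0, 'good_bits': 0}
    env.process(sender(env, payload_bits=8 * 1024, stats=stats))
    env.run(until=10.0)  # simulate ten seconds
    print(f"goodput {stats['good_bits'] / 10.0 / 1e3:.0f} kbps, "
          f"efficiency {stats['good_bits'] / stats['sent_bits']:.1%}")

Sweeping payload_bits over a range then lets you compare goodput and efficiency directly, and you can swap the flat LOSS_PROB for whatever loss model actually fits your link.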

Yann Ramin
A: 

Always work with the largest packet size the network allows, then at deployment configure the network MTU to the most reliable setting.

Consider your latency requirements: how is the payload being generated? Do you need to wait for sufficient data before sending a packet, or can you send immediately?

The radio channel is already optimised for noise at the low-level packet layer, so you will usually be driven by other demands of the implementation, such as power requirements: sending in heavy batches versus a light continuous load.
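As a rough illustration of that batching trade-off (every number here is hypothetical):

    # Hypothetical radio power model: a fixed wake-up cost per burst plus an
    # on-air cost proportional to bits sent; larger bursts amortise the wake-up.
    WAKE_J = 0.002          # joules per radio wake-up (made up)
    TX_J_PER_BIT = 5e-9     # joules per transmitted bit (made up)
    DATA_BPS = 1_280_000    # offered load from the question

    def average_power_w(burst_bits):
        """Average power when the load is sent in bursts of burst_bits."""
        bursts_per_second = DATA_BPS / burst_bits
        return bursts_per_second * WAKE_J + DATA_BPS * TX_J_PER_BIT

    for burst in (1_000, 10_000, 100_000, 1_000_000):
        print(f"{burst:>9}-bit bursts -> {average_power_w(burst) * 1e3:.1f} mW")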

Steve-o
That's exactly what I would do in real life; unfortunately, university is not real life. Thanks.
Andrew Bolster