In C#, the UdpClient.Send() method requires me to pass, as one of its parameters, the number of bytes I am sending.

How do I calculate the number of bytes in a datagram before sending it?

A: 

Not sure what language you are using to implement this UDP client. In C++, the sizeof operator gives the number of bytes. Another approach would be to use strlen() or its Unicode variant and multiply the character count by the size of the character type.
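
Since the question is about C#, the counterpart of the character-count-times-character-size idea there is Encoding.GetByteCount, which reports how many bytes a string occupies in a given encoding. A rough sketch (the message text is only an example):

using System;
using System.Text;

class ByteCountExample
{
    static void Main()
    {
        string message = "Is anybody there?";   // example payload

        // ASCII is one byte per character, so this equals message.Length.
        int asciiBytes = Encoding.ASCII.GetByteCount(message);

        // UTF-8 may need more bytes than characters for non-ASCII text.
        int utf8Bytes = Encoding.UTF8.GetByteCount(message);

        Console.WriteLine("ASCII: {0} bytes, UTF-8: {1} bytes", asciiBytes, utf8Bytes);

        // In practice GetBytes() hands back an array whose Length is
        // already the number Send() expects.
        byte[] payload = Encoding.UTF8.GetBytes(message);
        Console.WriteLine("payload.Length = {0}", payload.Length);
    }
}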

msvcyc
I am sorry... I am using C#.
Avik
+4  A: 

You pass UdpClient.Send() an array of bytes (Byte[]), an integer size, and an IPEndPoint. If you are sending the entire byte array, nothing more and nothing less, as your datagram's payload, you can just use the Length property of arrays as follows:

using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

UdpClient udpClient = new UdpClient();

// Resolve the destination host and build the endpoint (port 11004 here).
IPAddress ipAddress = Dns.Resolve("www.contoso.com").AddressList[0];
IPEndPoint ipEndPoint = new IPEndPoint(ipAddress, 11004);

// GetBytes() returns exactly the bytes to send, so Length is the datagram size.
Byte[] sendBytes = Encoding.ASCII.GetBytes("Is anybody there?");
try
{
    udpClient.Send(sendBytes, sendBytes.Length, ipEndPoint);
}
catch (Exception e)
{
    Console.WriteLine(e.ToString());
}

Perhaps the confusion here is that you think you have to count the number of bits that will be sent out over the wire. What is actually required is just the size of the payload, i.e. the part of the provided byte array you actually want to send in this datagram. The library does the rest.
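
For instance, if you read data into a reusable buffer, you pass Send() only the number of bytes that are actually valid, not the buffer's full length. A minimal sketch (the file name, host, and port are made-up placeholders):

using System;
using System.IO;
using System.Net;
using System.Net.Sockets;

class PartialBufferExample
{
    static void Main()
    {
        UdpClient udpClient = new UdpClient();

        // Hypothetical destination; replace with your own host and port.
        IPEndPoint endPoint = new IPEndPoint(IPAddress.Loopback, 11004);

        byte[] buffer = new byte[512];
        using (FileStream fs = File.OpenRead("data.bin"))   // placeholder file
        {
            int bytesRead;
            while ((bytesRead = fs.Read(buffer, 0, buffer.Length)) > 0)
            {
                // Send only the bytes that were actually read, even though
                // the buffer itself is always 512 bytes long.
                udpClient.Send(buffer, bytesRead, endPoint);
            }
        }
    }
}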

Examples and info here.

Matt J