I'm developing a server application that asynchronously accepts TCP connections (BeginAccept/EndAccept) and data (BeginReceive/EndReceive). The protocol requires an ACK to be sent whenever the EOM character is found; the sender won't transmit the next message until it receives that ACK. The accept and receive are working, but the sending app never receives the ACK (which is sent synchronously).

    private void _receiveTransfer(IAsyncResult result)
    {
        SocketState state = result.AsyncState as SocketState;
        int bytesReceived = state.Socket.EndReceive(result);

        if (bytesReceived == 0)
        {
            state.Socket.Close();
            return;
        }

        state.Offset += bytesReceived;
        state.Stream.Write(state.Buffer, 0, bytesReceived);

        if (state.Buffer[bytesReceived - 1] == 13)
        {
            // process message
            Messages.IMessage message = null;
            try
            {
                var value = state.Stream.ToArray();

                // do some work
                var completed = true;

                if (completed)
                {
                    // send positive ACK
                    var ackMessage = string.Format(ack, message.TimeStamp.ToString("yyyyMMddhhmm"), message.MessageType, message.Id, "AA", message.Id);
                    var buffer = ASCIIEncoding.ASCII.GetBytes(ackMessage);
                    int bytesSent = state.Socket.Send(buffer, 0, buffer.Length, SocketFlags.None);
                }
                else
                {
                    // send rejected ACK
                    var ackMessage = string.Format(ack, message.TimeStamp.ToString("yyyyMMddhhmm"), message.MessageType, message.Id, "AR", message.Id);
                    state.Socket.Send(ASCIIEncoding.ASCII.GetBytes(ackMessage));
                }
            }
            catch (Exception e)
            {
                // log exception


                // send error ACK
                if (message != null)
                {
                    var ackMessage = string.Format(ack, DateTime.Now.ToString("yyyyMMddhhmm"), message.MessageType, message.Id, "AE", message.Id);
                    state.Socket.Send(ASCIIEncoding.ASCII.GetBytes(ackMessage));
                }
            }
        }

        state.Socket.BeginReceive(state.Buffer, 0, state.Buffer.Length, SocketFlags.None, new AsyncCallback(_receiveTransfer), state);
    }

The state.Socket.Send call returns the correct number of bytes, but the data isn't received by the sending app until the socket is disposed.

Suggestions are appreciated.

A: 

How long are you giving it? The network stack can buffer, and that could delay transmission. From MSDN:

To increase network efficiency, the underlying system may delay transmission until a significant amount of outgoing data is collected. A successful completion of the Send method means that the underlying system has had room to buffer your data for a network send.

You might want to try flushing using the IOControl method.

edit

Actually, the IOControl flush will discard the buffered data rather than send it. You may also want to look at the Two Generals Problem to see whether your protocol has any inherent issues.

David Gladfelter
+2  A: 
  • You shouldn't do anything synchronous from async completion routines. Under load you can end up hijacking all IO completion threads from the thread pool and severely hurt performance, up to and including complete IO deadlock. So don't send the ACK synchronously from the async callback (see the sketch below).
  • Protocols and formats that use preambles are easier to manage than those that use terminators, i.e. write the length of the message in a fixed-size message header instead of detecting a terminator byte (13 in your case). Of course, this only applies if the protocol is under your control to start with.

As for your question: you didn't say whether the code you posted is also used on the client side.
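
For the first point, here is a minimal sketch of sending the ACK with BeginSend/EndSend instead of the blocking Send. It assumes the SocketState type from the question and an ackMessage string built as above; the helper name _sendAck is illustrative:

    // Hedged sketch: asynchronous ACK send, called from _receiveTransfer in place
    // of the blocking state.Socket.Send(...). Requires using System.Net.Sockets
    // and System.Text.
    private void _sendAck(SocketState state, string ackMessage)
    {
        byte[] buffer = Encoding.ASCII.GetBytes(ackMessage);
        state.Socket.BeginSend(buffer, 0, buffer.Length, SocketFlags.None, result =>
        {
            var s = (SocketState)result.AsyncState;
            try
            {
                // EndSend completes the async send; the completion runs on an
                // IO completion thread without blocking the receive callback.
                int bytesSent = s.Socket.EndSend(result);
            }
            catch (SocketException)
            {
                // Log the failure and drop the connection.
                s.Socket.Close();
            }
        }, state);
    }

This keeps the receive callback from blocking on the send and returns the IO completion thread to the pool sooner.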

Remus Rusanu
Unfortunately I can't change the protocol and the client is a verification app for testing protocol compliance.
Steve
A: 

Try setting the TCP_NODELAY socket option.
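
In .NET this is the Tcp/NoDelay option; a one-line sketch, assuming socket is the connected Socket returned by EndAccept:

    // Hedged sketch: disable Nagle's algorithm (TCP_NODELAY) on the accepted socket.
    socket.SetSocketOption(SocketOptionLevel.Tcp, SocketOptionName.NoDelay, true);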

pm100
A: 

Have you set the NoDelay property on the socket to true? When NoDelay is false (the default), the Nagle algorithm can buffer small sends for up to about 200 milliseconds before they go out. The point is to reduce network traffic by limiting the number of small packets on the wire. Setting NoDelay to true forces the data to be sent right away.
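
A minimal sketch of where this could go, assuming an accept callback along the lines of the question's BeginAccept/EndAccept code (the _acceptConnection name and listener handling are illustrative):

    private void _acceptConnection(IAsyncResult result)
    {
        var listener = (Socket)result.AsyncState;
        Socket client = listener.EndAccept(result);

        // Disable Nagle buffering so the small ACK packets go out immediately.
        client.NoDelay = true;

        // ... wrap client in a SocketState and start BeginReceive here ...

        // Keep accepting further connections.
        listener.BeginAccept(_acceptConnection, listener);
    }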

Jim Mischel