I have a receive callback from an async client method that is supposed to keep calling Socket.BeginReceive
internally until the remote side is done sending data. The code is as follows:
private void ReceiveCallback(IAsyncResult ar)
{
    try
    {
        // Retrieve the state object and the client socket
        // from the asynchronous state object.
        StateObject state = (StateObject)ar.AsyncState;
        Socket client = state.workSocket;

        // Read the data that arrived from the remote device.
        int bytesRead = client.EndReceive(ar);

        if (bytesRead > 0)
        {
            // There might be more data, so store what we have so far
            // and queue another receive.
            state.sb.Append(Encoding.ASCII.GetString(state.buffer, 0, bytesRead));
            client.BeginReceive(state.buffer, 0, StateObject.BufferSize, SocketFlags.None,
                new AsyncCallback(ReceiveCallback), state);
        }
        else
        {
            // A zero-byte read means the remote side closed the connection.
            if (state.sb.Length > 1)
            {
                response = state.sb.ToString();
            }
            File.AppendAllText(DataDisplay.Properties.Settings.Default.rawDataLog, response + "\r\n");

            // Signal the main thread that all bytes have been received.
            receiveDone.Set();
        }
    }
    catch (Exception e)
    {
        Console.WriteLine(e.ToString());
    }
}
The problem is that when the code reaches the client.BeginReceive call, nothing happens. I know that sounds vague, so let me try to explain: I have set breakpoints throughout the code to see where execution could possibly be going, and I can't figure it out. I have tried the else branch, the beginning of ReceiveCallback, the line in the main code that the receiveDone.Set() call releases, the catch block, and so on, all to no avail.
The part that is most confusing is that when I run it against a test server I wrote, it works as I would expect. This code is essentially from the MSDN example on asynchronous client sockets, in case it looks familiar. It is only against the third-party device that I get stymied.
Any ideas what could be going on here?
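One thing I have considered trying, to at least make the stall observable, is giving the receiveDone wait a timeout instead of blocking forever. The sketch below is illustrative only (the demo class, the two-second timeout, and the messages are made up for this post, not from my real code); it shows the pattern of distinguishing "callback signaled" from "callback never ran":

```csharp
using System;
using System.Threading;

class WaitTimeoutDemo
{
    // Same role as the receiveDone event in the real code.
    static readonly ManualResetEvent receiveDone = new ManualResetEvent(false);

    static void Main()
    {
        // Simulate the failure mode: the device keeps the connection open,
        // EndReceive never returns 0, so receiveDone.Set() is never called.
        // A plain WaitOne() would block forever; a timed wait surfaces it.
        bool signaled = receiveDone.WaitOne(TimeSpan.FromSeconds(2));

        Console.WriteLine(signaled
            ? "Receive completed"
            : "Timed out - callback never observed a zero-byte read");
        // prints: Timed out - callback never observed a zero-byte read
    }
}
```

If the timed wait fires, that would suggest the device simply never closes its side of the connection, so the zero-byte read that drives the else branch never happens.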