My C# class must be able to process a high volume of events received over a TCP stream-style socket connection. The volume of event messages the class's socket receives from the TCP server is completely variable: sometimes it receives only one event message in ten seconds, and at other times it receives sixty event messages within a single second.
I am using Socket.ReceiveAsync to receive messages. ReceiveAsync returns true if the receive operation is pending, or false if data was already available and the operation completed synchronously. When the operation is pending, the Socket invokes my callback on an IO completion thread; otherwise I invoke the callback myself on the current (IO completion) thread. Furthermore, mixed in with the event messages I also receive responses to commands that were sent to this TCP server. Response messages are processed right away, individually, by firing off a threadpool worker for each one.
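For reference, my receive path looks roughly like this (a simplified sketch only; buffer management and message framing are stripped out, and names like StartReceive and OnReceiveCompleted are just illustrative):

```csharp
using System;
using System.Net.Sockets;

class Receiver
{
    private readonly Socket _socket;
    private readonly SocketAsyncEventArgs _args;

    public Receiver(Socket socket, byte[] buffer)
    {
        _socket = socket;
        _args = new SocketAsyncEventArgs();
        _args.SetBuffer(buffer, 0, buffer.Length);
        _args.Completed += OnReceiveCompleted; // raised only when the operation was pending
    }

    public void StartReceive()
    {
        // ReceiveAsync returns true if the operation is pending (the Completed
        // callback will run on an IO completion thread) and false if it finished
        // synchronously, in which case I invoke the handler myself.
        if (!_socket.ReceiveAsync(_args))
            OnReceiveCompleted(_socket, _args);
    }

    private void OnReceiveCompleted(object sender, SocketAsyncEventArgs args)
    {
        if (args.SocketError != SocketError.Success || args.BytesTransferred == 0)
            return; // connection closed or failed

        // ...frame the bytes, hand command responses to a threadpool worker,
        // queue event messages for batching (see below)...

        StartReceive(); // post the next receive
    }
}
```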
However, I would like to queue event messages until I have "enough" (N) of them OR until there are no more waiting on the wire, and only then fire off a threadpool worker to process the batch. I also want all events to be processed sequentially, so only one threadpool worker should be working on the batch at any given time.
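Something along these lines is what I'm picturing (a sketch only, not what I've settled on; BatchSize, the ConcurrentQueue, and the Interlocked flag are just placeholders):

```csharp
using System.Collections.Concurrent;
using System.Threading;

class EventBatcher
{
    private readonly ConcurrentQueue<byte[]> _pending = new ConcurrentQueue<byte[]>();
    private int _workerRunning;        // 0 = idle, 1 = a worker is draining the queue
    private const int BatchSize = 32;  // "enough" (N)

    // Called from the receive path for each event message.
    public void Enqueue(byte[] messageBuffer)
    {
        _pending.Enqueue(messageBuffer);

        // Kick a worker once N messages have queued up.
        if (_pending.Count >= BatchSize)
            TryStartWorker();
    }

    // Called from the receive path when ReceiveAsync goes pending,
    // i.e. there is nothing more on the wire right now.
    public void FlushIfIdle()
    {
        if (!_pending.IsEmpty)
            TryStartWorker();
    }

    private void TryStartWorker()
    {
        // Only one worker at a time, so events stay in order.
        if (Interlocked.CompareExchange(ref _workerRunning, 1, 0) == 0)
            ThreadPool.QueueUserWorkItem(_ => DrainQueue());
    }

    private void DrainQueue()
    {
        try
        {
            while (_pending.TryDequeue(out var buffer))
            {
                // copy the buffer into an event object, raise the event,
                // return the buffer to the ring-buffer pool...
            }
        }
        finally
        {
            Interlocked.Exchange(ref _workerRunning, 0);
            // A message may have arrived after the last TryDequeue; re-check.
            if (!_pending.IsEmpty)
                TryStartWorker();
        }
    }
}
```

FlushIfIdle is my guess at how to detect "no more on the wire": the only signal I have is ReceiveAsync going pending, so the receive path would call it at that point.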
The processor of event messages needs only to copy the message buffer into an object, raise an event, and then release the message buffer back into the ring-buffer pool. So my question is: what do you think is the best strategy to accomplish this?
Do you need more info? Let me know. Thanks!!