Hi,
We have a device in the field that sends TCP packets to our server once a day. I have a Windows service that constantly listens for those packets. The code in the service is pretty much a carbon copy of the MSDN example (Asynchronous Server Socket Example); the only difference is that our implementation doesn't send anything back. It just receives, processes the data, and closes the socket. The service simply starts a thread that immediately runs that example code.
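For context, the accept loop in my service looks roughly like this (reproduced from memory of the MSDN sample, so details like the port number are placeholders, not my exact code):

```csharp
using System;
using System.Net;
using System.Net.Sockets;
using System.Threading;

public static class AsyncListener
{
    // Signals the main loop when an accept has completed.
    public static ManualResetEvent allDone = new ManualResetEvent(false);

    public static void StartListening()
    {
        // Port 11000 is just the value from the MSDN sample.
        IPEndPoint localEndPoint = new IPEndPoint(IPAddress.Any, 11000);
        Socket listener = new Socket(AddressFamily.InterNetwork,
            SocketType.Stream, ProtocolType.Tcp);

        listener.Bind(localEndPoint);
        listener.Listen(100);

        while (true)
        {
            // Reset the event, start an asynchronous accept, then block
            // until AcceptCallback signals that a connection arrived.
            allDone.Reset();
            listener.BeginAccept(new AsyncCallback(AcceptCallback), listener);
            allDone.WaitOne();   // the thread should sleep here, using no CPU
        }
    }

    public static void AcceptCallback(IAsyncResult ar)
    {
        // Let the main loop continue and accept the next connection.
        allDone.Set();

        Socket listener = (Socket)ar.AsyncState;
        Socket handler = listener.EndAccept(ar);
        // ... BeginReceive, process the data, then close the socket.
    }
}
```

As far as I understand it, `WaitOne()` should park the thread until `Set()` is called from the callback, so the loop shouldn't spin.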
The problem is that when I go to the Task Manager on the server where it is running, the service seems to be using all of the CPU (it says 99) all the time. I was notified of this by IT. But I don't understand what those CPU cycles are being used for; the thread just blocks on allDone.WaitOne(), doesn't it?
I also built a console application with the same code, and that works just fine, i.e. it uses CPU only when data is being processed. The task completes successfully in both cases, but the service implementation, from the looks of it, is very inefficient. What could I be doing wrong here?
Thanks.