views:

11

answers:

1

Hi,

We have a device in the field which sends TCP packets to our server once a day. I have a Windows service which constantly listens for those packets. The code in the service is pretty much a carbon copy of the MSDN example (Asynchronous Server Socket Example); the only difference is that our implementation doesn't send anything back. It just receives the data, processes it, and closes the socket. The service simply starts a thread which immediately runs the code linked above.

The problem is that when I go to Task Manager on the server where it is running, the service seems to be using all of the CPU (it shows 99) all the time. I was notified of this by IT. But I don't understand what those CPU cycles are being used for; the thread just blocks on allDone.WaitOne(), doesn't it?
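For reference, the accept loop in that MSDN pattern has roughly this shape. Sketched below in Python rather than the original C# so it is self-contained and runnable; `begin_accept`, `accept_loop`, and the timings are illustrative stand-ins, not the actual service code. When the pattern works as intended, the thread really does block on the event and uses essentially no CPU:

```python
import threading
import time

all_done = threading.Event()   # analogue of the ManualResetEvent "allDone"
iterations = 0

def begin_accept():
    # analogue of listener.BeginAccept(...): simulate a client arriving
    # shortly afterwards, whose callback signals the event
    def accept_callback():
        time.sleep(0.05)       # the "client" connects after a short delay
        all_done.set()         # EndAccept done; wake the accept loop
    threading.Thread(target=accept_callback, daemon=True).start()

def accept_loop(stop):
    # mirrors the MSDN pattern: reset the event, start an async accept,
    # then block until the accept callback signals completion
    global iterations
    while not stop.is_set():
        iterations += 1
        all_done.clear()
        begin_accept()
        all_done.wait()        # the thread blocks here, using ~0% CPU

stop = threading.Event()
t = threading.Thread(target=accept_loop, args=(stop,), daemon=True)
t.start()
time.sleep(0.3)
stop.set()
all_done.set()                 # unblock the loop so it can see the stop flag
t.join(timeout=2)
print(iterations)              # a handful of iterations, not millions
```

Run over 0.3 seconds this loop only iterates a few times, because each pass parks on the wait; a thread stuck at 99% CPU is, by definition, not parked there.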

I also made a Console Application with the same code, and that works just fine, i.e. it uses CPU only when data is being processed. The task completes successfully every time in both cases, but the service implementation looks very inefficient. What could I be doing wrong here?

Thanks.

+1  A: 

Use a profiler to find out where your CPU time is spent. That should have been your first thought; profilers are one of the main tools for programmers.

It will tell you pretty much exactly which part of the code is burning the CPU.

The code, btw., looks terrible. Like an example of how to use async sockets, not like a good architecture for a multi-connection server. Sorry to say, you may have to rewrite this.
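One hypothetical failure mode a profiler would confirm or rule out (the question doesn't show the service code, so this is only a guess): if the async accept keeps failing, for instance because the listening socket was closed or never bound under the service account, and the exception is swallowed, the loop never reaches the blocking wait and busy-spins. Sketched again in Python with illustrative names:

```python
import threading

def begin_accept_broken():
    # simulate BeginAccept throwing because the listening socket is gone
    raise OSError("listening socket closed")

def spinning_accept_loop(stop, max_iters=100_000):
    # same loop shape as the async accept pattern, but the swallowed
    # exception means the blocking wait is never reached: a busy-spin
    # that pins one core at 100%
    all_done = threading.Event()
    n = 0
    while not stop.is_set() and n < max_iters:
        n += 1
        all_done.clear()
        try:
            begin_accept_broken()
        except OSError:
            continue            # error swallowed; loop restarts immediately
        all_done.wait()         # never reached
    return n

spins = spinning_accept_loop(threading.Event())
print(spins)   # hits the iteration cap almost instantly: the loop never blocks
```

A sampling profiler would show nearly all samples inside such a loop rather than inside a wait call, which is exactly the signature to look for.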

TomTom