I have a single-threaded Windows Forms application written in VB.NET and targeting .NET Framework 1.1. The software communicates with external boards through a serial interface, and it mainly consists of a state machine that runs a series of tests, driven in a loop by a Timer with an Interval of 50 ms.
Feedback on the user interface is provided through custom events raised during the tests.
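To give an idea of the structure, here is a stripped-down sketch of the loop (names like TestRunner and ProgressChanged are placeholders, not the real code):

    ' Minimal sketch, assuming hypothetical names: a Windows.Forms.Timer
    ' fires every 50 ms and advances the state machine; UI feedback is
    ' raised through a custom event.
    Public Class TestRunner
        Public Event ProgressChanged(ByVal message As String)

        Private WithEvents pollTimer As New System.Windows.Forms.Timer
        Private state As Integer = 0

        Public Sub StartTests()
            pollTimer.Interval = 50
            pollTimer.Start()
        End Sub

        Private Sub pollTimer_Tick(ByVal sender As Object, ByVal e As EventArgs) _
                Handles pollTimer.Tick
            Select Case state
                Case 0
                    ' e.g. send a command to the board over the serial interface
                    RaiseEvent ProgressChanged("Command sent")
                    state = 1
                Case 1
                    ' e.g. poll for the board's reply, then finish the operation
                    RaiseEvent ProgressChanged("Operation completed")
                    state = 0
            End Select
        End Sub
    End Class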
The problem that is driving me crazy is that performance slowly degrades over time, in particular after about 1200-1300 test operations. Memory usage does not increase over time; only the CPU seems affected.
The strange thing is that when I target Framework 2.0 with the exact same code, I do not have this problem.
I know it is difficult without seeing the code, but do you have any suggestions on how I can approach the problem?
EDIT: I am really lost; after a period of intensive work the application starts slowing down. The selected row is the one related to the application's process, in case it helps.
EDIT2: Using the Windows Task Manager I noticed that the Handles counter increases by 1 at the end of each operation. I don't know if it is the cause, but the application starts to slow down when the handle count reaches about 1500. I checked that all the necessary RemoveHandler calls are made after each operation. Any idea?
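To check whether the slowdown really tracks the handle growth, I am logging the handle count and the duration of each operation with something like the following (a rough diagnostic sketch; the operation body is a placeholder, and note that Process.HandleCount and Environment.TickCount are available on .NET 1.1, while Stopwatch only exists from 2.0):

    Imports System.Diagnostics

    Module HandleDiagnostics
        ' Wrap a single test operation with timing and handle-count logging.
        Public Sub LogOperation()
            Dim startTicks As Integer = Environment.TickCount

            ' ... run one test operation here ...

            Dim elapsedMs As Integer = Environment.TickCount - startTicks
            Dim proc As Process = Process.GetCurrentProcess()
            proc.Refresh() ' drop cached values before reading HandleCount
            Debug.WriteLine("handles=" & proc.HandleCount.ToString() _
                & " elapsed=" & elapsedMs.ToString() & "ms")
        End Sub
    End Module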
EDIT3: I found that the handle leak is generated by the C++ library we are using to communicate with the serial device. It therefore happens both on .NET 1.1 and .NET 2.0. The difference, and that's the strange part, is that when targeting .NET 1.1 the application slows down/freezes, while on .NET 2.0 I reached more than 30000 handles without losing performance. I don't know yet whether the problem is really caused by these leaked handles; I will ask the developers of the C++ library to fix the leak and see if that solves the problem I am having on .NET 1.1.