I have a GUI app that connects to a sensor, gathers data, and processes it in the background using BackgroundWorker threads.
As it stands, I'm posting data to the GUI using the ProgressChanged event, which seemed to be working well to begin with. I've since increased the data rates and discovered a problem: if the software is left to run for a few minutes, processor usage ramps up until it reaches nearly 100% on both cores of my machine, at which point I get an error that reads:
Managed Debugging Assistant 'DisconnectedContext' has detected a problem in 'myapp.exe'. Additional Information: Context 0xe2ba0 is disconnected. Releasing the interfaces from the current context (context 0xe2d10). This may cause corruption or data loss.
I've read some stuff around the web suggesting that this can happen if a GUI app is unable to pump messages fast enough. I've noticed that I can provoke the same crash sooner by resizing the window rapidly (i.e. pumping a load more messages), which supports the theory, I think.
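For context, here is a minimal sketch of the setup described above; `ReadSensor` and `UpdateDisplay` are hypothetical stand-ins, not the real app's code:

```csharp
using System.ComponentModel;
using System.Windows.Forms;

class SensorForm : Form
{
    readonly BackgroundWorker worker =
        new BackgroundWorker { WorkerReportsProgress = true };

    public SensorForm()
    {
        worker.DoWork += (s, e) =>
        {
            while (!worker.CancellationPending)
            {
                object sample = ReadSensor();
                // Every call marshals a message onto the GUI thread's
                // message pump; at high data rates these can pile up
                // faster than the pump drains them.
                worker.ReportProgress(0, sample);
            }
        };
        // Runs on the GUI thread, once per ReportProgress call.
        worker.ProgressChanged += (s, e) => UpdateDisplay(e.UserState);
        worker.RunWorkerAsync();
    }

    object ReadSensor() => new object();   // stand-in for the sensor read
    void UpdateDisplay(object sample) { }  // stand-in for the GUI update
}
```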
So the questions here are:
- Whether anyone agrees with my hypothesis about message pumping?
- Whether there's another explanation?
- Is there some way I can prove it (peek at the number of messages in the queue, maybe)?
- Are these all bad code smells that suggest I'm going about this the wrong way?
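On the queue-peeking idea in the third bullet: as far as I can tell, Win32 doesn't expose a direct message count, but `GetQueueStatus` in user32 does report which kinds of messages are currently pending, which might at least show whether the queue is perpetually non-empty. A rough sketch (the flag constant and helper name are my own):

```csharp
using System.Runtime.InteropServices;

static class QueueProbe
{
    [DllImport("user32.dll")]
    static extern uint GetQueueStatus(uint flags);

    // QS_ALLINPUT: any message, input, paint, timer, etc.
    const uint QS_ALLINPUT = 0x04FF;

    // The high-order word of the return value holds the kinds of
    // messages currently in the calling thread's queue; non-zero
    // means something is waiting to be pumped.
    public static bool MessagesPending()
        => (GetQueueStatus(QS_ALLINPUT) >> 16) != 0;
}
```

This must be called from the GUI thread itself, since the queue is per-thread; it only says "something is pending", not how much.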
Any advice would be very gratefully received.