Hello all.
I'm rewriting a VB 6.0 program in C# .NET for an internship. It measures incoming data from a precision grinding machine and gives a readout on its accuracy. Most of the real work in the program happens inside a Timer object with an interval of 10 ms. The program is unfinished, so it hasn't been tested on an actual machine yet, but I have a debug simulation that generates a sine wave, to make sure that the dials and numbers all behave as expected once data is coming in.
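Here's roughly what that simulation looks like. This is a simplified sketch, not my actual code; the class and member names (SimulationForm, UpdateDials, etc.) are just stand-ins:

```csharp
using System;
using System.Windows.Forms;

public class SimulationForm : Form
{
    private readonly Timer simTimer = new Timer(); // System.Windows.Forms.Timer
    private double phase;

    public SimulationForm()
    {
        simTimer.Interval = 10;          // 10 ms, same as the real program
        simTimer.Tick += SimTimer_Tick;
        simTimer.Start();
    }

    private void SimTimer_Tick(object sender, EventArgs e)
    {
        // Debug simulation: fake a reading with a sine wave so the dials
        // and numeric readouts can be exercised without a real machine.
        double reading = Math.Sin(phase);
        phase += 0.05;

        UpdateDials(reading);            // placeholder for the real UI update
    }

    private void UpdateDials(double value)
    {
        // ...drive the gauges / labels here...
    }
}
```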
I'm using WinForms and the System.Windows.Forms.Timer, not System.Threading.Timer (though if that sounds like a better option, by all means I'm open to using it). I used System.Diagnostics.Stopwatch and its ElapsedTicks to time how many ticks, on average, the Timer takes to cycle from beginning to end. On Form1 the timer generally takes around 19,500 ticks. When I switch to Form2, that drops to around 15,000, and even lower, to around 11,000, on Form3.
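For reference, this is a simplified sketch of how I'm taking that measurement (again, not the literal code; DoMeasurementWork stands in for the real body of the tick handler). Note that ElapsedTicks are Stopwatch ticks (Stopwatch.Frequency per second), not milliseconds:

```csharp
using System;
using System.Diagnostics;
using System.Windows.Forms;

public class TimedForm : Form
{
    private readonly Timer timer1 = new Timer();
    private readonly Stopwatch sw = new Stopwatch();
    private long totalTicks;
    private int cycles;

    public TimedForm()
    {
        timer1.Interval = 10;
        timer1.Tick += Timer1_Tick;
        timer1.Start();
    }

    private void Timer1_Tick(object sender, EventArgs e)
    {
        sw.Restart();

        DoMeasurementWork();   // placeholder for the real per-tick processing

        sw.Stop();
        totalTicks += sw.ElapsedTicks;
        cycles++;
        Debug.WriteLine("average Stopwatch ticks per cycle: " + (totalTicks / cycles));
    }

    private void DoMeasurementWork()
    {
        // ...process the incoming (or simulated) data here...
    }
}
```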
Now, it makes sense that Form3 evaluates faster, because it skips some parts of the code that don't need to run there; Form2 should be running the entire Timer body. What's stranger is that after switching to Form2 and then back to Form1, the timer stays at around 16,000 ticks, and it gets slightly faster every time I switch between forms. The Interval stays a constant 10 the whole time.
I'm not sure what's causing it, or how I can force it to run at a constant rate. I'm also not sure whether it even matters. It's been bugging me immensely, but I don't know if the faster evaluation will make any difference once the program is receiving actual data from a machine.
If anyone has any suggestions as to why the timer is acting this way, how to limit it to a constant rate, or even whether I should worry about it at all, they would be greatly appreciated.
Thanks for your time.