tags:
views: 39
answers: 1

Hello all.

I'm rewriting a VB 6.0 program in C# .NET for an internship. It measures incoming data from a precision grinding machine and gives a readout on its accuracy. A lot of the meat of the program is done within a Timer object with an Interval of 10 (milliseconds). The program is unfinished, so it has yet to be tested on an actual machine, but I have a Debug simulation that generates a sine wave (to confirm that, upon receiving data, the dials and numbers all behave as expected).
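
Roughly, the tick handler is shaped like this (heavily simplified; the handler and helper names here are placeholders, not my real code):

    private int sampleCount = 0;

    private void measurementTimer_Tick(object sender, EventArgs e)
    {
    #if DEBUG
        // Simulated input: step a sine wave forward one sample per tick.
        double reading = Math.Sin(sampleCount * 0.1);
        sampleCount++;
    #else
        double reading = ReadFromMachine();   // stand-in for the real acquisition call
    #endif

        UpdateDialsAndNumbers(reading);       // stand-in for the dial/number updates
    }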

I'm using WinForms and its Timer (System.Windows.Forms.Timer), not System.Threading.Timer (though if that sounds like a better option, by all means I'm open to using it). I used System.Diagnostics.Stopwatch and its ElapsedTicks property to measure how many ticks, on average, it takes the Timer to cycle from beginning to end. On Form1 the timer generally takes around 19500 ticks. When it switches to Form2, that drops to around 15000, and it falls even lower, to around 11000, on Form3.
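
The measuring code is essentially this, wrapped around the body of the handler (simplified; the field names are just for illustration):

    private readonly Stopwatch cycleWatch = new Stopwatch();
    private long totalElapsedTicks = 0;
    private long cycleCount = 0;

    private void measurementTimer_Tick(object sender, EventArgs e)
    {
        cycleWatch.Reset();
        cycleWatch.Start();

        // ... all of the measurement and display work described above ...

        cycleWatch.Stop();
        totalElapsedTicks += cycleWatch.ElapsedTicks;
        cycleCount++;
        Debug.WriteLine("Average Stopwatch ticks per cycle: " + (totalElapsedTicks / cycleCount));
    }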

Now, it makes sense that Form3 evaluates faster, because it skips some parts of the code that don't need to run; Form2, however, should be running the entire Timer routine. What's stranger is that after switching to Form2 and then back to Form1, the timer stays at around 16000 ticks, and it gets slightly faster every time I switch between forms. The Interval stays a constant 10 the whole time.

I'm not sure what's causing it, or how I can force it to run at a constant rate. Furthermore, I'm not sure it even matters. It's been bugging me immensely, but it may not matter that the code evaluates faster once the program is receiving actual data from a machine.

If anyone has any suggestions as to why the timer is acting the way it is, how to limit it to a constant rate, or even whether I should worry about it at all, they would be greatly appreciated.

Thanks for your time.

+1  A: 

When you say "cycle", do you mean the time it takes to execute the event logic, or the time between timer ticks? The UI timer may not be that accurate because it operates on the UI thread: its ticks are delivered through the message loop, so anything else that thread is doing delays them.
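
To illustrate the difference, something like this measures both (a rough sketch only; the names are made up):

    // Lives in the form alongside the timer.
    private readonly Stopwatch executionWatch = new Stopwatch();          // time spent inside the handler
    private readonly Stopwatch betweenTicksWatch = Stopwatch.StartNew();  // time since the previous Tick

    private void timer1_Tick(object sender, EventArgs e)
    {
        Debug.WriteLine("Ticks between Tick events: " + betweenTicksWatch.ElapsedTicks);
        betweenTicksWatch.Reset();
        betweenTicksWatch.Start();

        executionWatch.Reset();
        executionWatch.Start();
        // ... the event logic ...
        executionWatch.Stop();
        Debug.WriteLine("Ticks spent executing: " + executionWatch.ElapsedTicks);
    }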

To be honest, it sounds like you have real-time requirements, something you aren't going to satisfy easily (or at all) here. You also have to contend with garbage collection running arbitrarily throughout.

I wouldn't worry about it, though I would possibly use a threaded timer, not a UI timer.
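
If you do try a threaded timer, the pattern is roughly this (a sketch only, assuming the usual designer-generated Form1; AcquireReading and UpdateDials are made-up names for your own code). The important part is that the callback runs on a thread-pool thread, so UI updates have to be marshalled back with Invoke/BeginInvoke:

    using System;
    using System.Threading;
    using System.Windows.Forms;

    public partial class Form1 : Form
    {
        private System.Threading.Timer workTimer;

        public Form1()
        {
            InitializeComponent();
            // Fires roughly every 10 ms on a thread-pool thread, not the UI thread.
            workTimer = new System.Threading.Timer(OnTimerTick, null, 0, 10);
        }

        private void OnTimerTick(object state)
        {
            // Do the heavy work off the UI thread.
            double reading = AcquireReading();   // made-up name for the data acquisition

            // WinForms controls may only be touched from the UI thread, so marshal back.
            if (IsHandleCreated)
            {
                BeginInvoke((MethodInvoker)(() => UpdateDials(reading)));
            }
        }

        protected override void OnFormClosed(FormClosedEventArgs e)
        {
            workTimer.Dispose();   // stop the callbacks once the form goes away
            base.OnFormClosed(e);
        }
    }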

Adam
Yeah, I was using "cycle" to refer to the time it took the timer to execute its code, not an actual CPU cycle. I'll spend a little time switching it over to the threaded timer and see if that makes a difference. It does seem to be more flexible. Thanks for the advice.
KChaloux
It won't make any difference to the actual time it takes to execute. Unfortunately you have no control over this. Arbitrary factors will get in the way - not least of which is that the OS isn't hard real-time. You shouldn't ever need to worry about the minute differences in code execution speeds.
Adam
If you do need that level of control, then C# running under the CLR will never work - look into real-time systems built with native code. That's ignoring, of course, the fact that most popular OSs aren't real-time either.
Adam
Yeah, unfortunately I don't have much of a choice as to what language I'm using here. The company decided on C#, so it's what I'm learning and working with.
KChaloux
I'm guessing then that the execution fluctuations are safe to ignore :)
Adam
Please mark this post as the answer if you're happy with it.
Adam
Indeed I am. Sorry about not doing it sooner. I'm actively working on the program.
KChaloux