It's a noticeable difference that I've seen but cannot explain. These timers have their intervals set to 1 ms (the lowest available), but while the window is minimized they seem to tick faster. Could anyone explain this phenomenon to me? And if possible, explain how to reproduce the effect while the window is maximized?

+2  A: 

If I remember correctly, the minimum resolution you can get out of a System.Windows.Forms.Timer (which I assume is what you're using here) is 55 ms. Setting it to 1 ms essentially means that it ticks continuously.

Of course, a timer doesn't guarantee that ticks will arrive at exactly the specified interval. If your app is busy doing other things (like redrawing the screen), a tick may arrive a few milliseconds late, or significantly later under heavy load. At an interval of 1 second you won't really notice this, but at the minimum interval (55 ms), you might.

When the application is minimized, there are fewer other messages (painting, input, and so on) competing with the timer messages, so the ticks are delivered more promptly.
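To see that floor in practice, here is a minimal sketch (the class name TickSpacingForm and the use of Console.WriteLine are my own illustration, not from the question) that logs how far apart the Tick events of a 1 ms Forms.Timer actually arrive:

// Minimal sketch: measure the real spacing between Tick events of a
// System.Windows.Forms.Timer whose Interval is set to 1 ms.
using System;
using System.Diagnostics;
using System.Windows.Forms;

class TickSpacingForm : Form
{
    private readonly Timer _timer = new Timer();        // Windows.Forms.Timer: fires on the UI thread
    private readonly Stopwatch _watch = Stopwatch.StartNew();
    private long _lastMs;

    public TickSpacingForm()
    {
        _timer.Interval = 1;                            // requested 1 ms; actual spacing is much coarser
        _timer.Tick += (s, e) =>
        {
            long now = _watch.ElapsedMilliseconds;
            Console.WriteLine($"Tick after {now - _lastMs} ms");
            _lastMs = now;
        };
        _timer.Start();
    }

    [STAThread]
    static void Main() => Application.Run(new TickSpacingForm());
}

Run it once with the window restored and once minimized and compare the logged gaps; the difference you see is the message pump contention described above, not the timer itself speeding up.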

Aaronaught
+1 From me. Your memory is better than mine; I had to look up the timer interval!
Mitch Wheat
+3  A: 

Is this a Forms.Timer?

I doubt it is actually running faster; more likely, the Timer's Tick event is being handled in a more timely manner. Whilst minimised there will presumably be fewer messages going through the form window's message pump, which leaves a larger share of time to handle the timer messages. There is also the issue of the minimum Timer resolution.

If applicable, try using one of the other Timer types, such as System.Timers.Timer:

The Windows Forms Timer component is single-threaded, and is limited to an accuracy of 55 milliseconds. If you require a multithreaded timer with greater accuracy, use the Timer class in the System.Timers namespace.

Ref.
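As a rough illustration of that suggestion, here is a minimal sketch (the class name TimersTimerDemo and the Stopwatch-based measurement are my own illustration) using System.Timers.Timer, which fires on a thread-pool thread rather than through the form's message pump:

// Minimal sketch: the same interval measurement with a System.Timers.Timer.
// Note the effective resolution is still limited by the system clock
// (typically around 15 ms on Windows), not the requested 1 ms.
using System;
using System.Diagnostics;

class TimersTimerDemo
{
    static void Main()
    {
        var watch = Stopwatch.StartNew();
        long lastMs = 0;

        var timer = new System.Timers.Timer(1);         // interval in ms
        timer.Elapsed += (s, e) =>
        {
            long now = watch.ElapsedMilliseconds;
            Console.WriteLine($"Elapsed after {now - lastMs} ms");
            lastMs = now;
        };
        timer.AutoReset = true;                         // keep firing
        timer.Start();

        Console.ReadLine();                             // keep the process alive
        timer.Stop();
    }
}

If the Elapsed handler needs to update controls, assign the form to timer.SynchronizingObject so the event is raised on the UI thread instead of a thread-pool thread.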

Mitch Wheat