views:

241

answers:

2

Hi

This is a pretty simple question, really. If I use setInterval(something, 1000), can I be completely sure that after, say, 31 days it will have triggered "something" exactly 60*60*24*31 times? Or is there a risk of so-called drifting?

+6  A: 

Short answer: No, you can't be sure. Yes, it can drift.

Long answer: John Resig on the Accuracy of JavaScript Time and How JavaScript Timers Work.

From the second article:

In order to understand how the timers work internally, there's one important concept that needs to be explored: timer delay is not guaranteed. Since all JavaScript in a browser executes on a single thread, asynchronous events (such as mouse clicks and timers) are only run when there's been an opening in the execution.

Both articles (and anything on that site) are great reading, so have at it.
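To see the single-thread effect in practice, here's a minimal sketch (mine, not from the linked articles) that you can paste into a browser console. The timer is scheduled for 100 ms, but a synchronous busy loop keeps the thread occupied, so the callback can't fire until roughly 500 ms have passed:

var start = +new Date();

setTimeout(function () {
    // Scheduled for 100ms, but it only runs once the busy loop below has finished.
    console.log('fired after ' + (new Date() - start) + 'ms');
}, 100);

// Busy-wait for roughly 500ms on the same thread, blocking the timer.
while (new Date() - start < 500) { /* blocking */ }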

Paolo Bergantino
+2  A: 

Here's a benchmark you can run in Firefox:

var start = +new Date();
var count = 0;
setInterval(function () {
    // Logs: the offset within the current second (ideally near 0 or 1000),
    // how many times the callback has actually run,
    // and how many times it should have run by now.
    console.log((new Date() - start) % 1000,
        ++count,
        Math.round((new Date() - start) / 1000));
}, 1000);

The first value should be as close to 0 or 1000 as possible (any other value shows how far off the mark the timing of the trigger was). The second value is the number of times the code has been triggered, and the third value is how many times the code should have been triggered. You'll note that if you bog down your CPU it can get quite far off the mark, but it seems to correct itself. Try running it for a longer period of time and see how it holds up.
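If what you actually need over 31 days is an accurate count rather than accurate ticks, one workaround (my own sketch, not something the benchmark claims) is to derive the count from the wall clock on every tick instead of counting callback invocations:

var start = +new Date();

setInterval(function () {
    // Recompute the elapsed-seconds count from the clock each time.
    // This stays correct even if individual ticks fire late or get delayed,
    // because it depends only on elapsed time, not on how many times we ran.
    var elapsedSeconds = Math.floor((new Date() - start) / 1000);
    console.log('seconds elapsed: ' + elapsedSeconds);
}, 1000);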

Blixt