This article on Microsoft's TechNet site supplies an exe that reports your Windows machine's minimum time resolution; this should be the smallest "tick" available to any application on that machine:

http://technet.microsoft.com/en-us/sysinternals/bb897568.aspx
The result of running this app on my current box is 15.625 ms. I have also run tests against Internet Explorer and gotten exactly the same resolution from the Date() JavaScript function.
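For reference, my test was along the lines of the sketch below (not my exact code, but the same idea): it spins until Date() reports a new millisecond value and records the size of each jump, so the smallest observed step approximates the effective resolution.

```javascript
// Sketch of a Date() resolution test: busy-wait until the reported time
// changes, collect the step sizes, and return the smallest one observed.
function measureDateResolution(samples) {
  var deltas = [];
  var last = new Date().getTime();
  while (deltas.length < samples) {
    var now = new Date().getTime();
    if (now !== last) {          // the clock ticked over
      deltas.push(now - last);   // size of that tick, in ms
      last = now;
    }
  }
  return Math.min.apply(null, deltas); // smallest step ~ resolution in ms
}

alert(measureDateResolution(50) + " ms");
```

Pasting this into IE reports steps of roughly 15-16 ms for me, while Chrome reports 1 ms steps.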
What is confusing is that the SAME test I ran against IE gives back much finer resolution in Google's Chrome browser (1 ms) and in a Flash movie running inside IE (1 ms). Can anyone explain how any application can get a clock resolution better than the machine's clock? If so, is there some way I can get a consistently better resolution in browsers other than Chrome (without using Flash)?
The first answer below leads to two follow-up questions:
- How does a multimedia timer get times between system clock ticks? I imagine the system clock as an analog watch with a ticking hand, each tick being 15 ms. How are times between ticks measured?
- Are multimedia timers available to browsers, especially Internet Explorer? Can I access one with C# or JavaScript without having to push software to a user's browser/machine?