Let's say I play a stereo WAV file with 317,520,000 samples (44.1 kHz, counting both channels), which is theoretically 1 hour long. Assuming no interruptions of the playback, will the file finish playing in exactly one hour, or is there some occasional tiny variation in the playback speed such that it would be slightly more or slightly less (by some number of milliseconds) than one hour?
I am trying to synchronize animation with audio, and I am using a System.Diagnostics.Stopwatch to keep the frames matching the audio. But if the playback speed of WAV audio in Windows can vary slightly over time, then the audio will drift out of sync with the Stopwatch-driven animation.
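Here is roughly what I mean by Stopwatch-driven animation. This is only a sketch: the 30 fps frame rate and the DrawFrame call are placeholders, not my actual rendering code.

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class StopwatchAnimationLoop
{
    const double FramesPerSecond = 30.0;   // placeholder frame rate

    static void Main()
    {
        var clock = Stopwatch.StartNew();
        long lastFrame = -1;

        while (true)
        {
            // Derive the current frame index from elapsed time on the Stopwatch,
            // so animation speed does not depend on how fast this loop spins.
            long frame = (long)(clock.Elapsed.TotalSeconds * FramesPerSecond);
            if (frame != lastFrame)
            {
                DrawFrame(frame);          // stand-in for the real rendering call
                lastFrame = frame;
            }
            Thread.Sleep(1);               // yield; real code would use a proper render loop
        }
    }

    static void DrawFrame(long frameIndex)
    {
        Console.WriteLine($"frame {frameIndex}");
    }
}
```

The point is that the frame shown at any instant is a pure function of clock.Elapsed, so any error in the Stopwatch translates directly into the animation running ahead of (or behind) the audio.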
Which leads to a second question: it appears that a Stopwatch - while highly granular and accurate for short durations - runs slightly fast. On my laptop, a Stopwatch run for exactly 24 hours (as measured by the computer's system time and a real stopwatch) shows an elapsed time of 24 hours plus about 5 seconds (not milliseconds).
Is this a known problem with Stopwatch? (A related question would be "am I crazy?", but you can try it for yourself.) Given its usage as a diagnostics tool, I can see where a discrepancy like this would only show up when measuring long durations, for which most people would use something other than a Stopwatch.
If I'm really lucky, then both Stopwatch and audio playback are driven by the same underlying mechanism, and thus will stay in sync with each other for days on end. Any chance this is true?
Update: I just did the math, and if Stopwatch drifts by 5 seconds over 24 hours (roughly 58 parts per million), it will drift by 10 milliseconds after only about 173 seconds. So within 3 minutes the animation will start being perceptibly out of sync.
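The same arithmetic as a sanity check (the 10 ms threshold is just my guess for when the mismatch becomes noticeable):

```csharp
using System;

class DriftMath
{
    static void Main()
    {
        double driftSeconds = 5.0;           // observed Stopwatch error...
        double overSeconds  = 24 * 3600.0;   // ...accumulated over 24 hours
        double thresholdSec = 0.010;         // 10 ms, my guess for "perceptible"

        double driftRate = driftSeconds / overSeconds;         // ~5.79e-5, i.e. ~58 ppm
        double secondsToThreshold = thresholdSec / driftRate;  // ~172.8 seconds

        Console.WriteLine($"drift rate: {driftRate * 1e6:F1} ppm");
        Console.WriteLine($"time to reach {thresholdSec * 1000} ms of drift: {secondsToThreshold:F1} s");
    }
}
```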
I'm experimenting with periodically (every 10 seconds or so) restarting the timer from the waveOutWrite callback (roughly as sketched below), but this isn't working: the whole next set of timer events ends up offset by whatever the timing error of that particular callback happened to be. Sucks to be me.
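For reference, this is roughly the shape of that attempt. It's only a sketch: the waveOutOpen / waveOutPrepareHeader / waveOutWrite plumbing is omitted, the callback signature and WOM_DONE value come from mmsystem.h, and the 10-second buffer size is just my current choice.

```csharp
using System;
using System.Diagnostics;

class WaveOutResync
{
    // Signature of the waveOutOpen callback when the device is opened with CALLBACK_FUNCTION.
    delegate void WaveOutProc(IntPtr hWaveOut, uint msg, IntPtr instance, IntPtr param1, IntPtr param2);

    const uint WOM_DONE = 0x3BD;            // a queued buffer has finished playing
    const int BufferMilliseconds = 10_000;  // I queue roughly 10 s of audio per buffer

    static readonly Stopwatch AnimationClock = new Stopwatch();
    static long buffersCompleted;

    // Kept in a field so the GC does not collect the delegate while winmm can still call it.
    static readonly WaveOutProc Callback = OnWaveOutDone;

    static void OnWaveOutDone(IntPtr hWaveOut, uint msg, IntPtr instance, IntPtr param1, IntPtr param2)
    {
        if (msg != WOM_DONE)
            return;

        buffersCompleted++;

        // Re-anchor the animation clock to the audio: the device has just finished
        // another 10 s buffer, so restart the Stopwatch from that known position.
        // The problem: the callback itself arrives with some latency/jitter, and every
        // frame until the next callback inherits that error.
        AnimationClock.Restart();
    }

    // The animation then treats "audio position" as completed buffers plus Stopwatch time.
    static TimeSpan CurrentAudioPosition() =>
        TimeSpan.FromMilliseconds(buffersCompleted * BufferMilliseconds) + AnimationClock.Elapsed;
}
```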