views:

123

answers:

2

Let's say I play a stereo WAV file with 317,520,000 samples, which is nominally one hour long (at 44.1 kHz stereo: 2 channels × 44,100 samples/second × 3,600 seconds = 317,520,000 samples). Assuming no interruptions in playback, will the file finish playing in exactly one hour, or is there some occasional tiny variation in playback speed, so that it takes slightly more or slightly less than one hour (by some number of milliseconds)?

I am trying to synchronize animation with audio, and I am using a System.Diagnostics.Stopwatch to keep the frames matching the audio. But if the playback speed of WAV audio in Windows can vary slightly over time, then the audio will drift out of sync with the Stopwatch-driven animation.
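
For reference, the frame timing is basically this (a simplified sketch, not my actual code; the frame rate and `RenderFrame` are placeholders):

```csharp
// Simplified sketch of the Stopwatch-driven animation loop.
// FramesPerSecond and RenderFrame are stand-ins for the real animation code.
using System;
using System.Diagnostics;
using System.Threading;

class StopwatchAnimation
{
    const double FramesPerSecond = 30.0;

    static void RenderFrame(long frameIndex)
    {
        // placeholder for the actual drawing code
        Console.WriteLine("frame " + frameIndex);
    }

    static void Main()
    {
        Stopwatch sw = Stopwatch.StartNew();
        long lastFrame = -1;

        while (true)
        {
            // Which frame should be showing right now, according to the Stopwatch?
            long frame = (long)(sw.Elapsed.TotalSeconds * FramesPerSecond);
            if (frame != lastFrame)
            {
                RenderFrame(frame);
                lastFrame = frame;
            }
            Thread.Sleep(1); // avoid spinning at 100% CPU
        }
    }
}
```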

Which leads to a second question: it appears that a Stopwatch - while highly granular and accurate for short durations - runs slightly fast. On my laptop, a Stopwatch run for exactly 24 hours (as measured by the computer's system time and a real stopwatch) shows an elapsed time of 24 hours plus about 5 seconds (not milliseconds).

Is this a known problem with Stopwatch? (A related question would be "am I crazy?", but you can try it for yourself.) Given its usage as a diagnostics tool, I can see where a discrepancy like this would only show up when measuring long durations, for which most people would use something other than a Stopwatch.
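
If you want to try it yourself, a simple comparison loop like this sketch will show the divergence over a long run (keep in mind the system clock may itself be nudged by time synchronization, so this measures relative drift rather than absolute error):

```csharp
// Rough sketch for measuring Stopwatch drift against the system clock.
// Let it run for several hours; the two elapsed values diverge slowly.
using System;
using System.Diagnostics;
using System.Threading;

class DriftTest
{
    static void Main()
    {
        DateTime wallStart = DateTime.UtcNow;
        Stopwatch sw = Stopwatch.StartNew();

        while (true)
        {
            Thread.Sleep(60000); // report once a minute
            TimeSpan wall = DateTime.UtcNow - wallStart;
            TimeSpan stop = sw.Elapsed;
            Console.WriteLine("wall: {0}  stopwatch: {1}  drift: {2} ms",
                wall, stop, (stop - wall).TotalMilliseconds);
        }
    }
}
```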

If I'm really lucky, then both Stopwatch and audio playback are driven by the same underlying mechanism, and thus will stay in sync with each other for days on end. Any chance this is true?

Update: I just did the math, and if Stopwatch drifts by 5 seconds over 24 hours, it drifts by 10 milliseconds after just 172 seconds. So within about 3 minutes the animation will start to be perceptibly out of sync.

I'm experimenting with periodically (every 10 seconds or so) re-starting the timer from the waveOutWrite callback, but this isn't working because then the whole next set of timer events is offset by whatever the inaccuracy of the callback happened to be. Sucks to be me.

+2  A: 

No clock will measure time "exactly", as all physical devices are bound to have some variations and measurement errors. This means that ALL clocks will be slightly too fast or too slow (though the amount of error may differ wildly, depending on the clock).

In your case, the audio output is driven by the clock on the soundcard which drives the DAC. I don't know the .NET platform, but I assume that Stopwatch is some kind of system timer, which means it is driven by a DIFFERENT clock (the one on your motherboard, presumably).

Now in general, two different physical clocks will NEVER run at exactly the same speed - for the reasons outlined above. That is where the discrepancy you observed comes from. The same thing will happen to your animation - you can absolutely never assume the system clock and the soundcard's DAC clock run at the same rate - they will differ!

This means that if you want to keep two streams (video and audio) synchronized, they must be driven by the same clock. As you cannot change the clock that drives your soundcard, it's a good bet to sync everything to the soundcard.
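
In sketch form, "sync to the soundcard" means deriving the frame number from the audio position instead of from a separate timer. This is only an illustration; GetAudioPositionSeconds() is a placeholder for whatever position query your audio API exposes:

```csharp
// Sketch only: drive the animation from the audio renderer's position instead of
// an independent timer. GetAudioPositionSeconds() is a placeholder for whatever
// position query the audio API provides (waveOutGetPosition on Windows).
using System;

class AudioClockAnimation
{
    const double FramesPerSecond = 30.0;
    long lastFrame = -1;

    // Placeholder: in a real program this would query the sound card,
    // e.g. via waveOutGetPosition, and convert samples to seconds.
    double GetAudioPositionSeconds()
    {
        throw new NotImplementedException("query the audio API here");
    }

    void RenderFrame(long frameIndex) { /* draw the frame */ }

    public void OnRenderTick()
    {
        double audioSeconds = GetAudioPositionSeconds();      // the master clock
        long frame = (long)(audioSeconds * FramesPerSecond);  // frame matching the audio
        if (frame != lastFrame)
        {
            RenderFrame(frame);
            lastFrame = frame;
        }
    }
}
```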

slacker
I think you're absolutely right that audio speed would be driven by the chip on the soundcard, and there's no way Stopwatch is driven by the same chip. The problem I ran into with syncing to the soundcard is that the API I'm using (waveOutOpen and waveOutWrite) communicates back to the calling app via messages, so while these events generally keep in sync with the audio, they're so erratically timed that the animation is noticeably jerky (I had to switch to using Stopwatch to make the animation smooth).
MusiGenesis
Basically, Stopwatch is precise enough for smooth animation, but eventually gets out of sync with the music. The waveOut callbacks stay in sync forever, but aren't smooth enough for animation. I need to come up with some way of combining the two, like using the callbacks to periodically adjust the Stopwatch.
MusiGenesis
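
One possible shape for that blend (just a sketch, not tested code): keep the Stopwatch for smooth per-frame time, but every few seconds compute the offset between the Stopwatch and whatever position the audio side reports, and ease the correction in so the animation never visibly jumps.

```csharp
// Sketch of one way to blend the two clocks: the Stopwatch provides smooth
// per-frame time, and the audio position (queried occasionally) provides the
// long-term reference. The offset is eased in to avoid visible jumps.
using System;
using System.Diagnostics;

class HybridClock
{
    readonly Stopwatch sw = Stopwatch.StartNew();
    double offsetSeconds = 0.0;           // correction currently applied
    double targetOffsetSeconds = 0.0;     // correction we want to reach

    // Call this every frame; returns the time to animate against.
    public double NowSeconds()
    {
        // Apply 10% of the remaining correction per call so it's never a jump.
        offsetSeconds += (targetOffsetSeconds - offsetSeconds) * 0.1;
        return sw.Elapsed.TotalSeconds + offsetSeconds;
    }

    // Call this every few seconds with the position reported by the audio API
    // (e.g. derived from waveOutGetPosition). audioSeconds is a placeholder name.
    public void Resync(double audioSeconds)
    {
        targetOffsetSeconds = audioSeconds - sw.Elapsed.TotalSeconds;
    }
}
```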
FWIW, in Windows the video clock is driven by the audio clock. If you're rendering audio, you get the audio clock by calling waveOutGetPosition (since audio is isochronous, the position is directly correlated with time). All the other audio rendering APIs have a similar "GetPosition" API that can be used to determine the audio rendering position.
Larry Osterman
@MusiGenesis: Win32 messages in general are not meant for timing-sensitive communication. Try requesting callbacks instead of messages. Also, note that with such high sync requirements, you would probably be better off with an API like DirectSound.
slacker
@Larry: thank you for this info. I feel like a dunce for not knowing about a method designed for exactly this problem. I only recently started doing animation complex enough for me to notice the slightly erratic timing of the waveOutWrite callbacks.
MusiGenesis
@slacker: I think I misspoke when I talked about messages. I am using callbacks, but still getting the slightly erratic timing. Also, I'm targeting Mac and iPhone as well as Windows (via Mono), so no DirectSound.
MusiGenesis
@Larry: I don't know if you'll read this or not, but I'm using waveOutGetPosition in place of the Stopwatch, and it works perfectly for 5-10 seconds and then locks up - eventually a call to waveOutGetPosition never returns. I've found a few links from people having the same problem (in Vista, like me) and they suggest it's from cross-threading calls causing a deadlock, but no solutions other than that. Are you familiar with this problem?
MusiGenesis
@MusiGenesis: You're probably doing something in a wave callback function - the set of operations you can do there is relatively limited because of the potential for deadlocks within winmm.dll. The page on waveOutProc (http://msdn.microsoft.com/en-us/library/dd743869(VS.85).aspx) describes the limitations.
Larry Osterman
@Larry: the only thing I'm doing in the waveOutProc is invoking another thread that loads up the next buffer and queues it with waveOutWrite. That part has been working perfectly for a long time. It's only when I try to call waveOutGetPosition that my app sometimes locks up.
MusiGenesis
@MusiGenesis: Isn't waveOut also Windows-only? Also, if you are already using a separate thread to feed the buffers, then why the heck are you using those callbacks instead of asking waveOut to notify your feeder thread directly? AFAIR, waveOut can be told to send notification messages to a specified thread, or to signal an event object.
slacker
@slacker: waveOut is Windows-only, and although it's a dangerous and finicky API, at its core you just open up an audio device and start sending it buffers (of samples) and it plays them seamlessly and notifies you when each buffer is completed, letting you re-load and re-queue them for continuous playback. It's a simple matter to abstract the waveOut details away from the rest of my program; Mac and Linux have similar APIs for this sort of thing so I'll only need a small amount of platform-specific PInvoke code. I couldn't do this with DirectX.
MusiGenesis
@MusiGenesis: Check the thread on which you're calling waveOutGetPosition - break into the debugger and look at the call stacks. I suspect you'll find that the thread's processing some other message OR there's another thread blocked on something. You're right that the entire MME family of APIs is finicky - it is unfortunately prone to deadlocks if you use callback functions. Have you tried shifting to using window messages instead?
Larry Osterman
@Larry: thanks for your reply, as always. In the first version of this project, I wrote the app to run both on regular Windows and on Windows Mobile, and the Windows Mobile version used window messages. At the time, I assumed that the callback events were regular enough to drive high-quality animation, so I'm kind of scrambling to replace that part. `waveOutGetPosition` is ideal for this, except for its habit of locking up my app after 5 seconds. I'm working on a simplified app that will hopefully replicate the problem. I'll tag it "larry-osterman" so you see it.
MusiGenesis
I repeat - waveOut does NOT have to send those messages to a window - it can send them to the message queue of whichever thread you want it to. You could have a thread that does nothing but fill those buffers and wait for completion messages. That would also avoid the supposed deadlock in the callback function.
slacker
@slacker: I managed to boil it down into a sample app that reproduces the problem: http://stackoverflow.com/questions/2452345/problem-with-waveoutwrite-and-waveoutgetposition-deadlock There's a link there to the source code.
MusiGenesis
A: 

Larry Osterman provided the answer in a comment:

FWIW, in Windows the video clock is driven by the audio clock. If you're rendering audio, you get the audio clock by calling waveOutGetPosition (since audio is isochronous, the position is directly correlated with time). All the other audio rendering APIs have a similar "GetPosition" API that can be used to determine the audio rendering position.
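
For reference, here is a minimal P/Invoke sketch of that call. The flattened MMTIME layout, the error handling, and the sample-to-seconds conversion are my own simplifications, not from Larry's comment:

```csharp
// Minimal P/Invoke sketch for reading the audio render position via waveOutGetPosition.
// hWaveOut is the handle returned by waveOutOpen. The MMTIME union is flattened to
// its first DWORD plus padding; the driver may report a different unit than the one
// requested, so wType must be checked after the call.
using System;
using System.Runtime.InteropServices;

static class WaveOutClock
{
    const uint TIME_MS = 0x0001;
    const uint TIME_SAMPLES = 0x0002;
    const uint TIME_BYTES = 0x0004;

    [StructLayout(LayoutKind.Sequential)]
    struct MMTIME
    {
        public uint wType;   // which member of the union is valid
        public uint u;       // ms, sample count, or byte count
        public uint pad;     // the union is 8 bytes wide (SMPTE variant)
    }

    [DllImport("winmm.dll")]
    static extern int waveOutGetPosition(IntPtr hWaveOut, ref MMTIME mmt, int cbmmt);

    // Returns the playback position in seconds, or -1 on failure.
    public static double GetPositionSeconds(IntPtr hWaveOut, int sampleRate)
    {
        MMTIME mmt = new MMTIME { wType = TIME_SAMPLES };
        int result = waveOutGetPosition(hWaveOut, ref mmt, Marshal.SizeOf(typeof(MMTIME)));
        if (result != 0)           // MMSYSERR_NOERROR == 0
            return -1;

        switch (mmt.wType)
        {
            case TIME_SAMPLES: return (double)mmt.u / sampleRate;
            case TIME_MS:      return mmt.u / 1000.0;
            default:           return -1;  // e.g. TIME_BYTES; handle if needed
        }
    }
}
```

Called once per animation frame (with the handle from waveOutOpen and the sample rate of the format being played), this gives a time value that is, by definition, locked to the audio.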

MusiGenesis