I have written a 3D-stereo OpenGL program in C++. I keep track of the positions that objects in my display should have using timeGetTime (after calling timeBeginPeriod(1)). When I run the program with "Start Debugging" my objects move smoothly across the display (as they should). When I run the program with "Start without debugging" the objects occasionally freeze for several screen refreshes and then jump to a new position. Any ideas as to what may be causing this problem and how to fix it?
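Roughly, the timing code looks like this (a simplified sketch of the approach, not the exact code; the function names are just illustrative):

    #include <windows.h>
    #include <mmsystem.h>   // timeGetTime / timeBeginPeriod; link with winmm.lib

    // Called once at startup: request 1 ms timer resolution.
    void initTimer()
    {
        timeBeginPeriod(1);   // paired with timeEndPeriod(1) at shutdown
    }

    // Called every frame: compute elapsed time and advance the object positions.
    void updatePositions()
    {
        static DWORD lastTime = timeGetTime();
        DWORD now = timeGetTime();
        float dtSeconds = (now - lastTime) / 1000.0f;
        lastTime = now;

        // position += velocity * dtSeconds;   (per object)
    }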

Edit: It seems the jerkiness resolves itself, after a short delay, when I run with "Start without debugging" and click the mouse button. My application is a console application (I take in some parameters when the program first starts). Might there be a difference in window focus between these two options? Is there an explicit way to force focus to the OpenGL window (which is full screen via glutFullScreen()) once I'm done taking input from the console window?
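For reference, the startup sequence is essentially this (a simplified sketch with illustrative names; the commented-out FindWindow/SetForegroundWindow lines are just a guess at the kind of call I'm asking about, not something I've verified):

    #include <windows.h>
    #include <GL/glut.h>
    #include <iostream>
    #include <string>

    int main(int argc, char** argv)
    {
        // Parameters are read from the console window first.
        std::string param;
        std::cout << "Enter parameter: ";
        std::cin >> param;

        // Then the OpenGL window is created and switched to full screen.
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_STEREO);
        glutCreateWindow("Stereo display");   // illustrative title
        glutFullScreen();

        // Is something along these lines the right way to pull focus away from
        // the console? (FindWindow/SetForegroundWindow is a guess, not verified.)
        // HWND hwnd = FindWindowA(NULL, "Stereo display");
        // if (hwnd) SetForegroundWindow(hwnd);

        // glutDisplayFunc(...), glutIdleFunc(...), etc. registered here.
        glutMainLoop();
        return 0;
    }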

Thanks.

A: 

The most common cause of a program behaving differently under the debugger than outside it is the use of uninitialized variables, and especially reading uninitialized memory. Check that you're not doing that.

Something more OpenGL-specific: you might have a problem with command flushing. Try inserting glFinish() after drawing each frame.
It would also help to verify that frames are actually being rendered during the freeze, rather than the whole application being frozen. If frames are still being rendered, the bug is more likely in your logic, since OpenGL seems to be doing its job.
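For example, something along these lines at the end of the display callback: buffer a timestamp per frame in memory and only write them out after the run, so file I/O doesn't distort the timing (the names here are illustrative):

    #include <windows.h>
    #include <mmsystem.h>    // timeGetTime; link with winmm.lib
    #include <GL/glut.h>
    #include <vector>
    #include <cstdio>

    static std::vector<DWORD> g_frameTimes;   // one entry per rendered frame

    void display()
    {
        // ... draw the scene ...

        glutSwapBuffers();
        glFinish();   // block until the GPU has actually finished this frame

        g_frameTimes.push_back(timeGetTime());   // cheap in-memory log, no per-frame file I/O
    }

    // Called once at shutdown: write out the frame-to-frame deltas so any
    // multi-refresh gaps (freezes) stand out.
    void dumpFrameTimes()
    {
        std::FILE* f = std::fopen("frame_times.txt", "w");
        if (!f) return;
        for (size_t i = 1; i < g_frameTimes.size(); ++i)
            std::fprintf(f, "%lu\n",
                         static_cast<unsigned long>(g_frameTimes[i] - g_frameTimes[i - 1]));
        std::fclose(f);
    }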

shoosh
Thank you for the advice. Any suggestion on how to determine whether frames are still being rendered during the apparent freeze? I tried to check that before, but the slowdown from writing a file obscured whatever was going on. Since this is a stereo display at 120 Hz, I have about 8.33 ms to get through my entire display loop.
drknexus
Usually a debug build will initialize memory in exactly the same way whether or not it is run through a debugger. On Windows you would still be using the debug runtime libraries. It can make a big difference when moving to a release build.
Charles Bailey
+1  A: 

The timeGetTime API only has a precision of something like 10 ms. If the intervals you're measuring are less than 50 ms or so, you may simply be seeing the expected variance in the system timer. I have no idea why the debugger would affect this, but the internal workings of the system are a black box. You could use QueryPerformanceCounter to get higher-resolution timings, which may help.
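A minimal sketch of using the high-resolution counter (the helper name is illustrative):

    #include <windows.h>

    // Seconds elapsed since the first call, using the high-resolution counter.
    double highResSeconds()
    {
        static LARGE_INTEGER freq = {};
        static LARGE_INTEGER start = {};
        if (freq.QuadPart == 0) {
            QueryPerformanceFrequency(&freq);   // ticks per second, fixed at boot
            QueryPerformanceCounter(&start);
        }
        LARGE_INTEGER now;
        QueryPerformanceCounter(&now);
        return static_cast<double>(now.QuadPart - start.QuadPart)
             / static_cast<double>(freq.QuadPart);
    }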

Tim Sylvester
Thanks, I'll take a look at that. I had thought that timeBeginPeriod(1) was accurately setting the resolution down to 1 ms. If the resolution is really around 10 ms, that could definitely cause some problems for my code.
drknexus
@drknexus The documentation for timeGetTime and timeBeginPeriod implies that it's 1 ms, but it seems to be highly dependent on what hardware is available. I suppose it's possible that "modern" hardware has fixed this; it's been a few years since I played with it. This is where I remember the 10 ms value from, but it's also a few years old: http://support.microsoft.com/kb/172338
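You can measure the effective granularity on your own machine with a quick loop like this (just a sketch):

    #include <windows.h>
    #include <mmsystem.h>   // link with winmm.lib
    #include <cstdio>

    int main()
    {
        timeBeginPeriod(1);   // request 1 ms resolution, as in the original code

        // Print how far timeGetTime() jumps each time its value changes.
        DWORD prev = timeGetTime();
        for (int samples = 0; samples < 20; ) {
            DWORD now = timeGetTime();
            if (now != prev) {
                std::printf("step: %lu ms\n", static_cast<unsigned long>(now - prev));
                prev = now;
                ++samples;
            }
        }

        timeEndPeriod(1);
        return 0;
    }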
Tim Sylvester