My game uses a d = vt calculation for movement of objects where t is the time since the last frame (one frame per loop).
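In the update code that works out to something like this (the sprite and velocity names are just placeholders, and GetSeconds() is explained below):

    float dt = timer.GetSeconds();      // seconds since the last frame
    sprite.x += sprite.velocityX * dt;  // d = v * t
    sprite.y += sprite.velocityY * dt;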
I'm using SDL. The gist of the timing is that I create an instance of a Timer class and start it. Whenever the elapsed time is needed I call GetSeconds(), which returns the difference between the time the timer was started and the current time, divided by 1000 because SDL_GetTicks() reports milliseconds.
Ex:
    return (SDL_GetTicks() - m_StartingTicks) / MILLISECONDS_PER_SECOND;
After each loop iteration the timer is reset, i.e.:

    m_StartingTicks = SDL_GetTicks();
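Stripped down, the whole Timer amounts to roughly this (I've simplified the names a bit; MILLISECONDS_PER_SECOND is a float constant of 1000.0f):

    #include <SDL.h>

    class Timer
    {
    public:
        // Record the current tick count as the starting point
        void Start() { m_StartingTicks = SDL_GetTicks(); }

        // Seconds elapsed since Start()/Reset(); SDL_GetTicks() is in milliseconds
        float GetSeconds() const
        {
            return (SDL_GetTicks() - m_StartingTicks) / MILLISECONDS_PER_SECOND;
        }

        // Called at the end of each loop iteration
        void Reset() { m_StartingTicks = SDL_GetTicks(); }

    private:
        static constexpr float MILLISECONDS_PER_SECOND = 1000.0f;
        Uint32 m_StartingTicks = 0;
    };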
However, I recently changed this so that it's only reset if m_StartingTicks < SDL_GetTicks(), but that didn't fix the problem.
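In other words, the reset became roughly:

    // Only reset once at least one tick has actually elapsed
    if (m_StartingTicks < SDL_GetTicks())
        m_StartingTicks = SDL_GetTicks();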
This was all hunky dory until I recently wrote a game engine to handle different game states and various other things used in my main game loop. That seriously improved performance, but unfortunately each iteration of the game loop now takes less than 1 millisecond. SDL_GetTicks() only has millisecond resolution, so the tick difference is 0, GetSeconds() returns 0, and things on the screen don't move.
The easiest way to handle this is a simple kludge: when (SDL_GetTicks() - m_StartingTicks) == 0, I change it to 1 (as in 1 millisecond instead of 0).
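Concretely, that would make GetSeconds() look something like this:

    float Timer::GetSeconds() const
    {
        Uint32 elapsedMs = SDL_GetTicks() - m_StartingTicks;
        if (elapsedMs == 0)
            elapsedMs = 1;  // pretend at least 1 ms passed so movement isn't zero
        return elapsedMs / MILLISECONDS_PER_SECOND;
    }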
I don't really like this though, and I'd like to hear any suggestions, fixes, improvements, etc.
If you need more info I'd be happy to offer it.