Imagine you have a very simple game where it's just a ball moving across the screen. Without time-based updates, the ball moves as fast as your game updates, so its speed varies from machine to machine. What you want to do instead is find out how much time has elapsed since the last update, as a fractional value (I usually measure in seconds, so the numbers match standard physics equations). When updating, instead of something like this:
ballPosition += ballVelocity;
You'd have this:
ballPosition += ballVelocity * timeElapsed;
What this means is that at higher frame rates, timeElapsed will be smaller, which consequently moves the ball less per frame. At lower frame rates, timeElapsed will be greater, and the ball will move more per frame.
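As a concrete sketch (the Ball type and its fields are made-up names for illustration), with velocity expressed in units per second:

struct Ball
{
    float position; // units
    float velocity; // units per second

    void update(float timeElapsed) // timeElapsed is in seconds
    {
        position += velocity * timeElapsed;
    }
};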
In the end, the ball moves the same distance regardless of frame rate. A 60 FPS update rate makes timeElapsed equal 0.01666667f, while a 30 FPS update rate makes it 0.03333333f. At 60 FPS the elapsed time is half that of 30 FPS, but because updates happen twice as often, the ball covers the same distance per second.
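You can check the arithmetic with a quick sketch (the 120-units-per-second velocity is an arbitrary example value):

float velocity = 120.0f;                    // units per second
float at60 = 60 * (velocity * 0.01666667f); // ~120 units covered in one second
float at30 = 30 * (velocity * 0.03333333f); // ~120 units covered in one second
// Both frame rates cover the same distance over a simulated second.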
I usually pass timeElapsed as an argument to any function that is time-dependent. A nice consequence of doing it this way is that you can slow down or speed up your game by multiplying the elapsed time by a scale factor, and you can apply the same trick to individual components. It also plays well if you switch to a frame-limiting model instead, because you're effectively just forcing timeElapsed to be a constant (both variants are sketched after the loop below). Pseudo-code:
while (gameRunning)
{
    // elapsed() returns the number of seconds passed since it was last called.
    const float timeElapsed = timer.elapsed();

    // GlobalTimeScale is 1 for normal time.
    game.update(timeElapsed * GlobalTimeScale);
    game.draw();
}
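Building on that loop, the per-component scaling and the frame-limiting model mentioned above are each a one-line change (PlayerTimeScale and FixedStep are hypothetical names):

// Scale time for a single component, e.g. slow only the player:
player.update(timeElapsed * GlobalTimeScale * PlayerTimeScale);

// Frame-limited model: timeElapsed is forced to a constant:
const float FixedStep = 1.0f / 60.0f;
game.update(FixedStep * GlobalTimeScale);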
To get the time, GetTickCount should work, though its resolution is fairly coarse. You might also take a look at QueryPerformanceCounter for higher precision, though it can have issues with multiple cores.
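For example, a minimal timer built on QueryPerformanceCounter could look like this (the Timer class and its member names are invented here to match the timer.elapsed() call in the loop above):

#include <windows.h>

class Timer
{
public:
    Timer()
    {
        QueryPerformanceFrequency(&frequency); // counts per second, fixed at boot
        QueryPerformanceCounter(&last);
    }

    // Returns the number of seconds since the previous call.
    float elapsed()
    {
        LARGE_INTEGER now;
        QueryPerformanceCounter(&now);
        const float seconds = static_cast<float>(now.QuadPart - last.QuadPart)
                            / static_cast<float>(frequency.QuadPart);
        last = now;
        return seconds;
    }

private:
    LARGE_INTEGER frequency;
    LARGE_INTEGER last;
};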