views: 1546
answers: 5

After working for a while developing games, I've been exposed to both variable frame rates (where you work out how much time has passed since the last tick and update actor movement accordingly) and fixed frame rates (where you work out how much time has passed and choose either to tick a fixed amount of time or sleep until the next window comes).

Which method works best for specific situations? Please consider:

  • Catering to different system specifications;
  • Ease of development/maintenance;
  • Ease of porting;
  • Final performance.
A: 

My experience is fairly limited to somewhat simple games (developed with SDL and C++), but I have found that it is quite easy to implement a static frame rate. Are you working with 2D or 3D games? I would assume that more complex 3D environments would benefit more from a variable frame rate, and that the difficulty there would be greater.
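
For what it's worth, here is a minimal sketch of such a fixed-rate loop in SDL/C++; Tick and Draw are hypothetical placeholders, and the 60 FPS target is just an example:

#include <SDL.h>

void Tick();   // advance the game by one fixed step (hypothetical)
void Draw();   // render the current state (hypothetical)

void RunFixedRateLoop()
{
    const Uint32 FRAME_MS = 1000 / 60;      // ~16 ms per frame at 60 FPS
    bool running = true;                    // event handling / exit condition omitted
    while (running)
    {
        Uint32 start = SDL_GetTicks();
        Tick();
        Draw();

        Uint32 elapsed = SDL_GetTicks() - start;
        if (elapsed < FRAME_MS)
            SDL_Delay(FRAME_MS - elapsed);  // sleep until the next frame window
    }
}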

jwarzech
+1  A: 

It seems that most 3D developers prefer variable FPS: the Quake, Doom and Unreal engines all scale up and down based on system performance.

  • At the very least you have to compensate for frame rates that are too fast (think of 80s games running on 90s hardware: way too fast).
  • Your main loop should be parameterized by the timestep anyhow (see the sketch after this list), and as long as the step isn't too long, a decent integrator like RK4 should handle the physics smoothly. Some types of animation (keyframed sprites) could be a pain to parameterize. Network code will need to be smart as well, to prevent players with faster machines from shooting too many bullets, for example, but this kind of throttling needs to be done for latency compensation anyhow (the animation parameterization would help hide network lag too).
  • The timing code will need to be modified for each platform, but it's a small, localized change (though some systems make extremely accurate timing difficult; Windows, Mac, and Linux seem OK).
  • Variable frame rates allow for maximum performance. Fixed frame rates give consistent performance but will never reach the maximum on all systems (which seems to be a show-stopper for any serious game).
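
A minimal sketch of a main loop parameterized by the timestep (UpdateWorld and Render are hypothetical, and SDL_GetTicks is just one possible timer):

#include <SDL.h>

void UpdateWorld(float dt);   // hypothetical: everything that moves is scaled by dt
void Render();

void RunVariableTimestepLoop()
{
    Uint32 previous = SDL_GetTicks();
    bool running = true;                        // exit condition omitted for brevity
    while (running)
    {
        Uint32 now = SDL_GetTicks();
        float dt = (now - previous) / 1000.f;   // seconds since the last frame
        previous = now;

        UpdateWorld(dt);    // positions += velocity * dt, animations advance by dt, etc.
        Render();
    }
}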

If you are writing a networked 3D game where performance matters I'd have to say, bite the bullet and implement variable frame rates.

If it's a 2D puzzle game you can probably get away with a fixed frame rate, maybe slightly parameterized for super slow computers and next year's models.

Jacob Rigby
+1  A: 

One option that I, as a user, would like to see more often is dynamically changing the level of detail (in the broad sense, not just the technical sense) when frame rates vary outside of a certain envelope. If you are rendering at 5 FPS, turn off bump-mapping. If you are rendering at 90 FPS, increase the bells and whistles a bit and give the user some prettier images to spend their CPU and GPU on.
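
A rough sketch of that kind of controller, assuming a smoothed FPS measurement and a made-up detail scale (the thresholds and names here are purely illustrative):

// Nudge quality up or down when the smoothed frame rate leaves a target envelope.
struct DetailController
{
    float smoothedFps = 60.f;
    int   detailLevel = 2;      // 0 = lowest, 4 = highest (made-up scale)

    void Update(float frameSeconds)
    {
        if (frameSeconds <= 0.f)
            return;
        float fps = 1.f / frameSeconds;
        smoothedFps = 0.95f * smoothedFps + 0.05f * fps;   // simple low-pass filter

        if (smoothedFps < 20.f && detailLevel > 0)
            --detailLevel;      // e.g. turn off bump-mapping
        else if (smoothedFps > 80.f && detailLevel < 4)
            ++detailLevel;      // add some bells and whistles back
    }
};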

If done right, the user should get the best experience out of the game without having to go into the settings screen and tweak things themselves, and you, as a level designer, should have to worry less about keeping the polygon count the same across different scenes.

Of course, I say this as a user of games, and not a serious one at that -- I've never attempted to write a nontrivial game.

theorbtwo
+3  A: 

I lean towards a variable framerate model, but internally some systems are ticked on a fixed timestep. This is quite easy to do by using a time accumulator. Physics is one system which is best run on a fixed timestep, and ticked multiple times per frame if necessary to avoid a loss in stability and keep the simulation smooth.

A bit of code to demonstrate the use of an accumulator:

const float STEP = 1.f / 60.f;   // fixed simulation step in seconds (delta assumed to be in seconds)
float accumulator = 0.f;

void Update(float delta)
{
    accumulator += delta;        // bank the variable frame time

    while (accumulator >= STEP)  // consume it in fixed-size chunks
    {
        Simulate(STEP);          // physics always sees the same timestep
        accumulator -= STEP;
    }
}

This is not perfect by any means, but it presents the basic idea; there are many ways to improve on this model. Obviously there are issues to sort out when the incoming frame rate is obscenely slow. The big advantage, however, is that no matter how fast or slow the delta is, the simulation moves at a smooth rate in "player time", which is where the user will perceive any problems.
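
One common refinement, sketched here under the same assumptions as above (delta in seconds, Simulate as before), is to clamp a huge delta so a long stall can't trigger a "spiral of death", and to keep the leftover fraction for interpolating the rendering:

void Simulate(float step);              // fixed-step update, as above

const float STEP = 1.f / 60.f;
const float MAX_DELTA = 0.25f;          // cap huge deltas (breakpoints, long stalls)
float accumulator = 0.f;

float Update(float delta)
{
    if (delta > MAX_DELTA)
        delta = MAX_DELTA;              // avoid an ever-growing backlog of steps

    accumulator += delta;
    while (accumulator >= STEP)
    {
        Simulate(STEP);
        accumulator -= STEP;
    }

    // Leftover fraction in [0, 1): can be used to blend between the previous
    // and current simulation states when rendering, for smoother motion.
    return accumulator / STEP;
}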

Generally I don't get into the graphics and audio side of things, but I don't think they are affected as much as physics, input, and network code.

Paul
+1  A: 

The main problem I've encountered with variable-length frame times is floating-point precision; variable frame times can bite you in surprising ways.

If, for example, you're adding frame time * velocity to a position, and the frame time gets very small while the position is largish, your objects can slow down or stop moving because the entire delta is lost to precision. You can compensate for this with a separate error accumulator, but it's a pain.
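
A small sketch of that kind of compensation (essentially Kahan summation applied to the position update; the names here are made up):

// Keep a running correction so tiny per-frame increments aren't lost
// when they are added to a much larger position value.
struct CompensatedFloat
{
    float value = 0.f;
    float error = 0.f;                  // low-order bits lost by previous additions

    void Add(float increment)
    {
        float y = increment - error;    // re-inject what was rounded away last time
        float t = value + y;
        error = (t - value) - y;        // what got rounded away this time
        value = t;
    }
};

// Usage (hypothetical): position.x.Add(velocity.x * dt); instead of position.x += velocity.x * dt;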

Having fixed frame times (or at least a lower bound on frame length) lets you control how much floating-point error you need to account for.

Don Neufeld