views: 181
answers: 9
Hello,

I'm designing a game for the first time, and I wonder what game time should be based on. Is it based on the clock, or does it rely on frames? (Note: I'm not sure if 'game time' is the right term here; correct me if it isn't.)

To be clearer, imagine these two scenarios:

  1. Computer 1 is fast and can reach 60 fps
  2. Computer 2 is slow and manages no more than 30 fps

On both computers the same game is played, in which a character walks at the same speed.

If game time is based on frames, the character would move twice as fast on computer 1. On the other hand, if game time were based on actual time, computer 1 would show twice as many frames, but the character would move just as fast as on computer 2.

My question is: what is the best way to deal with game time, and what are the advantages and disadvantages of each approach?

+3  A: 

It should rely on the system clock, not on the number of frames. You've made your own case for this.

Will
A: 

For users on either machine to have the same experience, you'll want to use actual time; otherwise different users will have advantages or disadvantages depending on their hardware.

John Boker
+4  A: 

Really old games used a frame count. It quickly became obvious that this was a poor idea, since machines keep getting newer and faster, and thus the games run faster too.

Thus, base it on the system clock. Generally this is done by measuring how long the last frame took, and using that number to decide how much 'real time' to advance during this frame.
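A minimal sketch of that idea in C++ (not from the answer; the player position and speed values are made up): measure how long the previous frame took and scale movement by that duration.

```
#include <chrono>

int main() {
    using clock = std::chrono::steady_clock;

    double playerX = 0.0;       // hypothetical game state
    const double speed = 5.0;   // made-up units per second

    auto previous = clock::now();

    while (playerX < 100.0) {   // stand-in for "while the game is running"
        auto now = clock::now();
        // how long the last frame took, in seconds
        double dt = std::chrono::duration<double>(now - previous).count();
        previous = now;

        playerX += speed * dt;  // same distance per real second on any machine

        // render(playerX);     // drawing would happen here, once per iteration
    }
}
```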

zebediah49
+1: Heh. Anyone remember *Wing Commander*? One of the worst series for that problem: updated hardware made the games unplayable. Now games track time based on system time, and the graphics system queries the state of the game objects and draws a picture.
Satanicpuppy
Really old games used *timer ticks*, not a frame count. A faster clock (8 MHz vs 4.77 MHz) meant there were more ticks in one second, and since something happened, say, every 10 ticks, the game would run faster and draw to the screen more often. Increased frame rates were a *consequence* of the faster ticks, but were not themselves the basis for movement.
Stephen P
+1  A: 

The FPS is simply how many frames the computer can render per second.

Game time is YOUR game time: you define it, usually inside what is often called the "game loop". Frame rendering is just one part of that loop. Also look into FSMs (finite state machines) as they relate to game programming.
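As a rough illustration of the kind of finite state machine this alludes to (the state names and transitions here are hypothetical, not from any particular book or engine):

```
#include <iostream>

enum class GameState { Menu, Playing, Paused, Quit };

int main() {
    GameState state = GameState::Menu;

    while (state != GameState::Quit) {
        switch (state) {
            case GameState::Menu:
                std::cout << "In menu, starting game\n";
                state = GameState::Playing;
                break;
            case GameState::Playing:
                std::cout << "Updating simulation and rendering a frame\n";
                state = GameState::Paused;   // pretend the player hit pause
                break;
            case GameState::Paused:
                std::cout << "Paused, quitting\n";
                state = GameState::Quit;
                break;
            default:
                state = GameState::Quit;
        }
    }
}
```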

I highly suggest you read a couple of books on game programming. The question you are asking is what those books explain in their first chapters.

Pierre 303
A: 

Games should all use the clock, not frames, to provide the same gameplay whatever the platform. This is obvious when you look at MMOs or online shooters: no player should be faster than the others.

Fififox
A: 

It depends on what you're processing and which part of the game is in question.

For example, animations, physics, and AI need to be framerate-independent to function properly. If you have an FPS-dependent animation or physics thread, the physics will slow down and characters will move more slowly on slower systems, and everything will go incredibly fast on very fast systems. Not good.

Some other elements, like scripting and rendering, obviously need to run per frame and are therefore framerate-dependent: you want to process each script and render each object once per frame, regardless of the time difference between frames.
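A small sketch of that split, assuming a made-up `Character` type: the update function scales movement by elapsed time, while rendering is simply called once per frame.

```
#include <chrono>

struct Character {
    double x = 0.0;                 // hypothetical position along one axis
    double speedPerSecond = 3.0;    // made-up walking speed
};

// Framerate-independent: distance covered depends on elapsed time, not frames.
void updatePhysics(Character& c, double dtSeconds) {
    c.x += c.speedPerSecond * dtSeconds;
}

// Framerate-dependent by design: executed exactly once per rendered frame.
void renderFrame(const Character& /*c*/) {
    // draw the character at its current position (omitted)
}

int main() {
    using clock = std::chrono::steady_clock;
    Character hero;
    auto previous = clock::now();

    for (int frame = 0; frame < 1000; ++frame) {   // stand-in for the real loop
        auto now = clock::now();
        double dt = std::chrono::duration<double>(now - previous).count();
        previous = now;

        updatePhysics(hero, dt);   // same movement per real second at 30 or 60 fps
        renderFrame(hero);         // runs once per frame regardless of dt
    }
}
```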

peachykeen
A: 

The game must rely on the system clock, since you don't want your game to be over in no time on a decent computer!

algorian
+4  A: 

In general, commercial games have two things running - a "simulation" loop and a "rendering" loop. These need to be decoupled as much as possible.

You want to fix your simulation time-step to some value (corresponding to a simulation rate at least as high as your maximum framerate). Complex physics doesn't like variable time steps. I'm surprised no one has mentioned this, but fixed versus variable time steps are a big deal if you have any kind of interesting physics. Here's a good link: http://gafferongames.com/game-physics/fix-your-timestep/

Then your rendering loop can run as fast as possible, and render the output of the current simulation step.

So, referring to your example:

You would run your simulation at 60 fps, i.e. a 16.67 ms time step. Computer 1 would render at 60 fps, meaning it would render every simulation frame. Computer 2 would render every second simulation frame. Thus the character would move the same distance in the same time, just not as smoothly.
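Here is a hedged sketch of the fixed-timestep/accumulator pattern described above and in the linked article; the 1/60 s step and the helper names are illustrative only.

```
#include <chrono>

void simulate(double /*dtSeconds*/) { /* advance the simulation by one fixed step */ }
void render()                       { /* draw the current simulation state */ }

int main() {
    using clock = std::chrono::steady_clock;
    const double fixedStep = 1.0 / 60.0;   // 16.67 ms, as in the example above

    double accumulator = 0.0;
    auto previous = clock::now();

    for (int frame = 0; frame < 1000; ++frame) {   // stand-in for "run until quit"
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // A fast machine adds ~16 ms per rendered frame and usually runs one step here;
        // a slow machine adds ~33 ms and catches up by running two steps.
        while (accumulator >= fixedStep) {
            simulate(fixedStep);
            accumulator -= fixedStep;
        }

        render();   // once per loop iteration, i.e. as fast as the machine allows
    }
}
```

With this structure the simulation advances by identical 16.67 ms steps on both machines; only the number of rendered frames per simulated second differs.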

Justicle
That's interesting; I've never had trouble with variable-timestep physics before. Then again, I suppose I've either done relatively simple stuff or locked the step size (for scientific stuff). I'd think that using finer timesteps on a faster machine would just make things even smoother and better. Yes, certain physics wouldn't work well at too low a rate, but fixing it at a higher rate and having the computer fail to keep up doesn't seem much better.
zebediah49
I guess "multiplayer" might be the definition of "interesting". Probably overkill for this particular case, but its worth noting from the point of view that timesteps are a tricky subject.
Justicle
Differing timesteps make things smoother but can also change gameplay. In some (most) games that don't do proper integration of motion, if your timestep changes because a virus scanner kicks in, you can get vastly different simulation results (e.g. I believe in Quake, if you triggered a high load just as you jumped, you could pretty much jump anywhere in the level).
dash-tom-bang
A: 

Games typically use the highest-resolution timer available, like QueryPerformanceCounter on Windows, to time things. Old games used to use frames, but after it turned out you could literally run faster in Quake by changing your FPS, we learned not to do that anymore.
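For what it's worth, a minimal Windows-only sketch of timing a frame with QueryPerformanceCounter (the loop body is a stand-in, not anyone's actual engine code); std::chrono::steady_clock is a portable alternative elsewhere.

```
#include <windows.h>

int main() {
    LARGE_INTEGER frequency, previous, now;
    QueryPerformanceFrequency(&frequency);   // ticks per second
    QueryPerformanceCounter(&previous);

    for (int frame = 0; frame < 1000; ++frame) {   // stand-in for the game loop
        QueryPerformanceCounter(&now);
        double dtSeconds =
            double(now.QuadPart - previous.QuadPart) / double(frequency.QuadPart);
        previous = now;

        // update(dtSeconds);   // advance the game by real elapsed time
        (void)dtSeconds;
    }
}
```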

DeadMG