
I'm in the process of making a game (a shmup) and I've started to question the accuracy of the timers in ActionScript. Sure, they're accurate enough when you want to time things on the order of a few seconds or deciseconds, but they seem to perform pretty poorly when you get to finer ranges. This makes it pretty tough to do things like have a spaceship firing a hundred lasers per second.

In the following sample, I tested the average interval between 1000 timer ticks set to a 30ms delay. Time and time again, the results are ~35-36ms. Decreasing the delay, I found the floor on the timer interval to be ~16-17ms. This gives me a max fps of ~60, which is great visually but also means I can't possibly fire more than 60 lasers per second :-(. I ran this test a few times at 100 and 1000 loops, and for both the 30ms test and the 1ms test the results didn't change. I print to a TextField at the end because using trace() and launching the swf in debug mode seems to skew the test. So what I'm wondering is:

  • Is this test a decent measure of the Timer class's performance, or are my results questionable?
  • Will these results change dramatically on other machines?

I understand this is dependent on the getTimer() method's accuracy, but the discussions I find on this topic usually center around getTimer()'s accuracy on larger intervals.

package
{
    import flash.display.Sprite;
    import flash.events.TimerEvent;
    import flash.text.TextField;
    import flash.utils.getTimer;
    import flash.utils.Timer;

    public class testTimerClass extends Sprite
    {
        // Fire 1000 ticks at a requested delay of 30ms.
        private var testTimer:Timer = new Timer(30, 1000);
        private var testTimes:Array = new Array();
        private var testSum:int = 0;
        private var testAvg:Number;
        private var lastTime:int;
        private var thisTime:int;

        public function testTimerClass()
        {
            testTimer.addEventListener(TimerEvent.TIMER, step);
            testTimer.addEventListener(TimerEvent.TIMER_COMPLETE, printResults);
            lastTime = getTimer();
            testTimer.start();
        }

        // Record the actual interval since the previous tick.
        private function step(event:TimerEvent):void
        {
            thisTime = getTimer();
            testTimes.push(thisTime - lastTime);
            lastTime = thisTime;
        }

        // Average the recorded intervals and display the result.
        private function printResults(event:TimerEvent):void
        {
            while (testTimes.length > 0)
            {
                testSum += testTimes.pop();
            }
            testAvg = testSum / Number(testTimer.repeatCount);

            var txtTestResults:TextField = new TextField();
            txtTestResults.text = testAvg.toString();
            this.addChild(txtTestResults);
        }
    }
}

I guess the best route to take would be to just draw multiple lasers in the same frame with different positions and avoid having more than one Timer object.

Edit: I used stage.frameRate to change the rendering frame rate and ran the test at several frame rates, but there was no change.

A: 

This gives me a max fps of ~60, which is great visually but also means I can't possibly fire more than 60 lasers per second :-(

I would say you are very lucky to be getting that kind of FPS as it is; the majority of your users (assuming your audience is the internet at large) will most likely not achieve that kind of framerate.

I would agree that the abilities of the flash player are probably not sufficient for what you are trying to achieve.

Ryan Guill
The render engine seems to max out around 56fps on my machine, but I'm really just shooting for a 30fps render rate. I run the update cycle outside of the display list, then use bitmapData.copyPixels to draw each object to a single bitmap, so there's only one bitmap on the screen. The problem with syncing to the render cycle is that if you strafe your ship across the screen at something like 30 lasers/second there can be a lot of space between them, and it makes it more difficult to shoot enemies.
jorelli
Of course every app needs its own profiling, but I'd be pretty leery of doing your work off the display list and copying things onscreen as a bitmap. The FP's rendering is obviously pretty heavily optimized, and doing it that way you bypass all its optimizations, or at the very best you do your own optimizations in slower non-native code.
fenomas
+3  A: 

Tinic Uro (a Flash Player engineer) wrote an interesting blog post about this issue a while back.

Luke
+1 Nice link to Tinic's post. Some useful insight in there.
Christian Nunciato
You know, I didn't really appreciate what he was talking about until I ran my test in a browser vs. in the standalone Flash player. The reason is that he talks all about frame rates, and in my head all I thought was "oh that's great and all, but I'm not talking about an actual frame rate, I'm talking about timers". It hadn't occurred to me that even though they're independent of one another, they both still take their timing cues from the environment, so they have the same frequency ceiling.
jorelli
@jorelli: That is exactly right.
Luke
A: 

I just tried running your sample code, exactly as-is except as a frame script, and where I'm sitting Timer works exactly as you'd expect. With a 30ms timer, the average comes out about 33-34, and with a 3ms timer it comes out around 3.4 or 3.5. With a 1 ms timer I get between 1.4 and 1.6 over a thousand trials. It works that way in Flash and in the browser.

So as for the accuracy, see Tinic's blog post in Luke's answer. But as for the upper limit on frequency, if you're getting events no faster than 16ms apart with only the sample code you posted, either something's weird, or maybe that's the upper limit on how fast your browser is giving Flash timing messages. If you're getting those results in your actual game, I think you simply have synchronous code that is blocking the timer events.

One more thing - I know it's not what you asked, but it would really be wiser to handle your game logic in an ENTER_FRAME handler. It doesn't matter whether the code to spawn lasers runs every 30ms or every 3ms, the screen only gets redrawn once per frame (unless you're forcing it to update more often, which you probably shouldn't be). So whatever happens between screen refreshes is just overhead that lowers your overall framerate, since with a little cleverness it ought to be possible to achieve the exact same results as you'd get from a timer that executed more frequently. For example, instead of spawning a laser every 3ms, you could spawn 10 lasers every 30ms, and move each one a certain distance along its flight path, as if it had been created 3 or 6 or 9ms earlier.
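
Just to make that concrete, something like this could run once per 30ms tick (purely a sketch - Laser, laserSpeed and the lasers array are made-up names, not anything from your code):

// Batch-spawn lasers once per 30ms tick instead of one laser per 3ms
// timer event. Laser, laserSpeed and lasers are hypothetical stand-ins
// for whatever your game actually uses.
private function spawnLaserBatch(shipX:Number, shipY:Number):void
{
    const BATCH:int = 10;          // lasers per 30ms tick
    const SUB_STEP:Number = 3;     // ms between "virtual" shots
    const laserSpeed:Number = 0.8; // pixels per ms, made-up value

    for (var i:int = 0; i < BATCH; i++)
    {
        var laser:Laser = new Laser(shipX, shipY);
        // Advance each laser as if it had been fired i * 3ms earlier.
        laser.y -= laserSpeed * SUB_STEP * i;
        lasers.push(laser);
    }
}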

Another natural way to run your logic off frame events would be to make a game loop that "theoretically" runs every T milliseconds. Then, in your ENTER_FRAME handler, simply iterate that loop F/T times, where F is the number of ms that have elapsed since the last frame event. Thus if you want the game to theoretically update every 5ms, and you want the screen to update at 30FPS, you'd simply publish at 30FPS, and in your ENTER_FRAME handler you'd call your main game loop 5-7 times consecutively, depending on how long it had been since the last frame. Onscreen, this will give you the same results as if you had used a 5ms timer event, and it will dispense with much of the overhead.
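
In code, that might look roughly like the sketch below (bare-bones and untested: TICK_MS, updateGame() and the listener wiring are placeholders, and a real version would probably cap the tick count so a long stall can't trigger a huge burst of updates):

// Fixed-timestep game loop driven by ENTER_FRAME. Assumes
// flash.events.Event and flash.utils.getTimer are imported, and that
// onEnterFrame is wired up with addEventListener(Event.ENTER_FRAME, ...).
private const TICK_MS:int = 5;        // "theoretical" update interval T
private var lastFrameTime:int = getTimer();
private var accumulator:Number = 0;   // carries the fractional remainder

private function onEnterFrame(event:Event):void
{
    var now:int = getTimer();
    accumulator += now - lastFrameTime; // F = ms since the last frame
    lastFrameTime = now;

    // Run the game loop roughly F / T times.
    while (accumulator >= TICK_MS)
    {
        updateGame(TICK_MS); // advance the simulation by one fixed tick
        accumulator -= TICK_MS;
    }
}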

fenomas
Holy crap, I feel dumb. The results are totally dependent on the environment you run it in. I only ever view my swfs in a browser (Firefox 3 on OS X) because that's Flex Builder's behavior when you run the project, but when I did it in Flash it played in the standalone player and I saw the difference. HUGE! So it's not really Flash's fault; it's the restrictions on browser plug-ins that are doing me in. Re: One more thing - I use a single bitmap object on the display tree. On the game's draw cycle, I lock the bitmap, draw all the objects to it with copyPixels, and then unlock it.
jorelli
Yes, it's not really Flash's fault but it's still something you have to live with. As for the bitmaps, you're doing that the right way so far as it goes, but I really think you'd get better performance leaving the rendering to Flash. Even if all other things were equal, it's still a question of doing the compositing in AS3 vs. doing it in C, after all. (YMMV of course, profiling > theorizing...)
fenomas
Running multiple update cycles seems so obvious now! Thanks, I'm getting 120 lasers per second easily now. That should be enough lasers for the time being :P
jorelli
A: 

Instead of using the Timer class at all, I would put my update code in an ENTER_FRAME event listener.

Then use getTimer() to find out how much time has passed since the last update, and do a bit of math to calculate movement, laser "spawning", etc. This way the game should behave the same whether you're getting 60FPS or 10FPS.

This also prevents weird game behavior if the FPS drops temporarily, or if the game is running on a slow machine.

For the most part the math is pretty simple, and when it isn't, you can usually "cut corners" and still get a result that's good enough for a game. For simple movement you can just multiply the movement speed by the time since the last update. For acceleration you'd need a second-degree equation, but you can simplify by multiplying again.
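
As a rough sketch of what I mean (ship, SPEED and GRAVITY are made-up placeholders and the values are arbitrary; assumes flash.events.Event and flash.utils.getTimer are imported):

// Frame-rate independent movement driven by ENTER_FRAME and getTimer().
private const SPEED:Number = 0.3;     // pixels per ms, made-up value
private const GRAVITY:Number = 0.001; // pixels per ms^2, made-up value
private var lastUpdate:int = getTimer();

private function onEnterFrame(event:Event):void
{
    var now:int = getTimer();
    var dt:Number = now - lastUpdate; // ms since the last update
    lastUpdate = now;

    // Simple movement: offset = speed * elapsed time.
    ship.x += SPEED * dt;

    // "Cutting corners" on acceleration: multiply by dt again instead of
    // solving the exact second-degree equation.
    ship.vy += GRAVITY * dt;
    ship.y  += ship.vy * dt;
}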

In response to jorelli's comment below:

The collision detection problem can be solved by better code that does not just check the current state, but also what happened between the previous state and the current one. I've never had problems with the keyboard. Maybe you should try this nifty little class: http://www.bigroom.co.uk/blog/polling-the-keyboard-in-actionscript-3
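
As a rough illustration of what I mean (laser, target, prevX/prevY and hitTest() are all made-up placeholders; a real game might use swept shapes instead), you can test a few interpolated positions between the previous and current state instead of only the final one:

// Check the path travelled since the last update, not just the end point,
// so fast objects can't skip over each other between frames.
private function movedInto(laser:Object, target:Object, steps:int = 4):Boolean
{
    for (var i:int = 1; i <= steps; i++)
    {
        var t:Number = i / steps;
        var x:Number = laser.prevX + (laser.x - laser.prevX) * t;
        var y:Number = laser.prevY + (laser.y - laser.prevY) * t;
        if (hitTest(x, y, target))
        {
            return true;
        }
    }
    return false;
}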

Lillemanden
The frames are too inconsistent. What ends up happening is that when a single frame goes on for too long, your objects pause, then teleport. Throw in collision detection, and objects can now pass through one another. Additionally, because the objects take cues from the keyboard, you effectively can't alter anything from the keyboard during the lag time. I've found it better to just sync ahead of time and assume all is going to plan. That way, when the framerate slows down, your game slows, but objects don't teleport and you don't lose control of your avatar.
jorelli