I'm programmatically dragging the playhead back and forth along a timeline based on a mouse event listener. The faster the mouse moves, the faster the playhead rips through the frames (the playhead speed is linearly proportional to the mouse speed).

I'm also firing a starter = getTimer(); event on the initial MOUSE_MOVE event trigger (and only on the initial event), and then firing an ender = getTimer(); at the end of the code that calculates the new playhead position, but only if the calculation has resulted in the playhead being told to move at least one frame (the mouse can move a few pixels before the playhead will move one frame). In essence this gives me a frames-per-second rate.

If I look at the delta of the two timers, I rarely get anything but 0; occasionally I'll get a 1, but not very often. There are a good few dozen heavy lines of code that have to be worked through between the initial MOUSE_MOVE and the decision to move the playhead, and this has to happen for a number of mouse position changes before the playhead moves at all and the second getTimer() call is reached. Does a MouseEvent trigger every millisecond, or does it trigger at the frame rate (ENTER_FRAME rate) of the document? If it's the latter, the smallest delta I should see is 31 ms.
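Here is a minimal sketch of the setup described above (the names playhead and pixelsPerFrame are hypothetical stand-ins, and the real code does far more work per move event):

    import flash.events.MouseEvent;
    import flash.utils.getTimer;

    var starter:int = -1;            // -1 means no measurement in progress
    var lastMouseX:Number = 0;
    var pixelsPerFrame:Number = 5;   // mouse pixels needed to move one frame

    stage.addEventListener(MouseEvent.MOUSE_MOVE, onMouseMove);

    function onMouseMove(e:MouseEvent):void {
        if (starter < 0) {                       // initial move event only
            starter = getTimer();
            lastMouseX = e.stageX;
            return;
        }
        var deltaFrames:int = int((e.stageX - lastMouseX) / pixelsPerFrame);
        if (deltaFrames != 0) {                  // playhead actually moves
            // ... heavy calculation, then e.g. playhead.gotoAndStop(...) ...
            var ender:int = getTimer();
            trace("delta:", ender - starter);    // usually prints 0, rarely 1
            lastMouseX += deltaFrames * pixelsPerFrame;
            starter = -1;                        // re-arm for the next run
        }
    }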

This runs as an app in the standalone Flash Player (10.1), never in a browser. Are my results indicative of this? Can the compiled code really run this fast? How does one do very small timing tests that appear to be beyond the granularity of the millisecond clock?
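One common workaround for millisecond granularity is amortization: run the code under test many times in a loop and divide by the iteration count. A sketch, where timeIt and the Math.sqrt workload are illustrative placeholders (note the average also includes function-call overhead):

    import flash.utils.getTimer;

    function timeIt(work:Function, iterations:int):Number {
        var start:int = getTimer();
        for (var i:int = 0; i < iterations; i++) {
            work();
        }
        return (getTimer() - start) / iterations;   // average ms per call
    }

    // Hypothetical workload standing in for the heavy drag calculation:
    var avgMs:Number = timeIt(function():void {
        var x:Number = Math.sqrt(12345.678);
    }, 100000);
    trace("average per run:", avgMs * 1000, "microseconds");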

Have I missed something obvious?

A: 

It's really hard to understand what your actual problem is. I suggest you reduce your problem to a minimal case and show us some code. I think you're having trouble understanding the AS3 event model: calling getTimer() certainly does not fire an event. Also, MouseEvents are frame rate independent.
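One quick way to verify that last point is to count MOUSE_MOVE events between ENTER_FRAME ticks; while dragging quickly, the count per frame regularly exceeds 1:

    import flash.events.Event;
    import flash.events.MouseEvent;

    var movesThisFrame:int = 0;

    stage.addEventListener(MouseEvent.MOUSE_MOVE, function(e:MouseEvent):void {
        movesThisFrame++;
    });

    stage.addEventListener(Event.ENTER_FRAME, function(e:Event):void {
        trace("MOUSE_MOVE events this frame:", movesThisFrame);
        movesThisFrame = 0;
    });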

back2dos