Hello,
I have an entity that plays an animation in my world at speed $s = 1$. Within a specific time interval it is possible for the animation to slow down, meaning it plays at a speed $s$ with $0 < s < 1$.
This time interval is defined by a start time $t_a$ and an end time $t_b$.
So when the time in my world reaches $t_a$, the animation's speed is reduced so that it plays slower (like a slow-motion effect) while everything else continues at its usual speed.
At some point between $t_a$ and $t_b$, the animation stops playing slowly and instead plays faster ($s > 1$), so that when the time reaches $t_b$, it has caught up with the rest of the world.
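To state the catch-up requirement formally (I'm introducing the names $s_{\text{slow}}$ for the slow speed, $s_{\text{fast}}$ for the unknown fast speed, and $t_c$ for the time at which the switch happens): the animation time accumulated between $t_a$ and $t_b$ has to equal the world time that passes, i.e.

$$s_{\text{slow}} \, (t_c - t_a) + s_{\text{fast}} \, (t_b - t_c) = t_b - t_a.$$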
My question is: after the slow-down, how fast does the animation have to play so that it catches up exactly? Given are:

- the time interval $t_a$, $t_b$
- the speed factor by which the animation is slowed once $t_a$ is reached
- the time between $t_a$ and $t_b$ at which the slow effect stops and the fast effect starts
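If that condition above is right, solving it for the unknown speed should give

$$s_{\text{fast}} = \frac{(t_b - t_a) - s_{\text{slow}} \, (t_c - t_a)}{t_b - t_c},$$

but I'd like confirmation that this reasoning is correct.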
I hope the question is understandable; if not, please let me know. As an example, imagine a machine that throws a ball in an arc, then moves along the floor at a constant speed and catches the ball. In my case the machine first moves at a slower speed, but after a certain time it has to increase its speed so that it can still catch the ball. What is that speed?
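For concreteness, here is a minimal sketch (in Python, using the hypothetical names $t_c$ and $s_{\text{slow}}$ I introduced above) of how I would compute that speed under my assumed formula:

```python
def catch_up_speed(t_a, t_b, t_c, s_slow):
    """Speed needed from the switch time t_c onward so that the animation,
    slowed to s_slow on [t_a, t_c], has caught up with world time by t_b.

    Names and formula are my own assumption, not an established API:
        s_fast = ((t_b - t_a) - s_slow * (t_c - t_a)) / (t_b - t_c)
    """
    if not (t_a < t_c < t_b):
        raise ValueError("switch time t_c must lie strictly between t_a and t_b")
    # Animation time still owed after the slow phase, spread over the
    # remaining world time (t_b - t_c).
    return ((t_b - t_a) - s_slow * (t_c - t_a)) / (t_b - t_c)

# Ball example: slow motion at half speed from t_a = 2 to t_b = 5,
# switching to the fast phase at t_c = 4.
print(catch_up_speed(t_a=2.0, t_b=5.0, t_c=4.0, s_slow=0.5))  # -> 2.0
```

In that example the result at least passes a sanity check: half speed for two seconds accumulates one second of animation, leaving two seconds of animation to cover in the remaining one second of world time, so the fast phase needs speed $2$.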