
I use a 3rd-party library called PhysX in our engine, and one of the problems I've run into is correctly setting the SolverIterationCount for actors in the scene.

The SolverIterationCount increases the number of iterations the solver takes for joints to correct the error between the two actors they connect. Because of this, you can set the solver count as low as 4 and as high as 255. However, I would very much like to have this value determined at run-time.

I have attempted the following.

Set the minimum fps to 15, and the minimum solver count to 4.
Set the maximum fps to 40, and the maximum solver count to 128.

While the simulation is running, I take the average frame time over the last 16 frames. From there, I linearly interpolate between 12fps and 40fps, and respectively between a solver count of 4 and 128, using the average frame time as the interpolation source.
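Roughly, the mapping looks like this (a simplified, self-contained sketch; the function and class names are illustrative, not engine or PhysX API):

```cpp
#include <algorithm>
#include <deque>
#include <numeric>

// Map an average frame time to a solver iteration count by linear
// interpolation: 40 fps (25 ms) -> 128 iterations, 12 fps (~83.3 ms) -> 4.
int SolverCountFromFrameTime(double avgFrameTimeMs)
{
    const double fastMs = 1000.0 / 40.0;  // best case: 40 fps
    const double slowMs = 1000.0 / 12.0;  // worst case: 12 fps
    const int    maxIters = 128;
    const int    minIters = 4;

    // t is 0 at 40 fps, 1 at 12 fps, clamped outside that range.
    double t = (avgFrameTimeMs - fastMs) / (slowMs - fastMs);
    t = std::clamp(t, 0.0, 1.0);

    return static_cast<int>(maxIters + t * (minIters - maxIters));
}

// Rolling average over the last 16 frame times.
class FrameTimeAverager
{
public:
    void Push(double ms)
    {
        m_samples.push_back(ms);
        if (m_samples.size() > 16) m_samples.pop_front();
    }
    double Average() const
    {
        return std::accumulate(m_samples.begin(), m_samples.end(), 0.0)
             / static_cast<double>(m_samples.size());
    }
private:
    std::deque<double> m_samples;
};
```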

This has worked really well: when the frame rate drops below 20, the solver count adjusts itself accordingly. But I have run into one little problem.

The problem is that, because the solver count is determined by the frame rate, a feedback loop occurs. The code notices the frame rate is low, so it reduces the solver count. The frame rate then goes back up, the code notices it has headroom again, and it pushes the solver iteration count back up. The result is a yo-yo effect on the joints and the actors.

Any ideas on how best to approach this problem? Should I keep a variable tracking the last time the solver count was increased/decreased, and wait for a specific timeout if this feedback loop starts occurring?
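What I'm considering is something like the following (a self-contained sketch of the timeout idea combined with a dead band; the class name, deadband, and cooldown values are placeholders to tune, not anything from PhysX):

```cpp
#include <cstdlib>

// Damped solver-count controller: only apply a new target count when it
// differs from the current value by at least `deadband` iterations, and
// never more often than once per `cooldownSeconds`.
class SolverCountController
{
public:
    SolverCountController(int initial, int deadband, double cooldownSeconds)
        : m_current(initial), m_deadband(deadband),
          m_cooldown(cooldownSeconds), m_timeSinceChange(cooldownSeconds) {}

    // Call once per frame with the freshly interpolated target count.
    // Returns the (possibly unchanged) count to hand to the actors.
    int Update(int targetCount, double dtSeconds)
    {
        m_timeSinceChange += dtSeconds;
        const bool bigEnoughChange = std::abs(targetCount - m_current) >= m_deadband;
        const bool cooledDown      = m_timeSinceChange >= m_cooldown;
        if (bigEnoughChange && cooledDown)
        {
            m_current = targetCount;
            m_timeSinceChange = 0.0;
        }
        return m_current;
    }

private:
    int    m_current;
    int    m_deadband;
    double m_cooldown;
    double m_timeSinceChange;
};
```

The dead band suppresses small oscillations around a stable frame rate, while the cooldown caps how often the count can swing even when the target jumps a long way.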