Hi there!
Based on the ideas presented in link, I implemented several different "sleep methods". One of these methods was the "binary sleep", which looks like this:
while (System.currentTimeMillis() < nextTimeStamp)
{
    sleepTime -= (sleepTime / 2);  // halve the remaining sleep time
    sleep(sleepTime);
}
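For reference, here is a self-contained version of the loop that measures the error directly. The class name, the 100 ms target, and the use of Thread.sleep (instead of the sleep call in my fragment) are my own assumptions for the sketch:

```java
// Sketch of the "binary sleep" loop, made runnable, measuring the
// error (expected time - real time) for one 100 ms target delay.
public class BinarySleepDemo {
    public static void main(String[] args) throws InterruptedException {
        final long TARGET_DELAY_MS = 100;  // assumed target delay
        long nextTimeStamp = System.currentTimeMillis() + TARGET_DELAY_MS;
        long sleepTime = TARGET_DELAY_MS;

        while (System.currentTimeMillis() < nextTimeStamp) {
            sleepTime -= (sleepTime / 2);  // halve the remaining sleep time
            Thread.sleep(sleepTime);       // may overshoot by the timer granularity
        }

        // The loop can only exit once the target time has been reached,
        // so this error should always be <= 0 (i.e. never too short).
        long error = nextTimeStamp - System.currentTimeMillis();
        System.out.println("error (expected - real) in ms: " + error);
        System.out.println("finished late or on time: " + (error <= 0));
    }
}
```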
Because the check whether the next time step has already been reached takes place at the beginning of the loop, I would expect the method to run too long. But the cumulative distribution of the simulation error (expected time - real time) looks like this:
Does somebody have an idea why I'm getting these results? Maybe System.currentTimeMillis() does not really return the current time?
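One thing I could check is the granularity of System.currentTimeMillis() itself. Here is a rough sketch (class name and the 200 ms sampling window are my own choices) that busy-loops on the clock and records the largest jump between two distinct readings. On some platforms the clock advances in steps of 10-15 ms, which alone could explain errors of that size:

```java
// Rough check of System.currentTimeMillis() granularity: sample the
// clock in a tight loop and record the largest step between two
// distinct values observed during a ~200 ms window.
public class MillisGranularity {
    public static void main(String[] args) {
        long last = System.currentTimeMillis();
        long end = last + 200;  // sample for roughly 200 ms
        long maxStep = 0;

        while (last < end) {
            long now = System.currentTimeMillis();
            if (now != last) {
                maxStep = Math.max(maxStep, now - last);
                last = now;
            }
        }

        // The clock must advance at least once during the window,
        // so the largest observed step is always >= 1 ms.
        System.out.println("largest observed clock step in ms: " + maxStep);
        System.out.println("step at least 1 ms: " + (maxStep >= 1));
    }
}
```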
BR,
Markus
@irreputable
When I did the evaluation I also created a bell curve using a German statistics program. Because it was not possible to change the captions, here is the English translation of all relevant labels:
Häufigkeit = frequency
Fehler = error
Mittelwert = average
Std-Abw = standard deviation