On every frame of my application, I can call timeGetTime() to retrieve the current elapsed milliseconds, and subtract the previous frame's value from the current one to get the time between the two frames. To get the frame rate of the application, I then use this formula: fps = 1000 / delay (ms). So, for instance, if the delay was 16 milliseconds, then 1000/16 = 62.5 (stored as the integer 62). If the delay then became 17 milliseconds, 1000/17 = 58, and so on:
1000/10=100
1000/11=90
1000/12=83
1000/13=76
1000/14=71
1000/15=66
1000/16=62
1000/17=58
1000/18=55
1000/19=52
1000/20=50
As you can see, for consecutive delay values there are fairly big gaps between the resulting frame rates. So how do programs like FRAPS report frame rates that fall between these values (e.g. 51, 53, 54, 56, 57, etc.)?
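
For reference, here is roughly what I'm doing each frame (a minimal sketch; the Sleep(16) is just a stand-in for my actual rendering work, and I'm assuming winmm.lib is linked for timeGetTime):

```cpp
#include <windows.h>
#include <stdio.h>
#pragma comment(lib, "winmm.lib") // timeGetTime() lives in winmm

int main()
{
    DWORD previous = timeGetTime();

    for (;;) // stand-in for the real render loop
    {
        // ... render the frame here ...
        Sleep(16); // placeholder workload for this sketch

        DWORD current = timeGetTime();
        DWORD delay   = current - previous; // milliseconds between frames
        previous      = current;

        if (delay > 0)
        {
            DWORD fps = 1000 / delay; // integer division, hence the gaps
            printf("delay = %lu ms, fps = %lu\n", delay, fps);
        }
    }
    return 0;
}
```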