If I have a video that plays at 30 fps, then the duration of each frame is 1/30th of a second, or 33.333333... milliseconds.
If you were implementing a video player, how would you handle the fact that each frame's duration is a repeating decimal?
For example, if you truncate the duration of the first 29 frames to 33.33 milliseconds, then the duration of the 30th frame would have to be slightly longer, 33.43 milliseconds, in order to maintain an overall 30 fps rate.
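To make the arithmetic concrete, here is a small sketch of the truncate-and-compensate idea described above (the 33.33 ms truncation and the 1000 ms target are just the values from my example, not anything a real player necessarily uses):

```python
FPS = 30
TRUNCATED_MS = 33.33  # per-frame duration truncated to two decimal places

# Elapsed time after the first 29 truncated frames.
elapsed = TRUNCATED_MS * (FPS - 1)

# The 30th frame must absorb the leftover so the second ends on time.
last_frame = 1000.0 - elapsed

print(round(elapsed, 2))     # 966.57
print(round(last_frame, 2))  # 33.43
```

The drift here is tiny within one second, but I'm wondering whether accumulating truncated durations like this causes audible/visible desync over a long video, which is why I'm asking about the standard approach.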
Is there a standard way that video playback software handles this?