I need to measure the time between character events on a serial port, preferably under Windows 7.
I have read, in various places, statements like "Windows will not provide greater resolution than 10 ms", but I have not been able to find out what that claim actually refers to.
Is the problem that the OS will not deliver the events with better than 10 ms accuracy, or is it that the timing functions traditionally used for measuring (I'm thinking of GetTickCount) are that coarse?
If it is the latter, then I guess I should be able to use something like QueryPerformanceCounter for good-enough measurements; but if the events themselves aren't delivered with similar accuracy, that obviously won't help.
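For reference, this is roughly the approach I had in mind: do blocking one-byte reads and timestamp each returned byte with QueryPerformanceCounter. The port name is just a placeholder, and reading one byte at a time is only to keep the sketch simple:

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* "COM3" is a placeholder for whatever port I end up using. */
    HANDLE port = CreateFileA("\\\\.\\COM3", GENERIC_READ, 0, NULL,
                              OPEN_EXISTING, 0, NULL);
    if (port == INVALID_HANDLE_VALUE) {
        fprintf(stderr, "CreateFileA failed: %lu\n", GetLastError());
        return 1;
    }

    /* All-zero COMMTIMEOUTS: ReadFile blocks until the byte arrives. */
    COMMTIMEOUTS to = {0};
    SetCommTimeouts(port, &to);

    LARGE_INTEGER freq, prev, now;
    QueryPerformanceFrequency(&freq);   /* counter ticks per second */
    QueryPerformanceCounter(&prev);

    for (;;) {
        BYTE ch;
        DWORD read = 0;
        if (!ReadFile(port, &ch, 1, &read, NULL) || read == 0)
            break;

        /* Timestamp the moment ReadFile returned, and print the delta
           since the previous byte in microseconds. */
        QueryPerformanceCounter(&now);
        double us = (double)(now.QuadPart - prev.QuadPart) * 1e6
                    / (double)freq.QuadPart;
        printf("0x%02X  +%.1f us\n", ch, us);
        prev = now;
    }

    CloseHandle(port);
    return 0;
}
```

My worry is that the timestamps here only measure when ReadFile returns to my code, not when the characters actually arrived on the wire, so the question above still stands.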