When I wait on a non-signaled Event using the WaitForSingleObject function, I find that in some cases the call returns WAIT_TIMEOUT in less than the specified timeout period. Simply looping on the call with a timeout of 1000ms, I've seen it return after as little as 990ms (running on WinXP). I'm using QueryPerformanceCounter to get a system-clock-independent time measurement, so I don't think clock drift is a likely explanation.
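For reference, here's a minimal sketch of the kind of measurement loop I mean (simplified; the event is created non-signaled and never set, so every wait should run the full timeout):

```cpp
#include <windows.h>
#include <stdio.h>

int main()
{
    // Manual-reset event, created non-signaled and never set,
    // so WaitForSingleObject should always time out.
    HANDLE hEvent = CreateEvent(NULL, TRUE, FALSE, NULL);
    if (hEvent == NULL)
        return 1;

    LARGE_INTEGER freq;
    QueryPerformanceFrequency(&freq);

    for (int i = 0; i < 20; ++i)
    {
        LARGE_INTEGER start, end;
        QueryPerformanceCounter(&start);
        DWORD result = WaitForSingleObject(hEvent, 1000); // 1000ms timeout
        QueryPerformanceCounter(&end);

        // Elapsed wall time in milliseconds, independent of the system clock.
        double elapsedMs = (end.QuadPart - start.QuadPart) * 1000.0 / freq.QuadPart;
        printf("result=%lu elapsed=%.3f ms\n", result, elapsedMs);
    }

    CloseHandle(hEvent);
    return 0;
}
```

With this, most iterations report close to 1000ms, but some come in around 990ms.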
This behavior doesn't cause any practical problems for me, but I'd like to understand it better. It looks like the timeout is only being honored to roughly the resolution of a system timer tick. Does Microsoft publish any further details on the precision of this function? Should I expect greater precision in Vista?