Up until now I have used DateTime.Now for getting timestamps, but I noticed that if you print DateTime.Now in a loop, it increments in discrete jumps of approximately 15 ms. For certain scenarios in my application, however, I need the most accurate timestamp possible, preferably with tick (= 100 ns) precision. Any ideas?
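For illustration, here is a minimal loop that makes the quantization visible (the step size depends on the machine's system timer, so the ~15 ms figure is only what I observed):

```csharp
using System;

class Resolution
{
    static void Main()
    {
        // Print the size of each jump in DateTime.Now.Ticks; on a typical
        // Windows machine it changes in steps of roughly 150,000 ticks
        // (~15 ms) instead of single 100 ns ticks.
        long last = DateTime.Now.Ticks;
        for (int jumps = 0; jumps < 20; )
        {
            long now = DateTime.Now.Ticks;
            if (now != last)
            {
                Console.WriteLine(now - last);
                last = now;
                jumps++;
            }
        }
    }
}
```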
Update:
Apparently, Stopwatch / QueryPerformanceCounter is the way to go, but it can only be used to measure elapsed time, not absolute time. So I was thinking about calling DateTime.Now once when the application starts up, keeping a Stopwatch running, and then adding the elapsed time from the Stopwatch to the initial value returned by DateTime.Now. At least that should give me accurate relative timestamps, right? What do you think about that (hack)?
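A minimal sketch of what I mean (the class name HiResClock is my own invention, not a library type):

```csharp
using System;
using System.Diagnostics;

// Absolute base time from DateTime.Now, high-resolution offset from a
// Stopwatch started at (nearly) the same moment.
public static class HiResClock
{
    private static readonly DateTime BaseTime = DateTime.Now;
    private static readonly Stopwatch Timer = Stopwatch.StartNew();

    // Elapsed is a TimeSpan, so its ticks are genuine 100 ns units and
    // can be added directly to the DateTime base.
    public static DateTime Now => BaseTime + Timer.Elapsed;
}
```

Usage would simply be `DateTime ts = HiResClock.Now;` — high-resolution, but only as accurate in absolute terms as the initial DateTime.Now reading.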
NOTE:
Stopwatch.ElapsedTicks is different from Stopwatch.Elapsed.Ticks! I used the former assuming 1 tick = 100 ns, but there 1 tick = 1 / Stopwatch.Frequency seconds. To get ticks equivalent to DateTime ticks, use Stopwatch.Elapsed.Ticks. I just learned this the hard way.
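A small sketch showing the difference (Stopwatch.Frequency varies by machine, so the raw tick count only matches after conversion):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class TickUnits
{
    static void Main()
    {
        Stopwatch sw = Stopwatch.StartNew();
        Thread.Sleep(100);
        sw.Stop();

        // Raw counter ticks: 1 tick = 1 / Stopwatch.Frequency seconds.
        long rawTicks = sw.ElapsedTicks;

        // TimeSpan ticks: 1 tick = 100 ns, the same unit DateTime uses.
        long timeSpanTicks = sw.Elapsed.Ticks;

        // Converting raw ticks by hand gives (approximately) the same
        // number as sw.Elapsed.Ticks.
        long converted = (long)((double)rawTicks / Stopwatch.Frequency
                                * TimeSpan.TicksPerSecond);

        Console.WriteLine($"Frequency:     {Stopwatch.Frequency}");
        Console.WriteLine($"ElapsedTicks:  {rawTicks}");
        Console.WriteLine($"Elapsed.Ticks: {timeSpanTicks}");
        Console.WriteLine($"Converted:     {converted}");
    }
}
```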
NOTE 2:
Using the Stopwatch approach, I noticed that it drifts out of sync with real time. After about 10 hours it was ahead by 5 seconds. So I guess one would have to resync it every so often, say every hour, 30 minutes, or 15 minutes. I am not sure what the optimal resync interval would be, since every resync shifts the offset, and that shift can be up to 20 ms.
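For what it's worth, a sketch of how the periodic resync could look, building on the hybrid approach above (the one-hour interval and the class name ResyncingClock are arbitrary choices of mine):

```csharp
using System;
using System.Diagnostics;

// Hybrid clock that rebases itself on DateTime.Now at a fixed interval
// to keep Stopwatch drift bounded. Each rebase can shift timestamps by
// up to the DateTime.Now resolution (~15-20 ms), so timestamps are not
// guaranteed to be monotonic across a resync. Not thread-safe.
public class ResyncingClock
{
    private static readonly TimeSpan ResyncInterval = TimeSpan.FromHours(1);

    private DateTime _baseTime;
    private Stopwatch _stopwatch;

    public ResyncingClock()
    {
        Resync();
    }

    public DateTime Now
    {
        get
        {
            if (_stopwatch.Elapsed >= ResyncInterval)
                Resync();
            return _baseTime + _stopwatch.Elapsed;
        }
    }

    private void Resync()
    {
        _baseTime = DateTime.Now;
        _stopwatch = Stopwatch.StartNew();
    }
}
```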