Hi,
Is there a high-precision timer I can use to benchmark how long a series of operations/statements takes to execute? A timer that returns the number of seconds since the epoch would be insufficient, as its resolution is only one second.
I am thinking that the general implementation would be:
- Get starting time in milliseconds.
- Execute statements.
- Get ending time in milliseconds.
- Elapsed time = end - start.
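Since the question doesn't name a language, here is a minimal sketch of those steps in Python (an assumption on my part), whose `time.perf_counter()` reads a high-resolution monotonic clock and returns fractional seconds, which can then be scaled to milliseconds:

```python
import time

start = time.perf_counter()          # get starting time (fractional seconds)
total = sum(i * i for i in range(100_000))  # statements to benchmark
end = time.perf_counter()            # get ending time

elapsed_ms = (end - start) * 1000.0  # elapsed time = end - start, in ms
print(f"elapsed: {elapsed_ms:.3f} ms")
```

A monotonic clock is preferable to a wall-clock (epoch-based) time source for this, since it cannot jump backwards if the system clock is adjusted mid-benchmark.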
Thanks,
Scott