views: 355
answers: 2
I am not sure if this question belongs on StackOverflow but here it is.

I need to generate a timestamp in C# for some data that is to be transferred from one party to another, and I need to know the worst-case precision of the system clock across operating systems (Windows, Linux, and Unix). I need to figure out a precision such that all operating systems are able to validate this timestamp.

As an example, the clock's resolution on Windows Vista is approximately 10–15 milliseconds.
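A rough way to see this on a given machine is to watch for the smallest step between successive DateTime.UtcNow readings. A minimal sketch (this measures the effective granularity on the machine it runs on, not any documented guarantee):

    using System;

    class ClockResolutionProbe
    {
        static void Main()
        {
            long last = DateTime.UtcNow.Ticks;
            long smallestStep = long.MaxValue;
            int transitions = 0;

            // Watch the clock until it has visibly ticked a few times and
            // record the smallest step between readings (ticks are 100 ns).
            while (transitions < 10)
            {
                long now = DateTime.UtcNow.Ticks;
                if (now != last)
                {
                    smallestStep = Math.Min(smallestStep, now - last);
                    last = now;
                    transitions++;
                }
            }

            // 10,000 ticks per millisecond.
            Console.WriteLine("Smallest observed step: {0} ms",
                              smallestStep / 10000.0);
        }
    }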

+1  A: 

Are you looking to generate something like a Unix timestamp for the data? Or to find a timestamp that won't collide with an existing file? If it's the latter, you could always use ticks.

The problem with any "long" timestamp is that it is relative to the machine generating it and won't guarantee non-collision on the other system, since the clocks can be set differently (not merely drift, but actually be set to different times).

If the data is secure/sensitive and you are looking at a time-based mechanism for syncing keys (à la Kerberos), I would not suggest rolling your own, as there are many obstacles to overcome, especially in syncing systems and keeping them in sync.

GrayWizardx
I am using the timestamp to indicate the time the data is transferred from one party to the other, and it does not have to be unique. So I am not looking for a timestamp that won't collide with an existing file.
Lopper
I would go with ticks then, as they give you plenty of resolution; it's clean and straightforward (a single long value) and can easily be reconstituted into a DateTime. If the client is non-.NET, a Unix timestamp is a good option as well.
GrayWizardx
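As a minimal sketch of the ticks approach described above (DateTimeKind.Utc is assumed here to avoid time-zone ambiguity between the two parties):

    using System;

    class TickTimestampExample
    {
        static void Main()
        {
            // Sender: capture the transfer time as a single long value.
            long timestamp = DateTime.UtcNow.Ticks;

            // ... transmit the long to the other party ...

            // Receiver: reconstitute the original DateTime from the ticks.
            DateTime received = new DateTime(timestamp, DateTimeKind.Utc);
            Console.WriteLine(received.ToString("o")); // round-trip format
        }
    }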
A: 

Interesting. The major operating systems have, at worst, centisecond resolution (0.01 seconds), though timestamps are often stored with more precision than that.

Linux offers up to microsecond resolution in its timestamps (see man utime), depending on the computer's clock hardware. Windows NT/Win2K/XP/etc. offer millisecond precision in file timestamps (on NTFS only), though the system represents all timestamps in 100-nanosecond (0.0000001 second) units, i.e. ten million per second.
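To make the 100-nanosecond unit concrete, here is a minimal sketch of converting those ticks to a Unix-style seconds-since-1970 timestamp, which also covers the non-.NET case mentioned earlier:

    using System;

    class UnixTimestampExample
    {
        static void Main()
        {
            // DateTime ticks are the same 100 ns units, counted from
            // 0001-01-01; a Unix timestamp counts seconds from 1970-01-01.
            // TimeSpan.TicksPerSecond is ten million.
            DateTime epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
            long unixSeconds =
                (DateTime.UtcNow.Ticks - epoch.Ticks) / TimeSpan.TicksPerSecond;

            Console.WriteLine(unixSeconds);
        }
    }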

If accurate and precise time resolution is needed between systems, GPS receivers easily achieve 100 nanosecond precision as a side effect of how they work, and many inexpensive models do as well as 10 ns. Special GPS models make the derived time available for external use.

wallyk