I am not sure whether this question belongs on Stack Overflow, but here it is.
I need to generate a timestamp in C# for data that is transferred from one party to another, and I need to know the worst-case precision of the system clock across operating systems (Windows, Linux, and Unix). The goal is to pick a precision at which every operating system can validate the timestamp.
As an example, the clock resolution on Windows Vista is approximately 10-15 milliseconds.
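For context, here is a minimal sketch of how I am thinking of generating the timestamp, assuming a UTC, second-granularity ISO 8601 string is acceptable to both parties (the truncation to whole seconds and the "o" format are my own choices, not requirements):

```csharp
using System;
using System.Globalization;

class TimestampExample
{
    static void Main()
    {
        // Capture the current time in UTC so the value is not tied to
        // either party's local time zone.
        DateTime utcNow = DateTime.UtcNow;

        // Truncate to whole seconds: this stays well below the worst-case
        // system-clock resolution (e.g. ~10-15 ms on Windows Vista), so any
        // receiving platform should be able to represent and compare the
        // value exactly.
        DateTime truncated = new DateTime(
            utcNow.Ticks - (utcNow.Ticks % TimeSpan.TicksPerSecond),
            DateTimeKind.Utc);

        // Serialize with the round-trip ("o") format, which is ISO 8601
        // and culture-invariant.
        string timestamp = truncated.ToString("o", CultureInfo.InvariantCulture);

        Console.WriteLine(timestamp); // e.g. 2024-01-01T12:34:56.0000000Z
    }
}
```

Is truncating to seconds (or milliseconds) like this a safe lower bound across all of these operating systems, or is there a better-defined worst case I should design around?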