I'm looking into adding a simple synchronization feature to my app, and one of the concerns that has come up is synchronizing time between two remote computers, each with its own clock (in particular with respect to the modification dates of files/objects).
I'm sure a lot of research has been done on this topic, and I don't want to get too theoretical, but I'm wondering whether there are any accepted best practices for minimizing temporal discrepancies between remote clocks?
For example, a good start is to always use Coordinated Universal Time (UTC), since that avoids timezone problems, but there is no guarantee that two computers will have exactly the same system time. Luckily the work I'm doing isn't very fine-grained, so it's not a terribly important concern, but I'm curious nonetheless.
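To make the UTC part concrete, here is roughly what I have in mind on the recording side (a minimal Python sketch, just as an illustration of storing timezone-aware UTC timestamps rather than local time):

```python
from datetime import datetime, timezone

# Record modification times as timezone-aware UTC, never naive local time,
# so two machines in different timezones at least agree on the scale.
modified_at = datetime.now(timezone.utc)

# Store/transmit an unambiguous representation, e.g. ISO 8601 with offset.
print(modified_at.isoformat())  # e.g. 2024-01-01T12:00:00.123456+00:00
```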
One solution would be to always use the same clock on both ends, such as a global time server, rather than the local system clock. Presumably this (combined with shared resource locks) could guarantee no accidental overlap of synchronized time, but it's not very practical.
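By "use the same clock on both ends" I mean something like the sketch below: every timestamp is fetched from a shared reference server instead of the local clock. This assumes a third-party NTP client such as the ntplib package and a public pool host (`pool.ntp.org`); both are just assumptions for illustration, and the per-call network round trip is exactly why it isn't very practical.

```python
import ntplib  # third-party NTP client; assumed to be installed
from datetime import datetime, timezone

def server_time_utc(host="pool.ntp.org"):
    """Fetch 'now' from a shared time server instead of the local clock."""
    response = ntplib.NTPClient().request(host, version=3)
    return datetime.fromtimestamp(response.tx_time, tz=timezone.utc)

# Every timestamp comes from the same reference clock, but each call
# costs a network round trip.
print(server_time_utc())
```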
One thought that just popped into my head is to apply a per-node (per-client) offset calculated ahead of time, perhaps by comparing the system clock against a global time server. This would only need to be done occasionally, since the offset itself is unlikely to change much over a short period.
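Something like this sketch is what I'm imagining, again assuming ntplib and `pool.ntp.org` as the reference, with a made-up `refresh_seconds` interval for how often the offset gets recomputed:

```python
import time
import ntplib  # third-party NTP client; assumed to be installed

class AdjustedClock:
    """Local clock corrected by an occasionally refreshed NTP offset."""

    def __init__(self, host="pool.ntp.org", refresh_seconds=3600):
        self.host = host
        self.refresh_seconds = refresh_seconds
        self.offset = 0.0
        self.last_sync = 0.0

    def _resync(self):
        # response.offset is the estimated difference between the server
        # clock and our system clock, in seconds.
        response = ntplib.NTPClient().request(self.host, version=3)
        self.offset = response.offset
        self.last_sync = time.time()

    def now(self):
        """UTC epoch seconds, corrected by the cached offset."""
        if time.time() - self.last_sync > self.refresh_seconds:
            self._resync()
        return time.time() + self.offset
```

Each app instance would stamp its files/objects with `AdjustedClock().now()`, so the timestamps being compared are all relative to the same reference clock even if the underlying system clocks drift slightly.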
Update: Let me just add that I'm not interested in actually synchronizing the system clocks of two computers; I'll presume that the operating system handles that in most cases. This is just a question of how to ensure two instances of an application are using synchronized times, though in this day and age I suppose the system clocks would almost assuredly be synchronized to within some very small delta anyway.