In Ruby, how accurate is File.atime / File.ctime / File.mtime? Is it to the nearest second on both Unix and Windows?
Presuming you mean precision (since accuracy would depend on the hardware clock and many other factors), the docs seem to indicate second precision.
EDIT: You're right. Time actually has microsecond precision (again, not accuracy).
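For what it's worth, File.mtime returns a Time object, so you can inspect its sub-second fields yourself; whether they are ever non-zero depends on the platform and file system. A minimal check (the file name is just a placeholder):

```ruby
# Inspect the sub-second part of a file's mtime.
# "example.txt" is a placeholder path; point it at any existing file.
t = File.mtime("example.txt")

puts t       # e.g. 2024-01-01 12:34:56 +0000
puts t.usec  # microseconds within the second (0..999_999)
puts t.nsec  # nanoseconds within the second (0..999_999_999)
```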
I suspect most, if not all, "scripting" languages use the underlying OS calls to retrieve this sort of information. If that is the case, 1-second resolution is what you get under Linux and Windows (at least for NTFS; FAT32, if you're still using it, has 2-second resolution IIRC).
Ideally, you'd look at the Ruby source code to confirm this (better still, it would be documented, but that may be too much to hope for). Failing that, you could run a simple test that "touches" a file more than once per second and monitors its mtime values.
In the absence of a documented statement, you either have to rely on empirical evidence or rely on nothing at all.
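By way of illustration, such a test could look roughly like this (the file name and timings are arbitrary); if the printed values only ever differ in whole seconds, that's your effective resolution:

```ruby
# Minimal empirical test: update a file's mtime several times per second
# and print what File.mtime reports. "precision_test.tmp" is a scratch file.
require "fileutils"

path = "precision_test.tmp"

10.times do
  FileUtils.touch(path)   # update (or create) the file, bumping its mtime
  t = File.mtime(path)
  printf("%s.%09d\n", t.strftime("%H:%M:%S"), t.nsec)
  sleep 0.2               # well under one second between touches
end

File.delete(path)
```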
It depends on the file system:

- FAT: 2-second precision.
- NTFS: 100 nanoseconds.
- ext3: 1 second (it looks like there are extensions that bring this down to 1 nanosecond).
The 2-second FAT granularity can bite you if you are doing timestamp-based backups between FAT and NTFS or ext3.
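If you do compare timestamps across file systems like that, one common workaround is to treat mtimes as equal when they are within 2 seconds of each other. A rough sketch (the constant, method name, and paths are made-up names for illustration, not from any library):

```ruby
# Sketch of a "copy if newer" check that tolerates FAT's 2-second granularity.
# FAT_SLACK and needs_copy? are illustrative names only.
FAT_SLACK = 2 # seconds

def needs_copy?(source, destination)
  return true unless File.exist?(destination)
  # Subtracting two Time objects yields the difference in seconds as a Float.
  File.mtime(source) - File.mtime(destination) > FAT_SLACK
end

puts needs_copy?("report.txt", "/mnt/fat_backup/report.txt")
```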