I'm caching files locally in my 3-tier app. To decide whether to read a file from the local cache or from the server, I compare file dates. I've found that when converting file dates to TDateTime and vice versa there are inconsistencies, and values that should match rarely do. Here's some code that demonstrates the problem:

procedure TestFileDateConversion;
const
  Dir = 'c:\TestDir\';
  Filename = 'test.txt';
var
  FileDate, NewFileDate: TDateTime;
  FilePath: String;
  FileHandle: THandle;
begin
  ForceDirectories(Dir);

  FilePath := concat(Dir, Filename);

  // Create the file if it doesn't already exist (and close the handle FileCreate returns)
  if not FileExists(FilePath) then
    FileClose(FileCreate(FilePath));

  FileDate := now;

  // Set the file date
  FileHandle := FileOpen(FilePath, fmOpenWrite or fmShareDenyNone);
  if FileHandle <> INVALID_HANDLE_VALUE then // INVALID_HANDLE_VALUE is declared in the Windows unit
  try
    FileSetDate(FileHandle, DateTimeToFileDate(FileDate));
  finally
    FileClose(FileHandle);
  end;

  // Check that the expected file date and the actual file date match
  if (FileAge(FilePath, NewFileDate)) and (FileDate <> NewFileDate) then
    ShowMessage('File dates do not match'); // More often than not, they don't
end;
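
Even without touching a file, the round trip alone seems to lose the sub-second part. Here's a minimal check (TestRoundTrip is just an illustrative routine) using the same SysUtils functions:

procedure TestRoundTrip;
var
  D: TDateTime;
begin
  D := Now;
  // DateTimeToFileDate packs the value into a DOS date/time, which only has
  // 2-second granularity, so the round trip almost never returns the original
  if FileDateToDateTime(DateTimeToFileDate(D)) <> D then
    ShowMessage('Round trip lost precision');
end;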

I'm sure this is caused by some rounding issue. Does anybody know a way to fix it?

+4  A: 

You are right about the rounding. A TDateTime is actually a float, and as with all floating-point values you get rounding issues; comparing for equality is especially problematic. Functions like CompareDateTime can help. Also, some file systems do not have the same precision as a TDateTime; some only have a 2-second precision. So you might need to compare with less precision, for example by using the SecondsBetween function.
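
A minimal sketch of that kind of tolerance-based comparison (the name FileDatesMatch and the two-second tolerance are just examples, chosen to cover FAT's write-time resolution):

// requires DateUtils in the uses clause
function FileDatesMatch(const A, B: TDateTime): Boolean;
begin
  // treat the timestamps as equal when they are less than two seconds apart
  Result := SecondsBetween(A, B) < 2;
end;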

Lars Truijens
One would assume that the server is using NTFS, which has the same resolution as `FILETIME`, i.e. 100 nanoseconds. Anyway, comparing the retrieved timestamps directly should eliminate the need to round or introduce an epsilon, as long as there's no data type conversion. `TDateTime` is simply a bad data type to use.
mghie
FAT rounds to even numbers: http://support.microsoft.com/kb/127830. I like your suggestion to use CompareFileTime, but I would still test if it works as expected for FAT.
Lars Truijens
+1  A: 

Compare your TDateTime values using the 'SameValue' function in the Math unit. This performs a 'fuzzy' comparison, returning equality if the two values are very close to each other (within a default delta which you can modify if you like). Your rule should be: NEVER EVER write

if FloatA = FloatB then
  ...

It is ok to do:

if FloatA = 0.0 then
  ...

but that's it.
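
For TDateTime values that could look like the sketch below (SameFileDate and the one-second epsilon are only examples; pick whatever suits your file system):

// requires Math and SysUtils in the uses clause
function SameFileDate(const A, B: TDateTime): Boolean;
begin
  // TDateTime is measured in days, so one second is 1 / SecsPerDay.
  // SameValue treats the values as equal when they differ by less than Epsilon.
  Result := SameValue(A, B, 1 / SecsPerDay);
end;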

Brian.

Brian Frost
Thanks Brian, I'll try your suggestion.
norgepaul
A: 

It's not a good idea to decide when to update files using the system datetime; if you have PCs in different time zones you will need to work around that. Use a hash function instead to know when a file is different and an update is needed.

Hashing files is easy, take a look at http://delphi.about.com/od/objectpascalide/a/delphi-md5-hash.htm for example.
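
For example, a sketch along the lines of the linked article, assuming Indy 10's TIdHashMessageDigest5 is available (FileMD5 is just an illustrative helper):

// requires Classes, SysUtils and IdHashMessageDigest in the uses clause
function FileMD5(const AFileName: string): string;
var
  MD5: TIdHashMessageDigest5;
  FS: TFileStream;
begin
  MD5 := TIdHashMessageDigest5.Create;
  try
    // open read-only and let other processes keep reading while we hash
    FS := TFileStream.Create(AFileName, fmOpenRead or fmShareDenyWrite);
    try
      Result := MD5.HashStreamAsHex(FS);
    finally
      FS.Free;
    end;
  finally
    MD5.Free;
  end;
end;

Comparing this hash of the cached file with a hash provided by the server tells you whether the cache is stale, independent of clocks and time zones.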

Anyway, GetFileTime can report a lower resolution than TDateTime (coarser on older file systems), but only at the level of seconds, so you need to round in your comparison.

From MSDN "Not all file systems can record creation and last access times and not all file systems record them in the same manner. For example, on FAT, create time has a resolution of 10 milliseconds, write time has a resolution of 2 seconds, and access time has a resolution of 1 day (really, the access date). Therefore, the GetFileTime function may not return the same file time information set using SetFileTime. NTFS delays updates to the last access time for a file by up to one hour after the last access."
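
If you do stay with timestamps, you can avoid the TDateTime conversion entirely by comparing the raw FILETIME values. A sketch using GetFileTime and CompareFileTime from the Windows unit (GetLastWriteTime is a hypothetical helper):

// requires Windows and SysUtils in the uses clause
function GetLastWriteTime(const FileName: string; out LastWrite: TFileTime): Boolean;
var
  H: THandle;
begin
  Result := False;
  H := FileOpen(FileName, fmOpenRead or fmShareDenyNone);
  if H = INVALID_HANDLE_VALUE then
    Exit;
  try
    // ask only for the last-write time; creation and access times are not needed
    Result := GetFileTime(H, nil, nil, @LastWrite);
  finally
    FileClose(H);
  end;
end;

Two timestamps obtained this way can be compared exactly with CompareFileTime(LocalTime, ServerTime) = 0.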

Francis Lee
While I agree that local datetime is a bad basis for comparisons, system datetime (UTC) certainly is a good one. See the documentation of `GetLocalTime()` and `GetSystemTime()` for details. Your answer mixes both, you should clarify that. And one would use `FILETIME` for comparisons, which has a resolution of 100 nanoseconds.
mghie
You are right, thanks
Francis Lee