I have written a service that monitors a file drop location for files from a scanner. The scanner drops every file with the exact same file name (e.g. Test.tif), unless that file already exists, in which case it appends a timestamp to the end (e.g. Test_0809200915301900.tif).
So when I process these files, I attach a 'tag' to the DB entry to identify the specific file: the filename plus the file's creation timestamp in ticks. Each scanner can produce at most one scan every few seconds, so precision to the second is sufficient.
Here is the code that generates this supposedly unique tag:
FileInfo fileInfo = new FileInfo(filePath);
string filename = fileInfo.Name;

// Tag = filename plus the file's creation time (UTC) in ticks
string tag = string.Format("{0}_{1}", filename,
    fileInfo.CreationTimeUtc.Ticks);
The generated tag would look something like: Test1.tif_633931295923017954
For some reason, though, when a bunch of scans come in from the same scanner over the course of, say, 20 seconds (e.g. one scan, then another 5 seconds later, then another 5 seconds after that), they all get the exact same file creation timestamp.
E.g.
- File in: Test1.tif
- Picked up and stored with tag Test1.tif_633931295923017954
- Test1.tif is deleted.
- File in: Test1.tif (5 seconds later)
- Picked up but fails to be stored, because the generated tag is a duplicate of Test1.tif_633931295923017954
How is this possible? The ticks are identical. I inspected the creation time object and it is identical as well, even though I physically saw the second file created 5 seconds after the first one.
Edit: Can anyone recommend a solution for ensuring I am dealing with a unique file? I thought that filename + creation timestamp would be a good enough check, but obviously it is not. I don't have the ability to turn off the 'tunnelling' functionality that Windows is performing.
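For anyone hitting the same thing, here is a minimal repro sketch of the behaviour (assumes an NTFS volume; the tunnelling window is roughly 15 seconds by default):

using System;
using System.IO;
using System.Threading;

class TunnellingRepro
{
    static void Main()
    {
        string path = "Test1.tif";

        File.WriteAllText(path, "first");
        Console.WriteLine(new FileInfo(path).CreationTimeUtc.Ticks);

        File.Delete(path);
        Thread.Sleep(5000); // recreate the name within the tunnelling window

        File.WriteAllText(path, "second");
        // Prints the same ticks as above: NTFS "tunnels" the original
        // creation time to a file recreated with the same name in the
        // same directory shortly after deletion.
        Console.WriteLine(new FileInfo(path).CreationTimeUtc.Ticks);
    }
}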
Edit: I ended up having the service rename each file, appending a GUID to the name. The process that then consumed the files looked only for files with a GUID attached. This ensured that only unique files were processed.
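For reference, a minimal sketch of that rename step (RenameWithGuid is just an illustrative helper name, not the actual code from my service):

using System;
using System.IO;

// Rename a dropped file so its name is globally unique, sidestepping
// the creation-time tunnelling problem entirely.
static string RenameWithGuid(string filePath)
{
    string directory = Path.GetDirectoryName(filePath);
    string name = Path.GetFileNameWithoutExtension(filePath);
    string extension = Path.GetExtension(filePath);

    // e.g. Test1_3f2504e0-4f89-11d3-9a0c-0305e82c3301.tif
    string uniqueName = string.Format("{0}_{1}{2}", name, Guid.NewGuid(), extension);
    string uniquePath = Path.Combine(directory, uniqueName);

    File.Move(filePath, uniquePath);
    return uniquePath;
}

The consuming process then only picks up files whose names contain a GUID and ignores everything else in the drop folder.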