
I have the following sub:

  Private Sub Watcher_Changed(ByVal sender As System.Object, ByVal e As FileSystemEventArgs)
      If Path.GetExtension(e.Name) = ".p2p" Then
          Exit Sub
      End If

      Try
          ' Multiple change events can be raised. Check that the file hasn't already been moved.
          If Not File.Exists(e.FullPath) Then
              Exit Try
          End If

          ' Hand further processing off to a BackgroundWorker.
          ChangedFullPath = e.FullPath
          ChangedFileName = e.Name

          FileMover = New BackgroundWorker
          AddHandler FileMover.DoWork, New DoWorkEventHandler(AddressOf ProcessFile)
          FileMover.RunWorkerAsync()
      Catch ex As Exception
          MessageBox.Show(ex.Message)
      End Try
  End Sub

I'm still getting multiple changed-file notifications when a file is being uploaded by FTP.

I want to modify the Try so it also throws out the change notification if it has happened within the past (time) -- let's say 3 seconds. It ought to be trivial, but for some reason it isn't coming to me today, and my mind isn't wrapping itself around the answers I'm finding on Google.
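A minimal sketch of that time-window check (the field and function names are mine, not from the code above; adjust the 3-second window to taste):

  ' Hypothetical field: remembers when each file last raised an event.
  Private LastEventTimes As New Dictionary(Of String, DateTime)

  Private Function IsDuplicateEvent(ByVal fullPath As String) As Boolean
      Dim currentTime As DateTime = DateTime.Now
      Dim lastTime As DateTime
      ' Treat any event for the same file within 3 seconds of the previous one as a duplicate.
      If LastEventTimes.TryGetValue(fullPath, lastTime) AndAlso _
         (currentTime - lastTime).TotalSeconds < 3 Then
          Return True
      End If
      LastEventTimes(fullPath) = currentTime
      Return False
  End Function

Calling If IsDuplicateEvent(e.FullPath) Then Exit Sub at the top of Watcher_Changed would discard the repeats. Note that FileSystemWatcher raises its events on a thread-pool thread, so if more than one watcher shares the dictionary, wrap the access in a SyncLock.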

Thanks, Scott

+1  A: 

We had a similar situation; however, the system that wrote the files used a temp name while writing and renamed the file to its "real" name when it was done. Can your FTP client do something similar? If so, you can check the filename in the event by extension or prefix, and if it isn't the expected final filename format, ignore the notification.
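A sketch of that filter (the ".tmp" suffix is an assumption -- use whatever suffix your FTP client actually writes):

  Private Sub Watcher_Changed(ByVal sender As Object, ByVal e As FileSystemEventArgs)
      If Path.GetExtension(e.Name).Equals(".tmp", StringComparison.OrdinalIgnoreCase) Then
          Exit Sub ' still being uploaded under its temporary name
      End If
      ' ... safe to process e.FullPath here ...
  End Sub

Since the final step is a rename, it may also be cleaner to handle the watcher's Renamed event instead of Changed.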

Victor
+1  A: 

I ran into the exact same situation before, and ended up implementing a timer mechanism that waits for the file to "settle", i.e. for write events to stop coming in for x amount of time.

A bit kludgy, but it worked.
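The settle-timer approach can be sketched like this (a sketch only, assuming a form-level System.Timers.Timer; the names are mine):

  ' Every change event restarts the timer, so Elapsed fires only after 3 quiet seconds.
  Private WithEvents SettleTimer As New System.Timers.Timer(3000) With {.AutoReset = False}

  Private Sub Watcher_Changed(ByVal sender As Object, ByVal e As FileSystemEventArgs)
      SettleTimer.Stop()  ' push the deadline back on every new event
      SettleTimer.Start()
  End Sub

  Private Sub SettleTimer_Elapsed(ByVal sender As Object, ByVal e As Timers.ElapsedEventArgs) _
      Handles SettleTimer.Elapsed
      ' No events for 3 seconds -- the file has settled; process it here.
  End Sub

Keep in mind that Elapsed is raised on a thread-pool thread, so any UI updates from it need to be marshalled back with Invoke.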

+5  A: 

I created a service to upload files dumped to an FTP folder a couple years ago and here's some things I did that should help you get over your problem:

  1. I tested all the NotifyFilters and chose the one that cut out duplicate notifications (using only NotifyFilters.Size gave me reliable notifications for every file created, and eliminated nearly all duplicate notifications)
  2. In the Watcher_Changed event I ignored the file included in the event args and just processed all files currently in the folder; I try to add all files in the folder to a queue, and if they're already in the queue, I skip 3 below.
  3. I spin off a new thread for every unique new file found; my thread tries to acquire a lock on the file, and if it can't that means some other process is still accessing it (it's still being uploaded, etc.) so my thread sleeps and tries again after a short interval.
  4. When a file is completely processed, the last thing the thread does is move it to an archive folder, and then remove it from the queue, so it isn't accidentally processed again.

This seemed to work out for me and that service ran reliably uploading files for months until I left that job.
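Step 3 above -- acquiring an exclusive lock to detect when the upload is finished -- can be sketched like this (a VB.NET translation with names and retry limits of my own; the original service may have differed):

  ' Try to open the file exclusively; if another process (the FTP server)
  ' still has it open, wait a second and retry, up to 30 attempts.
  Private Function WaitForFileReady(ByVal filePath As String) As Boolean
      For attempt As Integer = 1 To 30
          Try
              Using fs As New FileStream(filePath, FileMode.Open, FileAccess.Read, FileShare.None)
                  Return True ' exclusive open succeeded; the upload is finished
              End Using
          Catch ex As IOException
              System.Threading.Thread.Sleep(1000) ' still locked; retry shortly
          End Try
      Next
      Return False ' gave up after roughly 30 seconds
  End Function

The FileShare.None argument is what makes the open fail while anyone else has the file open, which is exactly the signal you want here.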

sliderhouserules
A: 

One thing to note is that merely accessing the file can change one of its properties, such as the last-accessed time. That can then fire off another event if you don't have the notification filters set up right.
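For example (the folder path is just a placeholder), restricting the watcher to size changes -- as suggested in the answer above -- avoids events caused by reads updating the last-accessed time:

  Dim watcher As New FileSystemWatcher("C:\ftp\incoming")
  watcher.NotifyFilter = NotifyFilters.Size ' ignore LastAccess/Attributes changes
  watcher.EnableRaisingEvents = True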

Scott Whitlock
+1  A: 

I had this same issue: I was saving a file that was uploaded through code, and the delete-then-create would fire the event twice each time. All I had to do was check that the file existed before doing my processing.

private void fsw_Created(object sender, FileSystemEventArgs e)
{
   if (File.Exists(e.FullPath))
   {
      //process the file here
   }
}
Matt Palmerlee