I've created a program that moves files between various directories. An issue I've come across is that when you try to move a file that some other program is still using, you get an error. Leaving the file behind isn't an option, so all I can think of is to keep retrying the move over and over again. This slows the entire program down, so I create a new thread to deal with the problem file and move on to the next one. The bigger problem is when there are too many of these problem files: the program ends up with so many threads trying to move them that it just crashes with some kernel.dll error. Here's a sample of the code I use to move the files:

    Public Sub MoveIt()
        Try
            File.Move(_FileName, _CopyToFileName)
        Catch ex As Exception
            Threading.Thread.Sleep(5000)
            MoveIt()
        End Try
    End Sub

As you can see, I try to move the file, and if it errors, I wait and try again, over and over. I've tried using FileInfo as well, but that crashes WAY sooner than just using the File class.

So has anyone found a foolproof way of moving files without it ever erroring?

Note: it takes a lot of files to make it crash. It'll be fine over the weekend, but by the end of the day on Monday, it's done.

UPDATE!!!

I appreciate all the ideas so far. Perhaps I should give more information about what I'm doing.

This is all done in a Windows service. The files MUST be moved; there's no way I can leave any behind, which is why I must try OVER and OVER again to move them. The files are used to import data into various databases, and there is NO user to tell if a file cannot be moved. Also, this program processes THOUSANDS of files a day.

So with that said: how can I have an efficient program that moves files without any user interaction and guarantees that all the files get moved? The programs that create these files do eventually give up their hold on them. They get created by FTP, BizTalk, and various other services.

A: 

You could try using ThreadPool.QueueUserWorkItem to queue up the work; that may keep your threads from getting out of control.
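A rough sketch of what that could look like (the `MoveWorker` module and the source/destination pair are made up for illustration; the 5-second sleep mirrors the code in the question):

    Imports System.IO
    Imports System.Threading

    Module MoveWorker
        ' Hand each problem file to the thread pool instead of
        ' creating a dedicated Thread per file.
        Public Sub QueueMove(ByVal paths As String())
            ThreadPool.QueueUserWorkItem(AddressOf TryMove, paths)
        End Sub

        Private Sub TryMove(ByVal state As Object)
            Dim paths = DirectCast(state, String())
            Try
                File.Move(paths(0), paths(1))
            Catch ex As IOException
                ' Still locked: wait briefly, then put the file back
                ' on the queue rather than looping on this pool thread.
                Thread.Sleep(5000)
                QueueMove(paths)
            End Try
        End Sub
    End Module

The pool caps the number of concurrent worker threads for you, so a pile-up of locked files queues work items instead of spawning threads.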

David
There's just a single thread here; I don't see the advantage of QueueUserWorkItem?
Sander Rijken
"This though slows the entire program down, so I create a new thread and let it deal with the problem file and move on to the next. The bigger problem is when you have too many of these problem files and the program now has so many threads trying to move these files, that it just crashes with some kernel.dll error." You specifically said you had many threads?
David
Hmm, I'll look into QueueUserWorkItem.
Dan
+2  A: 

Windows is not Unix, so you can't expect to be able to move an open file. If it's in use, moving it is not possible. You can, however, copy files even while they are in use, unless the process that opened the file has expressly prohibited this. I'm not sure what data guarantees can be made for files that have been opened for writing; my best guess is that you have to know whether what you are doing is safe. For instance, reading a log file that is being appended to would be safe, but reading a file that is open for random access would not.

My recommendation would be to make a list of files that you were unable to move, optionally copying those that you can and giving the user an option to manually retry the failed files at some point.
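A minimal sketch of that bookkeeping (the module and method names are illustrative, not part of any existing API):

    Imports System.IO
    Imports System.Collections.Generic

    Module FailedMoveLog
        ' Remember files that could not be moved so they can be
        ' retried, or shown to the user, later.
        Private ReadOnly Failed As New List(Of String)

        Public Sub MoveOrRecord(ByVal source As String, ByVal destination As String)
            Try
                File.Move(source, destination)
            Catch ex As IOException
                Failed.Add(source)   ' in use or otherwise locked; retry later
            End Try
        End Sub

        Public Function FailedFiles() As IEnumerable(Of String)
            Return Failed
        End Function
    End Module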

Morten Mertner
I updated my post. There aren't any users, and all the files MUST be moved; none can be left behind.
Dan
A: 

Like @Morten said, you should check why the move operation fails in the first place, attempt to detect the cause and notify the user, and do something smarter than just retrying.

Regarding your code:

You shouldn't make the call recursive in the first place. If a file stays locked for a long time, your stack grows and grows, possibly ending in a StackOverflowException, depending on how long it runs and the actual value of the timeout. Retrying should be done inside a loop instead, one that gives up when it doesn't succeed after N tries. Something like:

    Public Sub MoveIt()
        Dim succeeded As Boolean = False
        Dim numberOfTries As Integer = 0

        While Not succeeded AndAlso numberOfTries < 10
            Try
                File.Move(_FileName, _CopyToFileName)
                succeeded = True
            Catch ex As Exception
                Threading.Thread.Sleep(5000)
                numberOfTries += 1
            End Try
        End While
    End Sub

Please note that, done this way, it takes 5 × 10 = 50 seconds(!) to find out that a file really cannot be moved, and you still have to consult the user. I don't think there's much point in retrying the move this way.

Sander Rijken
there isn't a user to consult. It's a windows service.
Dan
+2  A: 

There are a number of improvements I can suggest:

  1. Use a loop instead of recursion, so you don't get to see the name of this website (StackOverflowException).
  2. After each attempt you should show the user what exactly went wrong. There is a bunch of exceptions you could check for: SecurityException, UnauthorizedAccessException, FileNotFoundException, DirectoryNotFoundException, and others. In the worst case, some of these will keep your loop running infinitely until the user says stop trying.
  3. If your move process has to run without user interaction, you can create rules for deciding whether to keep trying, based on the type of exception you get.
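A sketch of such a rule; which exceptions count as "retryable" is an assumption here and should be tuned to your environment:

    Imports System.IO
    Imports System.Security

    Module RetryRules
        ' Decide whether a failed move is worth retrying.
        Public Function ShouldRetry(ByVal ex As Exception) As Boolean
            ' Permanent failures: retrying cannot help.
            If TypeOf ex Is FileNotFoundException Then Return False
            If TypeOf ex Is DirectoryNotFoundException Then Return False
            ' Permission problems need an administrator, not a retry.
            If TypeOf ex Is SecurityException Then Return False
            If TypeOf ex Is UnauthorizedAccessException Then Return False
            ' Sharing violations surface as IOException; those may clear up.
            Return TypeOf ex Is IOException
        End Function
    End Module

Note the specific exceptions are tested before the plain IOException check, since FileNotFoundException and DirectoryNotFoundException both derive from IOException.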
Hun1Ahpu
I added an update to my post. I can try using a loop; there are no users to show anything to, and I MUST move every single file. No file can get left behind.
Dan
You can try to split this into two threads: one for copying files and one for deleting them. You'd have to keep track of which files still need to be copied and which deleted, but this way you don't have to retry the delete as often.
Hun1Ahpu
+1  A: 

Why not create a queue of files that need to be moved? As your service discovers new files to move (assuming it does some kind of continuous scanning; you didn't mention this part), add them to the queue, which is then processed by a second thread that continually takes the file at the head of the queue and attempts to move it. If the move fails because the file is locked, just re-insert it at the back of the queue and continue. That way you only have two threads to worry about, and if the files do get released eventually, all should be well.

I would also consider tagging each file with the time it has spent in the queue and the number of move attempts made; once a threshold is reached (e.g. unable to move for 3 hours / 20 attempts), send an e-mail alert to an appropriate person.
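A minimal sketch of that producer/consumer setup (the `FileMover` class, its field names, and the one-second idle sleep are all made up for illustration; the alert threshold is left out):

    Imports System.IO
    Imports System.Collections.Generic
    Imports System.Threading

    Public Class FileMover
        Private ReadOnly _queue As New Queue(Of String)
        Private ReadOnly _lock As New Object
        Private ReadOnly _copyTo As String

        Public Sub New(ByVal copyTo As String)
            _copyTo = copyTo
        End Sub

        ' Producer: the scanning thread adds newly discovered files here.
        Public Sub Enqueue(ByVal fileName As String)
            SyncLock _lock
                _queue.Enqueue(fileName)
            End SyncLock
        End Sub

        ' Consumer: a single worker thread drains the queue.
        Public Sub ProcessQueue()
            While True
                Dim fileName As String = Nothing
                SyncLock _lock
                    If _queue.Count > 0 Then fileName = _queue.Dequeue()
                End SyncLock

                If fileName Is Nothing Then
                    Thread.Sleep(1000)          ' nothing to do yet
                    Continue While
                End If

                Try
                    File.Move(fileName, Path.Combine(_copyTo, Path.GetFileName(fileName)))
                Catch ex As IOException
                    Enqueue(fileName)           ' still locked: back of the queue
                End Try
            End While
        End Sub
    End Class

Because a locked file goes to the back of the queue instead of blocking a retry loop, other files keep flowing while the stubborn one waits its next turn.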

Jon M
hmmm.. I like this idea.. I'll try it out.. and let you know how it goes..
Dan
That worked.. and was the best idea for my situation. Thanks!
Dan
Glad I could help! Might be a nice gesture to leave an upvote for this and the other answers that people took the time to write.
Jon M