views:

769

answers:

4

I've never been completely happy with the way exception handling works. There's a lot that exceptions and try/catch bring to the table (stack unwinding, etc.), but they seem to break a lot of the OO model in the process.

Anyway, here's the problem:

Let's say you have some class which wraps or includes networked file IO operations (e.g. reading and writing to some file at some particular UNC path somewhere). For various reasons you don't want those IO operations to fail, so if you detect that they fail you retry them and you keep retrying them until they succeed or you reach a timeout. I already have a convenient RetryTimer class which I can instantiate and use to sleep the current thread between retries and determine when the timeout period has elapsed, etc.
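
For reference, RetryTimer is nothing exotic; a minimal sketch of such a class (the member names match the usage below, the fixed retry interval is an assumption):

    public class RetryTimer
    {
        private readonly DateTime timeoutTime;
        private readonly TimeSpan retryInterval = TimeSpan.FromSeconds(30); // assumed interval

        public RetryTimer(TimeSpan timeout)
        {
            this.timeoutTime = DateTime.UtcNow + timeout;
        }

        public bool HasExceededRetryTimeout
        {
            get { return DateTime.UtcNow >= this.timeoutTime; }
        }

        public void SleepUntilNextRetry()
        {
            System.Threading.Thread.Sleep(this.retryInterval);
        }
    }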

The problem is that you have a bunch of IO operations in several methods of this class, and you need to wrap each of them in try-catch / retry logic.

Here's an example code snippet:

    RetryTimer fileIORetryTimer = new RetryTimer(TimeSpan.FromHours(10));
    bool success = false;
    while (!success)
    {
        try
        {
            // do some file IO which may succeed or fail
            success = true;
        }
        catch (IOException e)
        {
            if (fileIORetryTimer.HasExceededRetryTimeout)
            {
                throw e;
            }
            fileIORetryTimer.SleepUntilNextRetry();
        }
    }

So, how do you avoid duplicating most of this code for every file IO operation throughout the class? My solution was to use anonymous delegate blocks and a single method in the class which executed the delegate block passed to it. This allowed me to do things like this in other methods:

    this.RetryFileIO( delegate()
    {
        // some code block
    } );
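
For context, that single method is essentially the retry loop above wrapped around a delegate invocation; a minimal sketch, with FileIOOperation as a made-up name for the parameterless delegate type:

    private delegate void FileIOOperation();

    private void RetryFileIO(FileIOOperation operation)
    {
        RetryTimer fileIORetryTimer = new RetryTimer(TimeSpan.FromHours(10));
        bool success = false;
        while (!success)
        {
            try
            {
                operation();
                success = true;
            }
            catch (IOException)
            {
                if (fileIORetryTimer.HasExceededRetryTimeout)
                {
                    throw;
                }
                fileIORetryTimer.SleepUntilNextRetry();
            }
        }
    }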

I like this somewhat, but it leaves a lot to be desired. I'd like to hear how other people would solve this sort of problem.

+2  A: 

Just wondering, what do you feel your method leaves to be desired? You could replace the anonymous delegate with a named delegate, something like:

    public delegate void IoOperation(params string[] parameters);

    public void FileDeleteOperation(params string[] fileNames)
    {
        File.Delete(fileNames[0]);
    }

    public void FileCopyOperation(params string[] fileNames)
    {
        File.Copy(fileNames[0], fileNames[1]);
    }

    public void RetryFileIO(IoOperation operation, params string[] parameters)
    {
        RetryTimer fileIORetryTimer = new RetryTimer(TimeSpan.FromHours(10));
        bool success = false;
        while (!success)
        {
            try
            {
                operation(parameters);
                success = true;
            }
            catch (IOException)
            {
                if (fileIORetryTimer.HasExceededRetryTimeout)
                {
                    throw;
                }
                fileIORetryTimer.SleepUntilNextRetry();
            }
        }
    }

    public void Foo()
    {
        this.RetryFileIO(FileDeleteOperation, @"L:\file.to.delete");
        this.RetryFileIO(FileCopyOperation, @"L:\file.to.copy.source", @"L:\file.to.copy.destination");
    }
Chris Marasti-Georg
+4  A: 

This looks like an excellent opportunity to have a look at Aspect Oriented Programming. Here is a good article on AOP in .NET. The general idea is that you'd extract the cross-cutting concern (i.e. retry for x hours) into a separate class, and then you'd annotate any methods that need to modify their behaviour in that way. Here's how it might look (with a nice extension method on Int32):

    [RetryFor( 10.Hours() )]
    public void DeleteArchive()
    {
        // .. code to just delete the archive
    }
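
For reference, the 10.Hours() call is just an extension method on Int32; a minimal sketch (the containing class name is made up here):

    public static class TimeSpanExtensions
    {
        // Lets you write 10.Hours() instead of TimeSpan.FromHours(10).
        public static TimeSpan Hours(this int value)
        {
            return TimeSpan.FromHours(value);
        }
    }

Note that attribute arguments in C# must be compile-time constants, so a real RetryFor attribute would more likely take the number of hours as a plain int; the 10.Hours() form above is illustrative.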

                  
Wolfbyte
+1  A: 

You could also use a more OO approach:

  • Create a base class that does the error handling and calls an abstract method to perform the concrete work. (Template Method pattern)
  • Create concrete classes for each operation.

This has the advantage of naming each type of operation you perform, and it gives you a Command pattern: operations are represented as objects.
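
A minimal sketch of that approach, reusing the RetryTimer from the question and using hypothetical names (RetryingFileOperation as the Template Method base class, DeleteFileOperation as one concrete command):

    // Template Method: the base class owns the retry/error handling,
    // subclasses supply the concrete IO work.
    public abstract class RetryingFileOperation
    {
        private readonly TimeSpan timeout;

        protected RetryingFileOperation(TimeSpan timeout)
        {
            this.timeout = timeout;
        }

        public void Execute()
        {
            RetryTimer retryTimer = new RetryTimer(timeout);
            while (true)
            {
                try
                {
                    DoWork();   // the abstract step
                    return;
                }
                catch (IOException)
                {
                    if (retryTimer.HasExceededRetryTimeout)
                    {
                        throw;
                    }
                    retryTimer.SleepUntilNextRetry();
                }
            }
        }

        protected abstract void DoWork();
    }

    // One concrete class per kind of IO work (Command pattern).
    public class DeleteFileOperation : RetryingFileOperation
    {
        private readonly string path;

        public DeleteFileOperation(string path)
            : base(TimeSpan.FromHours(10))
        {
            this.path = path;
        }

        protected override void DoWork()
        {
            File.Delete(path);
        }
    }

Calling Execute() on a concrete operation then runs the work with the retry behaviour applied.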

Andrew Peters
+1  A: 

Here's what I did recently. It has probably been done better elsewhere, but it seems pretty clean and reusable.

I have a utility method that looks like this:

    public delegate void WorkMethod();

    static public void DoAndRetry(WorkMethod wm, int maxRetries)
    {
        int curRetries = 0;
        do
        {
            try
            {
                // Attempt the work; if it completes without throwing, we're done.
                wm.Invoke();
                return;
            }
            catch (Exception e)
            {
                // Otherwise count the failure and either give up or loop around again.
                curRetries++;
                if (curRetries > maxRetries)
                {
                    throw new Exception("Maximum retries reached", e);
                }
            }
        } while (true);
    }

Then in my application, I use C#'s lambda expression syntax to keep things tidy:

Utility.DoAndRetry( () => ie.GoTo(url), 5);

This calls my method and retries up to 5 times. Once the retries are exhausted, the original exception is rethrown wrapped inside a "maximum retries reached" exception.

Andrej Kyselica
But why the custom `WorkMethod` delegate instead of `Action`?
Dan Tao