I have two data sources: a legacy one (a web service) and a database. When I process a request, I make changes to both. In case of an error, I want to roll back both.

try
{
  legacy.Begin(); db.Begin();
  legacy.MakeChanges(); db.MakeChanges();
}
catch (Exception)
{
  legacy.Rollback(); db.Rollback();
}

The problem is: what if legacy throws during Rollback (e.g. a network error)? Then db.Rollback() won't be executed, and vice versa. The solution I see is:

legacy.Begin(); 
try
{
  db.Begin();
  try
  {
      legacy.MakeChanges(); db.MakeChanges();
  }
  catch (Exception)
  {
    db.Rollback();
    throw;
  }
}
catch (Exception)
{
  legacy.Rollback(); 
  throw;
}

Is it acceptable? Is there a better solution?

+1  A: 

You are effectively rolling your own two-phase commit (2PC) transaction. A completely robust solution requires cooperation from both resources and is usually best done with a 2PC-capable transaction manager.
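For illustration only: in the .NET world the transaction manager role is played by System.Transactions (escalating to MSDTC), and a sketch of what "cooperation from both resources" would look like is below. The db and legacy objects are the question's placeholders, and the whole thing only works if the web service can actually enlist in the transaction:

// Sketch, assuming both resources can enlist in the ambient transaction.
// A SqlConnection opened inside the scope enlists automatically; the legacy
// web service would need WS-AtomicTransaction support to take part.
using (var scope = new TransactionScope())
{
    db.MakeChanges();
    legacy.MakeChanges();
    scope.Complete();   // vote to commit; if an exception escapes, both roll back
}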

The fundamental problem here is that you have no protection against the failure of your own application (it is playing the role of the transaction manager) and hence no guarantee that you can roll back the legacy resource. Consider a failure of your app just at the point where it was about to call legacy.Rollback(). You now have no record that you were in the middle of a "transaction", so when your app comes back up it has no reason to go and execute that Rollback.

Possible approaches:

1). Use true 2PC. This is only possible if your web service has transactional capabilities (technically possible but practically unlikely).

2). Tolerate the risk of some inconsistency. Many systems actually do this inadvertently.

3). Attempt the kind of recovery you are going for, but accept that it may fail. Add some form of audit trail that permits detection of uncertain outcomes and hence allows manual patch-up later; in other words, add audit logging that lets you detect mismatched activity between the two systems (see the sketch after this list). This can be seen as a very manual 2PC approach.
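To make option 3 concrete, here is a minimal sketch. The audit log calls (WriteIntent, MarkCompleted, MarkUncertain) are hypothetical names, not a real library, and legacy/db are the question's placeholder objects:

// Record intent durably BEFORE touching either resource, so a crash
// mid-flight still leaves evidence that reconciliation may be needed.
var auditId = auditLog.WriteIntent("process request 123");   // hypothetical API
try
{
    legacy.Begin(); db.Begin();
    legacy.MakeChanges(); db.MakeChanges();
    legacy.Commit(); db.Commit();
    auditLog.MarkCompleted(auditId);        // both sides known good
}
catch (Exception ex)
{
    auditLog.MarkUncertain(auditId, ex);    // outcome unknown; flag for review
    // Guard each rollback so one failing doesn't prevent the other.
    try { legacy.Rollback(); } catch (Exception e) { auditLog.MarkUncertain(auditId, e); }
    try { db.Rollback(); }     catch (Exception e) { auditLog.MarkUncertain(auditId, e); }
    throw;
}
// A periodic job (or a person) scans for records that are Uncertain or were
// never marked Completed and patches the two systems up by hand.

The point is not the exact API but that the intent record survives an application crash, which the in-memory try/catch alone cannot do.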

djna
I control the web service and its sources, but only to the degree that legacy compatibility is not broken. However, I don't think I can invest in full 2PC support (because of the schedule); I think we can live with #2 (and fix things manually if needed). I'm also very interested in the 2PC concept; can you recommend a good read on it? Google shows many links, mostly about Java, while I'd prefer general or C#-related reading.
queen3
A starting point in the MS world is here: http://msdn.microsoft.com/en-us/library/aa213077%28SQL.80%29.aspx. The web service capability to participate in true 2PC is WS-AtomicTransaction: http://msdn.microsoft.com/en-us/library/ms733943.aspx
djna
Well, thanks for the info, it is useful. But I still think we'll go with #3; this is not a mission-critical system, and we can fix things manually once a month (or even once a week). Implementing a transaction manager, etc. is definitely out of budget.
queen3