One of my co-workers is building a C# Windows app that needs to grab a set of data and then alter that data row by row. If the system encounters a problem at any step in the process, it needs to roll back all of the changes.

The process he has created works well when dealing with smaller sets of data, but as soon as the number of rows gets larger, it starts to puke.

The process of altering the data needs to happen in the Windows app. What is the best way to handle large data changes atomically in a Windows app?

Thank you.

Edit-

We are using a background thread with this process.

I apologize - I don't think I articulated the quandary we're in. We are using transactions with the system right now, but I don't know if we're using them effectively, so I'll definitely review the notes below.

We were thinking that we could spin off additional worker threads to get the work done more quickly, but we assumed we would lose our atomicity. Then we were thinking that maybe we could pull all the data into a DataTable, make the changes in that object, and then persist the data back to the database.

So I was just going to see if someone had a brilliant way to handle this type of situation. Thanks for the comments so far.
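
Roughly what we had in mind for the DataTable idea is sketched below - the table and column names, the connection string, and Transform() are only placeholders, not our real code:

    using System.Data;
    using System.Data.SqlClient;

    // Sketch: pull the rows into a DataTable, change them in memory,
    // then push all of the changes back inside a single transaction.
    static void UpdateViaDataTable(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            var adapter = new SqlDataAdapter("SELECT Id, Value FROM MyTable", connection);
            var table = new DataTable();
            adapter.Fill(table);

            // Make every change in memory first.
            foreach (DataRow row in table.Rows)
                row["Value"] = Transform(row["Value"]);

            // Then persist the whole batch atomically.
            using (var transaction = connection.BeginTransaction())
            {
                var update = new SqlCommand(
                    "UPDATE MyTable SET Value = @value WHERE Id = @id", connection, transaction);
                update.Parameters.Add("@value", SqlDbType.NVarChar, 50, "Value");
                update.Parameters.Add("@id", SqlDbType.Int, 0, "Id");
                adapter.UpdateCommand = update;

                try
                {
                    adapter.Update(table); // writes every modified row
                    transaction.Commit();
                }
                catch
                {
                    transaction.Rollback();
                    throw;
                }
            }
        }
    }

    static object Transform(object currentValue)
    {
        return currentValue; // placeholder for the real per-row logic
    }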

A: 

As the problem is not described in much detail in this question, I have to start with the "Is it plugged in?" type of question.

If you haven't done so already, spin potentially long-running operations off onto their own BackgroundWorker thread rather than blocking the UI thread.
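
A minimal sketch of the pattern, assuming a WinForms form with a single button; ProcessAllRows() is a placeholder for whatever the app actually does with the rows:

    using System;
    using System.ComponentModel;
    using System.Windows.Forms;

    public class MainForm : Form
    {
        private readonly BackgroundWorker worker = new BackgroundWorker();
        private readonly Button startButton = new Button { Text = "Run update" };

        public MainForm()
        {
            Controls.Add(startButton);
            startButton.Click += (s, e) => worker.RunWorkerAsync(); // kick the work off the UI thread

            worker.DoWork += (s, e) => ProcessAllRows();            // runs on a thread-pool thread

            worker.RunWorkerCompleted += (s, e) =>                  // back on the UI thread
            {
                if (e.Error != null)
                    MessageBox.Show("Update failed: " + e.Error.Message);
                else
                    MessageBox.Show("Update completed.");
            };
        }

        private void ProcessAllRows()
        {
            // placeholder for the existing row-by-row, transactional update
        }
    }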

Steve Gilham
+1  A: 

I would suggest that you take a look at .NET's transactional model.
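
For example, a minimal sketch using a System.Transactions TransactionScope - the table, column, and method names here are placeholders, not your actual schema:

    using System.Collections.Generic;
    using System.Data.SqlClient;
    using System.Transactions;

    // Every command run on a connection opened inside the scope enlists in the
    // same ambient transaction; if anything throws before Complete(), nothing commits.
    static void UpdateAllOrRollBack(string connectionString, IEnumerable<KeyValuePair<int, string>> changes)
    {
        using (var scope = new TransactionScope())
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            foreach (var change in changes) // Key = row id, Value = new value
            {
                using (var command = new SqlCommand(
                    "UPDATE MyTable SET Value = @value WHERE Id = @id", connection))
                {
                    command.Parameters.AddWithValue("@id", change.Key);
                    command.Parameters.AddWithValue("@value", change.Value);
                    command.ExecuteNonQuery();
                }
            }

            scope.Complete(); // commit only if every row succeeded
        }
    }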

Here is some additional reading that may be helpful as well:

Andrew Hare
Indeed, this is right up the alley of transactions.
Matt Hamsmith
A: 

We had a similar situation in dealing with mass updates of a large database. The solution we used was:

  1. Step through each record in the database. If it needs updating, create a dummy record with a new primary key.
  2. Update the dummy record from the original, storing the original's PK in the dummy.
  3. When you have created dummy records for all updates, lock the database and replace all foreign-key references to the updated records with the primary keys of their dummy records. Unlock the database.
  4. Delete all records with the old primary keys.

Note that this technique works best if there is only a single foreign-key reference to each primary key.
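
A very rough sketch of the idea, assuming a single parent table "Items" (with an extra OldId column to carry the original key) referenced by one child table "Orders.ItemId"; all of the table, column, and method names and the NeedsUpdate flag are placeholders, and a transaction stands in here for the explicit lock/unlock step:

    using System.Data.SqlClient;

    static void SwapInDummyRecords(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (var transaction = connection.BeginTransaction())
            {
                // 1-2. Create a dummy copy of every record that needs changing,
                //      carrying the original's primary key along in OldId.
                //      (The per-row changes to the copies would be applied here as well.)
                Execute(connection, transaction,
                    "INSERT INTO Items (Value, OldId) " +
                    "SELECT Value, Id FROM Items WHERE NeedsUpdate = 1");

                // 3. Repoint the foreign keys at the dummy records.
                Execute(connection, transaction,
                    "UPDATE o SET o.ItemId = i.Id " +
                    "FROM Orders o JOIN Items i ON i.OldId = o.ItemId");

                // 4. Delete the records with the old primary keys.
                Execute(connection, transaction,
                    "DELETE FROM Items WHERE Id IN " +
                    "(SELECT OldId FROM Items WHERE OldId IS NOT NULL)");

                transaction.Commit();
            }
        }
    }

    static void Execute(SqlConnection connection, SqlTransaction transaction, string sql)
    {
        using (var command = new SqlCommand(sql, connection, transaction))
            command.ExecuteNonQuery();
    }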

Dour High Arch