I have half a million records in a DataSet, of which 50,000 have been updated. I now need to commit the updated records back to a SQL Server 2005 database.

What is the most efficient way to do this, given that such updates could be frequent? (Concurrency is not an issue, but performance is.)

+4  A: 

I would use a Batch Update: with ADO.NET 2.0, a SqlDataAdapter can send the changed rows to the server in batches rather than issuing one round trip per row.
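
For illustration, here's a minimal sketch of that approach, assuming a hypothetical Records table with an Id primary key and a Value column (names not from the question):

    using System.Data;
    using System.Data.SqlClient;

    class BatchUpdateExample
    {
        static void CommitChanges(DataSet dataSet, string connectionString)
        {
            using (var connection = new SqlConnection(connectionString))
            using (var adapter = new SqlDataAdapter(
                "SELECT Id, Value FROM Records", connection))
            {
                // Let the command builder generate the UPDATE command
                // from the SELECT statement (requires a primary key).
                var builder = new SqlCommandBuilder(adapter);
                adapter.UpdateCommand = builder.GetUpdateCommand();

                // Required when batching: don't refresh rows from the server.
                adapter.UpdateCommand.UpdatedRowSource = UpdateRowSource.None;

                // Send up to 500 rows per round trip instead of one at a time.
                adapter.UpdateBatchSize = 500;

                // Sends the Modified rows as batched UPDATEs
                // (the question only involves updates).
                adapter.Update(dataSet, "Records");
            }
        }
    }

Setting UpdateBatchSize to 0 tells the adapter to use the largest batch size the server can handle; the best value is workload-dependent, so it's worth measuring.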

David Stratton
+1 - @MSIL - I use this approach on a larger scale (100s of thousands or millions of rows) and it works well for me.
AdaTheDev
Excellent - this is indeed what I needed. Many thanks, Dave.
MSIL
+2  A: 

I agree with David's answer, as that's what I use. However, there is an alternative approach worth considering (all situations are different, after all) - it's something I would look at in the future if I had another similar requirement.

You could bulk insert the updated records into a new staging table in the database using SqlBulkCopy, which is an extremely fast way of loading data. Then run an UPDATE statement on your main table to pull in the updated values from that staging table, which you drop at the end - see the sketch below.
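
A sketch of this staging-table approach, using the same hypothetical Records/Id/Value names as above:

    using System.Data;
    using System.Data.SqlClient;

    class BulkCopyUpdateExample
    {
        static void CommitChanges(DataTable table, string connectionString)
        {
            // Only the changed rows need to travel over the wire.
            DataTable changes = table.GetChanges(DataRowState.Modified);
            if (changes == null) return;

            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();

                // 1. Create a staging table matching the updated columns.
                new SqlCommand(
                    "CREATE TABLE RecordsStaging (Id INT PRIMARY KEY, Value NVARCHAR(100))",
                    connection).ExecuteNonQuery();

                // 2. Bulk-load the changed rows - by far the fastest insert path.
                using (var bulk = new SqlBulkCopy(connection))
                {
                    bulk.DestinationTableName = "RecordsStaging";
                    bulk.ColumnMappings.Add("Id", "Id");
                    bulk.ColumnMappings.Add("Value", "Value");
                    bulk.WriteToServer(changes);
                }

                // 3. One set-based UPDATE, then drop the staging table.
                new SqlCommand(
                    @"UPDATE r SET r.Value = s.Value
                      FROM Records r
                      JOIN RecordsStaging s ON r.Id = s.Id;
                      DROP TABLE RecordsStaging;",
                    connection).ExecuteNonQuery();
            }
        }
    }

The win here is that the row-by-row work happens inside the server as one set-based statement, rather than as 50,000 individual UPDATEs.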

The batched-update approach using SqlDataAdapter also allows you to deal with errors on specific rows easily (e.g. you can tell it to continue when an individual updated row fails, so one bad row doesn't stop the whole process).
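
In ADO.NET that's the adapter's ContinueUpdateOnError property; continuing the hypothetical sketch from the first answer:

    // Don't abort the whole update when one row fails; the error is
    // recorded on the offending row(s) instead of throwing.
    adapter.ContinueUpdateOnError = true;
    adapter.Update(dataSet, "Records");

    // Inspect any rows that failed.
    foreach (DataRow row in dataSet.Tables["Records"].GetErrors())
    {
        Console.WriteLine("Row {0} failed: {1}", row["Id"], row.RowError);
    }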

AdaTheDev
+1. That's a method I've used successfully as well... It's not as quick and easy as a Batch Update, but it works well and it's not too much more work.
David Stratton
I'd say this is the second-best approach, one to fall back on if the data grows to the point where performance becomes a big problem. Thanks a ton!!
MSIL