My application has the following procedure.

A database of products (20,000 rows) exists.

Our client has an 'import' feature where they upload an Excel file.

This is implemented by deleting all rows in the products table and then performing the import, which takes a long time because we programmatically perform calculations on the data.

The obvious problem is that if the 'import' action fails (I/O problems and the like), they are left with no, partial, or corrupt data in the products table.

We want the original data to remain intact if the 'import' operation fails.

This is an ASP.NET application, written in C#, using SQL Server 2005 and typed DataSets (XSD) created with the VS2005 design tools.

+1  A: 

I would import the data into a table with the same structure as your products table, and then replace the data in your products table once you're happy the import has been successful.

This means users can carry on using the system while the import is underway, and it minimizes the downtime while you update the products table.
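
As a rough illustration of this approach, here is a minimal C# sketch. The Products_Staging table, the connection handling, and the LoadExcelIntoStaging placeholder are assumptions made for this example, not anything from the question:

    using System.Data.SqlClient;

    public static class StagingImport
    {
        public static void Run(string connectionString)
        {
            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open();

                // Clear and fill the staging table; users keep reading Products
                // while the slow Excel parsing and calculations run here.
                using (var clear = new SqlCommand("DELETE FROM Products_Staging", conn))
                    clear.ExecuteNonQuery();
                LoadExcelIntoStaging(conn); // placeholder for the existing import logic

                // Swap the data in one short transaction: Products is locked only
                // for the brief copy, not for the whole import.
                using (SqlTransaction tx = conn.BeginTransaction())
                {
                    try
                    {
                        using (var del = new SqlCommand("DELETE FROM Products", conn, tx))
                            del.ExecuteNonQuery();
                        using (var copy = new SqlCommand(
                            "INSERT INTO Products SELECT * FROM Products_Staging", conn, tx))
                            copy.ExecuteNonQuery();
                        tx.Commit();
                    }
                    catch
                    {
                        tx.Rollback(); // the original Products rows are untouched
                        throw;
                    }
                }
            }
        }

        private static void LoadExcelIntoStaging(SqlConnection conn)
        {
            // The existing Excel-parsing and calculation code would insert
            // into Products_Staging here.
        }
    }

A variation on the same idea is to swap the tables themselves (for example via sp_rename) instead of copying the rows, which makes the final switch nearly instantaneous.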

Bravax
+4  A: 
  1. Start a transaction
  2. Delete the existing data from the table
  3. Insert the new data
  4. If no problem occurs, commit the changes; otherwise roll back (see the sketch below)
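
A minimal ADO.NET sketch of these four steps. The Products table name and the InsertImportedRows placeholder are illustrative assumptions:

    using System.Data.SqlClient;

    public static class TransactionalImport
    {
        public static void Run(string connectionString)
        {
            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open();
                using (SqlTransaction tx = conn.BeginTransaction()) // 1. start transaction
                {
                    try
                    {
                        using (var del = new SqlCommand("DELETE FROM Products", conn, tx))
                            del.ExecuteNonQuery();                  // 2. delete old rows

                        InsertImportedRows(conn, tx);               // 3. insert new data

                        tx.Commit();                                // 4. commit on success
                    }
                    catch
                    {
                        tx.Rollback(); // any failure leaves the original rows intact
                        throw;
                    }
                }
            }
        }

        private static void InsertImportedRows(SqlConnection conn, SqlTransaction tx)
        {
            // The Excel parsing and per-row calculations would run here, with
            // every INSERT enlisted in tx so the whole import is atomic.
        }
    }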
Akshay
+1  A: 

Using a transaction would be the obvious choice here.

I guess the first thing I would ask is: do you really need to clear the entire table? Is there a timestamp or something you could use to limit the amount of data that needs to be refreshed? Could you rework the logic to use updates instead of all the deletes and inserts? That way your transactions would be smaller.
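
For example, here is a rough sketch of an update-based refresh; since SQL Server 2005 has no MERGE statement, it uses an UPDATE followed by an INSERT for the missing rows. The Products_Staging table, the column names, and the ProductId key are assumptions for illustration:

    using System.Data.SqlClient;

    public static class UpsertImport
    {
        public static void UpsertFromStaging(SqlConnection conn, SqlTransaction tx)
        {
            // Update the rows that already exist in Products...
            const string updateSql = @"
                UPDATE p SET p.Name = s.Name, p.Price = s.Price
                FROM Products p
                JOIN Products_Staging s ON s.ProductId = p.ProductId";

            // ...then insert only the rows that are genuinely new.
            const string insertSql = @"
                INSERT INTO Products (ProductId, Name, Price)
                SELECT s.ProductId, s.Name, s.Price
                FROM Products_Staging s
                WHERE NOT EXISTS (SELECT 1 FROM Products p
                                  WHERE p.ProductId = s.ProductId)";

            using (var update = new SqlCommand(updateSql, conn, tx))
                update.ExecuteNonQuery();
            using (var insert = new SqlCommand(insertSql, conn, tx))
                insert.ExecuteNonQuery();
        }
    }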

Dan
+1  A: 

I would go with the transaction approach outlined above. The only problem I can see is that you might end up locking the whole table for the entire time the import is running, so you may need to think about that. The separate-table approach can be one of the solutions to it.