I'm using .NET 2.0, ADO.NET, a DataSet and data adapters to manage my data, and Microsoft SQL Server 2005 as the database server.

I have an application that generates a great number of results (500K+) and saves them in a database. If the generation of any one result fails, I would like to save none of the results. So I put all the database inserts inside a single transaction that gets rolled back if a result cannot be generated.
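
Simplified, the pattern looks something like this (a minimal sketch; Results, Result and GenerateResults are placeholders for my actual names):

    using System.Data.SqlClient;

    // connectionString is assumed to be defined elsewhere.
    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        conn.Open();
        SqlTransaction tx = conn.BeginTransaction();
        try
        {
            // Every insert runs inside the same big transaction.
            foreach (Result result in GenerateResults())
            {
                SqlCommand cmd = new SqlCommand(
                    "INSERT INTO Results (Value) VALUES (@Value)", conn, tx);
                cmd.Parameters.AddWithValue("@Value", result.Value);
                cmd.ExecuteNonQuery();
            }
            tx.Commit();
        }
        catch
        {
            tx.Rollback(); // discard everything if any result fails
            throw;
        }
    }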

However, the table gets locked until all the results are written, which is not what I intended. Or at least I assume it is locked, since no SELECT query returns until the insertion ends.

Is there a way to keep the transaction from locking the table while it is active?

Is it even a good idea to use a transaction with so many inserts being made?

+2  A: 

You might want to consider splitting the transaction into smaller batches, committing after every 1,000 rows or so.
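
As a rough sketch (conn and results are assumed from the surrounding code, and the table/column names are illustrative), committing every 1,000 rows might look like:

    SqlTransaction tx = conn.BeginTransaction();
    int count = 0;
    foreach (Result result in results)
    {
        SqlCommand cmd = new SqlCommand(
            "INSERT INTO Results (Value) VALUES (@Value)", conn, tx);
        cmd.Parameters.AddWithValue("@Value", result.Value);
        cmd.ExecuteNonQuery();

        if (++count % 1000 == 0)
        {
            tx.Commit();                  // release the locks held so far
            tx = conn.BeginTransaction(); // start the next batch
        }
    }
    tx.Commit(); // commit the final, partial batch

Keep in mind that this trades away the all-or-nothing guarantee: batches that have already committed would need a compensating DELETE if a later result fails.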

You can get past the lock by specifying the WITH (NOLOCK) hint in your SELECT statement (put it just after the table name), but be very careful: it will read data that has not been committed yet, i.e. rows that might still be rolled back. There is also the WITH (ROWLOCK) hint that can be used on your INSERT, which may reduce the chance of blocking the SELECT. A normal SELECT will still block if it touches a row inserted by an uncommitted transaction, but SQL Server will not take out locks on the entire table, so blocking is less likely.
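
For example (Results is an illustrative table name):

    // SELECT that does not wait on uncommitted inserts (dirty reads possible)
    SqlCommand readCmd = new SqlCommand(
        "SELECT COUNT(*) FROM Results WITH (NOLOCK)", conn);

    // INSERT that asks SQL Server for row-level locks rather than
    // escalating to page or table locks
    SqlCommand insertCmd = new SqlCommand(
        "INSERT INTO Results WITH (ROWLOCK) (Value) VALUES (@Value)", conn, tx);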

You can find more information on table hints here; just be very careful with them, as they can affect performance. Hope it helps.

Rory
Do I have to manually change the inserts in the dataset, or is there a flag I can set somewhere for the dataset to change the inserts for me?
Wilhelm
I am not aware of a flag; you would probably need to modify the query yourself when going this route. I would suggest looking at the answer proposed by RocketSurgeon.
Rory
+2  A: 

You can use a temporary staging table. Fill it at your own pace, outside of any long-running transaction. Then move everything into the real table with a single INSERT ... SELECT, run as one short transaction. After the big move is done, drop the staging table.
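
Roughly like this (a sketch only; ResultsStaging and the column names are made up, and the staging table is assumed to mirror the real Results table):

    // 1. Fill the staging table at your own pace; no long-running
    //    transaction is needed (loop and parameter assignment omitted).
    SqlCommand fill = new SqlCommand(
        "INSERT INTO ResultsStaging (Value) VALUES (@Value)", conn);

    // 2. Move everything into the real table in one short transaction.
    SqlTransaction tx = conn.BeginTransaction();
    SqlCommand move = new SqlCommand(
        "INSERT INTO Results (Value) SELECT Value FROM ResultsStaging",
        conn, tx);
    move.ExecuteNonQuery();
    tx.Commit();

    // 3. Drop the staging table once the move has succeeded.
    new SqlCommand("DROP TABLE ResultsStaging", conn).ExecuteNonQuery();

The real table is only locked for the duration of the final move, and if any result generation fails you just drop the staging table without ever touching the real one.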

If there are stronger requirements on overall integrity, you may introduce more lightweight tables reflecting the stages of the data flow, journal/log tables, plus recovery/compensation/cleanup routines, etc.

RocketSurgeon
I see a problem with this, as the DBA would have to grant table-creation privileges in the database to the users.
Wilhelm