Application 1 -
- Opens a SqlConnection and a SqlTransaction against a SQL Server 2005 database
- Inserts a record into Table1
- Does not commit or roll back the SqlTransaction - intentionally keeping it alive to demonstrate the problem (see the sketch after this list)
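A minimal ADO.NET sketch of Application 1. The connection string, the inserted value, and the Console.ReadLine used to keep the process (and therefore the transaction) alive are placeholders, not our actual code:

    using System;
    using System.Data.SqlClient;

    class App1
    {
        static void Main()
        {
            // Connection string is a placeholder.
            var connection = new SqlConnection(
                "Server=.;Database=OurDb;Integrated Security=true");
            connection.Open();
            SqlTransaction transaction = connection.BeginTransaction();

            using (var insert = new SqlCommand(
                "INSERT INTO Table1 (Name) VALUES (@name)", connection, transaction))
            {
                insert.Parameters.AddWithValue("@name", "some converted value");
                insert.ExecuteNonQuery();
            }

            // Deliberately no Commit()/Rollback(): the open transaction keeps
            // its locks on Table1 until the process exits.
            Console.WriteLine("Transaction left open; press Enter to exit.");
            Console.ReadLine();
        }
    }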
Application 2 -
- Opens a SqlConnection and a SqlTransaction against the same SQL Server 2005 database
- Tries to run the query "SELECT COUNT(Id) FROM Table1" (sketched below, after the table description)
Table1 - Id is an identity field and Name is a varchar field; there are no other fields in the table.
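And a matching sketch of Application 2. The connection string and the exact Table1 definition (column types, primary key) are assumptions based on the description above:

    using System;
    using System.Data.SqlClient;

    class App2
    {
        static void Main()
        {
            // Assumed Table1 definition, per the description above:
            //   CREATE TABLE Table1 (Id INT IDENTITY PRIMARY KEY, Name VARCHAR(100))
            using (var connection = new SqlConnection(
                "Server=.;Database=OurDb;Integrated Security=true"))
            {
                connection.Open();
                SqlTransaction transaction = connection.BeginTransaction();

                using (var count = new SqlCommand(
                    "SELECT COUNT(Id) FROM Table1", connection, transaction))
                {
                    // While Application 1's transaction is still open, this call
                    // blocks instead of returning a count.
                    Console.WriteLine(count.ExecuteScalar());
                }

                transaction.Commit();
            }
        }
    }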
Application 2 is unable to run the SELECT query: it blocks, apparently because Table1 is locked by the uncommitted insert from Application 1.
Though the scenario above is contrived, it adequately demonstrates the problem we are facing. We want to be able to open a long-running SqlTransaction (possibly for hours) and perform many inserts/updates through it.
We are developing a data conversion application that must do substantial processing on large volumes of data before it can be inserted/updated into the database. The conversion has to run while our main WebForms application is live against the same SQL Server 2005 database in which we want to perform the long-running transaction.
All the tables in our application are segmented by a ClientID field that comes from a ClientMaster table. For example, a CollegeMaster table would have a ClientID field as part of its primary key, plus an ID field for its own identification. The data conversion starts by creating a new ClientID, and that new ClientID is then used in every other table it touches.
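A hypothetical illustration of that segmentation; the table names come from the question, but the column types and constraints are assumptions invented for the example:

    -- Hypothetical schema; types/constraints are assumed.
    CREATE TABLE ClientMaster (
        ClientID INT NOT NULL PRIMARY KEY,
        Name     VARCHAR(100) NOT NULL
    );

    CREATE TABLE CollegeMaster (
        ClientID INT NOT NULL REFERENCES ClientMaster (ClientID),
        ID       INT NOT NULL,          -- the table's own identifier
        Name     VARCHAR(100) NOT NULL,
        PRIMARY KEY (ClientID, ID)
    );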
Ideally, queries like the one in Application 2 should not be affected by the long-running transaction: they should read only data that is already committed and keep working, rather than block behind it. What can Application 1 do to ensure this is achieved?