I have a stored procedure that is called to fill one table whenever we receive incoming files. Sometimes we receive more than one file at a time and the procedure gets called simultaneously. Inside the procedure the statements are quite simple, as given below:

IF NOT EXISTS (SELECT ... WHERE A = 1 AND B = 2)
    INSERT ...
ELSE
    UPDATE ... WHERE A = 1 AND B = 2

Doing this, I started getting a duplicate records error; I assume two identical records tried to INSERT at the same time. To avoid this I put these statements inside a transaction with the SERIALIZABLE isolation level.
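For reference, the new version looks roughly like this (the table and column names here are simplified placeholders, not the real ones):

SET TRANSACTION ISOLATION LEVEL SERIALIZABLE;
BEGIN TRANSACTION;

-- Placeholder table/column names; the real procedure checks the actual key columns.
IF NOT EXISTS (SELECT 1 FROM Files WHERE A = @A AND B = @B)
    INSERT INTO Files (A, B, Payload) VALUES (@A, @B, @Payload);
ELSE
    UPDATE Files SET Payload = @Payload WHERE A = @A AND B = @B;

COMMIT TRANSACTION;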

Things got even worse and I started getting deadlock errors!

Is there something I am doing wrong here?

A: 

Try changing the isolation level to Read Committed.

Serializable is a very big transaction hammer. It prevents other processes from reading data that has been modified but not committed, and it holds the locks taken by your EXISTS check until the transaction ends, so two concurrent calls can each take a shared lock during the check and then block each other on the INSERT. Combined with your two-step check-then-write process in the SP, you will definitely get deadlocks.

Check this out for details: http://msdn.microsoft.com/en-us/library/ms173763.aspx
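If you want to set the level explicitly at the top of the procedure, it would look something like this (your existing statements stay unchanged inside the transaction):

SET TRANSACTION ISOLATION LEVEL READ COMMITTED;  -- READ COMMITTED is the SQL Server default
BEGIN TRANSACTION;
    -- existing IF NOT EXISTS ... INSERT ... ELSE UPDATE ... logic goes here
COMMIT TRANSACTION;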

Daniel M
I tried with Read Committed, which I suppose is the default level, but I still get the duplicate record error.
wixuser1105
Give Read Uncommitted a go, although making the changes in Andrew's answer will definitely sort your problems out.
Daniel M
+1  A: 

If performance becomes an issue after changing the isolation level, then you can try to re-architect your solution to use a message queue (see the sketch after the steps below).

As files come in:

  1. Call an SP that adds the file to a message queue. This will be non-blocking (use a table with an auto-increment PK).
  2. Have a separate SP running as a scheduled task that takes one message out of the queue (if there are any) and processes it, avoiding deadlocks.
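
A rough sketch of that layout; every object name here is a placeholder:

-- Hypothetical queue table; the IDENTITY column is the auto-increment PK.
CREATE TABLE dbo.FileQueue
(
    QueueId   INT IDENTITY(1,1) PRIMARY KEY,
    FileName  NVARCHAR(260) NOT NULL,
    Processed BIT NOT NULL DEFAULT (0)
);
GO

-- Called as each file arrives: a single INSERT, so concurrent callers do not block each other.
CREATE PROCEDURE dbo.EnqueueFile
    @FileName NVARCHAR(260)
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.FileQueue (FileName) VALUES (@FileName);
END
GO

-- Run by a scheduled job: pick up the oldest unprocessed entry (if any) and process it.
CREATE PROCEDURE dbo.ProcessNextFile
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @QueueId INT, @FileName NVARCHAR(260);

    SELECT TOP (1) @QueueId = QueueId, @FileName = FileName
    FROM dbo.FileQueue
    WHERE Processed = 0
    ORDER BY QueueId;

    IF @QueueId IS NULL
        RETURN;

    -- ... the existing IF NOT EXISTS / INSERT / UPDATE logic for this file goes here ...

    UPDATE dbo.FileQueue SET Processed = 1 WHERE QueueId = @QueueId;
END
GO

Because only the scheduled task touches the target table, the check-then-write logic runs one file at a time, so the duplicate-key and deadlock problems go away.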
Andrew