I have an MS SQL table that I don't have any control over, and I need to write to it. The table has an int primary key that isn't automatically incremented. I can't use stored procs, and I would like to use LINQ to SQL since it makes the other processing very easy.

My current solution is to read the last value, increment it, and try to use it; if I get a clash, increment it again and retry.

Something along these lines:

var newEntity = new Log()
{
    ID = dc.Logs.Max(l => l.ID) + 1,
    Note = "Test"
};

dc.Logs.InsertOnSubmit(newEntity);

const int maxRetries = 10;
int retries = 0;

bool success = false;
while (!success && retries < maxRetries)
{
    try
    {
        dc.SubmitChanges();
        success = true;
    }
    catch (SqlException)
    {
        // Someone else grabbed the ID first; re-read the max and try again.
        retries++;
        newEntity.ID = dc.Logs.Max(l => l.ID) + 1;
    }
}
if (retries >= maxRetries)
{
    throw new Exception("Bummer...");
}

Does anyone have a better solution?

EDIT: Thanks to Jon, I simplified the max ID calculation. I was still in SQL thinking mode.

+1  A: 

That looks like an expensive way to get the maximum ID. Have you already tried

var maxId = dc.Logs.Max(s => s.ID);

? Maybe it doesn't work for some reason, but I really hope it does...

(Admittedly it's more than possible that SQL Server optimises this appropriately.)

Other than that, it looks okay (smelly, but necessarily so) to me - but I'm not an expert on the matter...

Jon Skeet
Thanks for the tip. It is smelly, and I really hope someone will point me in the non-smelly direction. :)
David Vidmar
A: 

You didn't indicate whether your app is the only one inserting into the table. If it is, I'd fetch the max value once right after the start of the app/webapp and use Interlocked.Increment on it every time you need the next ID (or simple addition, if race conditions can be ruled out).
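
A rough sketch of that approach, assuming this process really is the only writer (the LogIdGenerator class and its Seed/NextId methods are made-up names for illustration, not part of the question's code):

using System.Threading;

// Hypothetical per-process ID generator; only safe if nothing else inserts into the table.
static class LogIdGenerator
{
    private static int _lastId;

    // Call once at startup, e.g. LogIdGenerator.Seed(dc.Logs.Max(l => l.ID));
    public static void Seed(int currentMaxId)
    {
        _lastId = currentMaxId;
    }

    // Atomic increment, so concurrent callers in this process never get the same ID.
    public static int NextId()
    {
        return Interlocked.Increment(ref _lastId);
    }
}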

liggett78
My app will not be the only one inserting, of course. I'm integrating with another app through this DB.
David Vidmar
A: 

You could put the entire operation in a transaction using the TransactionScope class, like below:

using (TransactionScope scope = new TransactionScope())
{
    var maxId = dc.Logs.Max(s => s.ID);
    var newEntity = new Log()
    {
        ID = maxId + 1,
        Note = "Test"
    };
    dc.Logs.InsertOnSubmit(newEntity);
    dc.SubmitChanges();
    scope.Complete();
}

By putting both the retrieval of the maximum ID and the insertion of the new record within the same transaction, you should be able to perform the insert without your retry loop.

One problem you might face with this method is transaction deadlocks, especially if the table is heavily used. Do test it to see whether you need additional error handling.
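
If you'd rather not rely on whatever the ambient default is, you can state the isolation level explicitly by passing a TransactionOptions value; a sketch of that variation (the 30-second timeout is arbitrary):

using System.Transactions;

var options = new TransactionOptions
{
    // Serializable takes range locks on the rows read by Max(), so concurrent
    // writers will block or deadlock rather than silently insert a duplicate key.
    IsolationLevel = IsolationLevel.Serializable,
    Timeout = TimeSpan.FromSeconds(30)
};

using (var scope = new TransactionScope(TransactionScopeOption.Required, options))
{
    var newEntity = new Log()
    {
        ID = dc.Logs.Max(s => s.ID) + 1,
        Note = "Test"
    };
    dc.Logs.InsertOnSubmit(newEntity);
    dc.SubmitChanges();
    scope.Complete();
}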

P.S. I included Jon Skeet's code to get the max ID in my code, because I'm pretty sure it will work correctly. :)

alextansc
Your assumption is dangerous. The code will work sometimes and sometimes not, and you'll have a hard time debugging such things. With the default ReadCommitted transaction level, nothing prevents another thread from reading the max value after the 1st thread has read it - but before the 1st thread has inserted its new row!
liggett78
@liggett78: Can you provide a link showing that the TransactionScope class uses ReadCommitted as the default transaction level? As far as I know, the default should be Serializable. Of course, your comment might still hold water even at that level, but I need to be sure.
alextansc
A: 

My suggestion: make the ID field auto-incrementing and let the server handle ID generation. Otherwise you will run into the problem liggett78 described: nothing prevents another thread from reading the same ID in between this thread reading and submitting the max ID.
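
For reference only, since the question says the table can't be changed: if the column were made an IDENTITY column on the server, the LINQ to SQL mapping would look roughly like this, with IsDbGenerated telling the runtime to read the generated key back after the insert:

using System.Data.Linq.Mapping;

[Table(Name = "Logs")]
public class Log
{
    // The server generates the value; LINQ to SQL syncs it back on insert.
    [Column(IsPrimaryKey = true, IsDbGenerated = true, AutoSync = AutoSync.OnInsert)]
    public int ID { get; set; }

    [Column]
    public string Note { get; set; }
}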

P P Vilsad