views: 292
answers: 1

I'm iterating over a set of instances of a LINQ-to-SQL entity class.

During this iteration, I'm modifying some rows, flagging others for deletion and even inserting new ones into the DataContext. When I'm done iterating, I call context.SubmitChanges() - it works, but it's amazingly slow. I'm running an Express edition of SQL Server 2008 locally.
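A minimal sketch of the pattern (the Item entity and MyDataContext names are placeholders, not my actual types):

    using (var context = new MyDataContext())
    {
        foreach (var item in context.Items.ToList())
        {
            if (item.IsObsolete)
                context.Items.DeleteOnSubmit(item);   // flag for deletion
            else
                item.ModifiedOn = DateTime.UtcNow;    // ordinary update
        }

        context.Items.InsertOnSubmit(new Item { Name = "new row" });

        // All tracked inserts/updates/deletes are flushed here, one
        // statement per row - this is the slow part.
        context.SubmitChanges();
    }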

Also, I've noticed that in the UPDATE statements it generates for the rows that need updating, the WHERE clause compares on all the fields/columns - something like UPDATE ... SET ... WHERE [Col1] = @p1 AND [Col2] = @p2 and so on. I've defined the primary key on the table, and also in the DBML schema, but it still compares every column for the UPDATE statement instead of just the primary key.
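For reference, the generated T-SQL can be watched by attaching a writer to the DataContext's Log property (standard LINQ-to-SQL API; the context name is the same placeholder as above):

    using (var context = new MyDataContext())
    {
        context.Log = Console.Out;   // echoes every generated statement
        // ... same iteration as above ...
        context.SubmitChanges();     // the full UPDATE ... WHERE shows up here
    }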

Any ideas?


Update:

As I feared, it was because I had forgotten to update the LINQ-to-SQL schema after defining the primary key and the necessary indexes on the table.

The volume in question is around 40k rows, and I'm quite happy with the performance of my solution.

If the volume rises, I'll look into the concurrency-checking properties Marc referred to.

+1  A: 

The WHERE @p1 = ... AND @p2 = ... pattern is linked to concurrency checking. You can disable this per column, or (better) use a timestamp/rowversion column instead (yay!). Look at the "Update Check" and "Time Stamp" properties in the dbml designer. Adding a rowversion column to the table and re-importing the table into the dbml would be my preferred option.
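For illustration, here's roughly what the generated mapping looks like under each option - the table and column names are made up, and the real designer output is more verbose:

    using System.Data.Linq;          // for Binary
    using System.Data.Linq.Mapping;

    [Table(Name = "dbo.Items")]
    public class Item
    {
        [Column(IsPrimaryKey = true)]
        public int Id { get; set; }

        // Option 1: "Update Check" = Never - this column is left out
        // of the UPDATE's WHERE clause entirely.
        [Column(UpdateCheck = UpdateCheck.Never)]
        public string Name { get; set; }

        // Option 2 (preferred): a rowversion/timestamp column; with
        // IsVersion set, only the key and this column are checked.
        [Column(IsVersion = true, IsDbGenerated = true,
                UpdateCheck = UpdateCheck.Never)]
        public Binary RowVersion { get; set; }
    }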

What sort of volume are we talking about here? Hundreds? Thousands? More?

LINQ-to-SQL (and EF) make changes on an OO basis, hence row by row. There is a limit beyond which set-based updates are the way to go, and past it you need to write a stored procedure that does all the work in one hit.
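Failing a stored procedure, even a single set-based command via ExecuteCommand beats thousands of row-by-row UPDATEs - a sketch (the table, column and cutoff are illustrative):

    using (var context = new MyDataContext())
    {
        // {0} is a parameter placeholder, not string interpolation -
        // ExecuteCommand parameterises the value for you.
        int affected = context.ExecuteCommand(
            "UPDATE dbo.Items SET IsObsolete = 1 WHERE ModifiedOn < {0}",
            new DateTime(2008, 1, 1));
    }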

If you aren't making massive changes but it still takes a long time, I would expect the problem to be lack of appropriate indexing on the primary key(s).

Marc Gravell
Oh, just saw your comment! Sorry and thanks for the hint.
Anders