Hi,

I have identified a problem within my application: one sub-routine prepares a large amount of data that is later inserted into my local database via a LINQ-to-SQL data context. However, even a relatively modest amount of new data (around 100,000 rows) takes a tremendous amount of time to be saved into the database when SubmitChanges() is called. Most of the time, the application actually has to save around 200,000 to 300,000 rows.

According to SQL Server Profiler, all generated queries look like the one below, and there is one such statement for each item the application inserts.

exec sp_executesql N'INSERT INTO [dbo].[AdjectivesExpanded]([Adjective], [Genus], [Casus], [SingularOrPlural], [Kind], [Form])
VALUES (@p0, @p1, @p2, @p3, @p4, @p5)

SELECT CONVERT(BigInt,SCOPE_IDENTITY()) AS [value]',N'@p0 bigint,@p1 char(1),@p2 tinyint,@p3 bit,@p4 tinyint,@p5 nvarchar(4000)',@p0=2777,@p1='n',@p2=4,@p3=0,@p4=3,@p5=N'neugeborener'

Does anyone have an idea how to improve the performance of mass inserts with LINQ-to-SQL data contexts, ideally without giving up the strongly-typed DataContext and falling back to hand-written queries? There is also little room to tune the underlying database; at most, I could disable integrity constraints, if that helps.

A: 

Are you doing something like this:

foreach (var adjective in adjectives) {
    dataContext.AdjectivesExpanded.InsertOnSubmit(adjective);
    dataContext.SubmitChanges();
}

Or:

foreach (var adjective in adjectives) {
    dataContext.AdjectivesExpanded.InsertOnSubmit(adjective);
}
dataContext.SubmitChanges();

If it is similar to the first, I would recommend changing it to something like the second. Each call to SubmitChanges scans all the tracked objects to work out what has changed, so calling it once per item is very expensive.

Either way, I'm not convinced that LINQ-to-SQL is a good fit for inserting that volume of items, because it has to generate and execute a separate SQL statement for each row.

Could you script a stored procedure and add it as a method on the DataContext through the O/R designer?
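
For illustration, here is a minimal sketch of that idea, assuming the INSERT has been wrapped in a stored procedure (hypothetically called dbo.InsertAdjectiveExpanded) and dragged onto the O/R designer, which generates a typed method on the DataContext. MyDataContext and the entity property names are placeholders:

// Hypothetical: dbo.InsertAdjectiveExpanded wraps the INSERT; dragging it onto the
// .dbml designer generates a strongly-typed InsertAdjectiveExpanded method on the DataContext.
using (var dataContext = new MyDataContext())   // MyDataContext is a placeholder name
{
    foreach (var adjective in adjectives)
    {
        // One direct stored-procedure call per row: no change tracking and no
        // per-row SQL generation by LINQ-to-SQL, but still one round-trip per item.
        dataContext.InsertAdjectiveExpanded(
            adjective.Adjective,
            adjective.Genus,
            adjective.Casus,
            adjective.SingularOrPlural,
            adjective.Kind,
            adjective.Form);
    }
}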

Matthew Abbott
Almost all of the code uses something along the lines of foreach (var item in items) { /* ... */ dataContext.Item.InsertAllOnSubmit(...); }. But you are probably right that mass inserts with LINQ-to-SQL are a bad idea. If all else fails, I'll probably have to fall back to inserting the rows using BULK INSERT or manually created INSERT statements somehow. :-(
Manny
By the by, thank you for the stored procedure hint. It pointed me in the direction of stored procedures in combination with OPENXML, which turned out to be a very feasible solution to the problem (see the sketch below): http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx
Manny
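
For reference, a condensed sketch of that OPENXML approach, assuming a hypothetical stored procedure dbo.BulkInsertAdjectives that takes a single XML parameter and shreds it server-side; the client only serializes the rows and makes one call (MyDataContext and the property names are placeholders):

// using System.Linq; using System.Xml.Linq;
// Hypothetical: dbo.BulkInsertAdjectives(@xml nvarchar(max)) does roughly
//   INSERT INTO dbo.AdjectivesExpanded (Adjective, Genus, Casus, SingularOrPlural, Kind, Form)
//   SELECT ... FROM OPENXML(..., '/Adjectives/Adjective', 1) WITH (...)
// on the server, so the whole batch travels in a single command.
var xml = new XElement("Adjectives",
    adjectives.Select(a => new XElement("Adjective",
        new XAttribute("Adjective", a.Adjective),
        new XAttribute("Genus", a.Genus),
        new XAttribute("Casus", a.Casus),
        new XAttribute("SingularOrPlural", a.SingularOrPlural),
        new XAttribute("Kind", a.Kind),
        new XAttribute("Form", a.Form))));

using (var dataContext = new MyDataContext())   // MyDataContext is a placeholder name
{
    // The procedure is mapped onto the DataContext via the O/R designer, like any other.
    dataContext.BulkInsertAdjectives(xml.ToString());
}
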
+1  A: 

An ORM is usually not a good fit for mass operations. I'd recommend an old-fashioned bulk insert to get the best performance (see the sketch below).

Matteo Mosca
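
For completeness, a minimal sketch of such a bulk insert using SqlBulkCopy, assuming the prepared rows have been copied into a DataTable whose columns match dbo.AdjectivesExpanded (the class and method names here are just for illustration):

using System.Data;
using System.Data.SqlClient;

static class AdjectiveBulkLoader
{
    // Streams a pre-filled DataTable into dbo.AdjectivesExpanded in one bulk operation,
    // bypassing per-row SQL generation entirely.
    public static void BulkInsert(string connectionString, DataTable table)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (var bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = "dbo.AdjectivesExpanded";
                bulkCopy.BatchSize = 10000;   // rows sent per round trip
                bulkCopy.BulkCopyTimeout = 0; // disable the default timeout for large loads
                bulkCopy.WriteToServer(table);
            }
        }
    }
}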