Hi,
I have identified a problem in my application: one sub-routine prepares a large amount of data that is later inserted into my local database through a LINQ-to-SQL data context. Even a relatively modest amount of new data (roughly 100,000 rows) takes a tremendous amount of time to save once SubmitChanges() is called, and most of the time the application actually has to save around 200,000 to 300,000 rows.
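For reference, the insert path is essentially the following (a simplified sketch; the DataContext and table property names are assumptions about the designer-generated code, and the step that actually builds the entities is omitted):

using (var context = new WordsDataContext())   // name of the generated DataContext assumed
{
    // preparedAdjectives stands in for the 200,000+ AdjectivesExpanded entities built earlier
    foreach (AdjectivesExpanded row in preparedAdjectives)
        context.AdjectivesExpandeds.InsertOnSubmit(row);

    // this single call is where nearly all of the time is spent
    context.SubmitChanges();
}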
According to SQL Server Profiler, all generated queries look like the one below, and there is one such query for every item the application inserts.
exec sp_executesql N'INSERT INTO [dbo].[AdjectivesExpanded]([Adjective], [Genus], [Casus], [SingularOrPlural], [Kind], [Form])
VALUES (@p0, @p1, @p2, @p3, @p4, @p5)
SELECT CONVERT(BigInt,SCOPE_IDENTITY()) AS [value]',N'@p0 bigint,@p1 char(1),@p2 tinyint,@p3 bit,@p4 tinyint,@p5 nvarchar(4000)',@p0=2777,@p1='n',@p2=4,@p3=0,@p4=3,@p5=N'neugeborener'
Does anyone have an idea how to improve the performance of mass inserts with LINQ-to-SQL data contexts, ideally without giving up the strongly-typed DataContext and falling back to hand-written queries? There is also little opportunity or room to tune the underlying database; if anything at all, I could disable integrity constraints, should that help.