I'm inserting a large number of records into a SQL Server 2008 Express database from C# using LINQ to SQL, and the insertion is very slow. Here is the code snippet:
public void InsertData(int id)
{
    MyDataContext dc = new MyDataContext();
    List<Item> result = GetItems(id);
    foreach (var item in result)
    {
        DbItem dbItem = new DbItem() { ItemNo = item.No, ItemName = item.Name };
        dc.Items.InsertOnSubmit(dbItem);   // queue each new row for insertion
    }
    dc.SubmitChanges();                    // one submit for all queued rows
}
Am I doing anything wrong, or is LINQ to SQL a bad choice for inserting a large number of records?
Update: Thanks for all the answers. @p.campbell: Sorry about the record count, it was a typo; it is actually around 100,000, and it can range up to 200k as well.
Following the suggestions (plus a requirement change and a design decision), I split this operation into parts: I now retrieve the data in small chunks and insert each chunk into the database as it arrives. I run this InsertData() method as a thread operation, using SmartThreadPool to create a pool of 25 threads, and I insert only 100 records at a time. With this setup, using LINQ or a raw SQL query makes no difference in the time taken.
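For reference, the chunked InsertData() now looks roughly like this (it reuses the same MyDataContext, Item, DbItem and GetItems() as above, and needs System.Linq for Skip/Take; the real code also fetches in chunks, but I've kept GetItems(id) here to keep the sketch short):

public void InsertData(int id)
{
    List<Item> result = GetItems(id);

    const int batchSize = 100;   // insert only 100 records per SubmitChanges()
    for (int i = 0; i < result.Count; i += batchSize)
    {
        // fresh DataContext per batch so the change tracker stays small
        using (MyDataContext dc = new MyDataContext())
        {
            foreach (var item in result.Skip(i).Take(batchSize))
            {
                dc.Items.InsertOnSubmit(new DbItem() { ItemNo = item.No, ItemName = item.Name });
            }
            dc.SubmitChanges();
        }
    }
}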
Per my requirements, this operation is scheduled to run every hour and fetches records for around 4k-6k users. So I now queue each user's work (retrieving the data and inserting it into the DB) as one task, assigned to one pool thread. The entire process now takes around 45 minutes for around 250k records.
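The pooling itself is wired up roughly like this (ProcessAllUsers and InsertDataWorkItem are just my wrapper names, not part of the library; the 25-thread limit is set through STPStartInfo):

using Amib.Threading;   // SmartThreadPool

public void ProcessAllUsers(List<int> userIds)
{
    // pool of 25 worker threads, as described above
    STPStartInfo startInfo = new STPStartInfo();
    startInfo.MaxWorkerThreads = 25;
    SmartThreadPool pool = new SmartThreadPool(startInfo);

    // one work item per user: retrieve that user's records and insert them
    foreach (int id in userIds)
    {
        pool.QueueWorkItem(new WorkItemCallback(InsertDataWorkItem), id);
    }

    pool.WaitForIdle();   // wait until every user's task has finished
    pool.Shutdown();
}

// wrapper name is just for illustration; it calls the chunked InsertData() shown above
private object InsertDataWorkItem(object state)
{
    InsertData((int)state);
    return null;
}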
Is there a better way to do this kind of task, or can anyone suggest how I can improve this process?