Hi All,

I'm looking for advice on the optimum way of inserting a large number of records into a database (SQL Server 2000 onwards) based on a collection of objects.

Currently my code looks something like the snippet below: each record is inserted with its own simple SQL INSERT command, and the database connection is opened and closed every time the function is called (I'm sure this must slow things down?).

The routine needs to cope with routinely inserting up to 100,000 records, and I was wondering if there is a faster way (I'm sure there must be). I've seen a few posts mentioning XML-based data and bulk copy routines - is this something I should consider, or can anyone provide any simple examples I could build upon?

foreach (DictionaryEntry de in objectList)
{
  eRecord record = (eRecord)de.Value;

  if (!record.Deleted)
  {
    createDBRecord(record.Id,               
                   record.Index,
                   record.Name,
                   record.Value);
  }
}

Thanks for any advice,

Paul.

+1  A: 

Doing it that way will be relatively slow. You need to consider a bulk-insert technique: either BCP or BULK INSERT, or, since you are using .NET 2.0, the SqlBulkCopy class. Here's an example of using SqlBulkCopy: SqlBulkCopy - Copy Table Data Between SQL Servers at High Speeds
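As a rough sketch of the SqlBulkCopy approach applied to your loop: build the rows into a DataTable in memory, then push them to the server in one bulk operation. The table name `dbo.eRecords`, the column names, and `connectionString` are illustrative assumptions; substitute your own schema.

```csharp
using System.Collections;
using System.Data;
using System.Data.SqlClient;

// Stage the dictionary contents in a DataTable whose columns
// match the destination table (names here are assumptions).
DataTable table = new DataTable();
table.Columns.Add("Id", typeof(int));
table.Columns.Add("Ix", typeof(int));
table.Columns.Add("Name", typeof(string));
table.Columns.Add("Value", typeof(string));

foreach (DictionaryEntry de in objectList)
{
    eRecord record = (eRecord)de.Value;
    if (!record.Deleted)
    {
        table.Rows.Add(record.Id, record.Index, record.Name, record.Value);
    }
}

// One connection, one bulk operation - instead of 100,000 round trips.
using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "dbo.eRecords";
        bulk.BatchSize = 5000; // send rows to the server in batches
        bulk.WriteToServer(table);
    }
}
```

The key difference from your current code is that the connection is opened once and the rows travel to the server as a bulk stream, not as individual INSERT statements.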

Here's a simple example for SQL Server 2000:

If you have a CSV file, csvtest.txt, in the following format:

1,John,Smith
2,Bob,Hope
3,Kate,Curie
4,Peter,Green

This SQL script will load the contents of the CSV file into a database table (if any row contains errors it will not be inserted, but the other rows will be):

USE myDB
GO

CREATE TABLE CSVTest
(
    ID INT,
    FirstName VARCHAR(60),
    LastName VARCHAR(60)
)
GO

BULK INSERT CSVTest
FROM 'c:\temp\csvtest.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
)
GO


SELECT *
FROM CSVTest
GO

You could write the dictionary contents out to a CSV file and then BULK INSERT that file.
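Writing the intermediate file is straightforward; a minimal sketch (the file path is the one used in the BULK INSERT example above, and the `record` fields stand in for whatever columns your target table actually has):

```csharp
using System.Collections;
using System.IO;

// Dump the non-deleted records to a comma-separated file.
// Column order must match the target table of the BULK INSERT.
using (StreamWriter writer = new StreamWriter(@"c:\temp\csvtest.txt"))
{
    foreach (DictionaryEntry de in objectList)
    {
        eRecord record = (eRecord)de.Value;
        if (!record.Deleted)
        {
            writer.WriteLine(record.Id + "," + record.Name + "," + record.Value);
        }
    }
}
```

Note this naive version assumes the values themselves contain no commas or newlines; if they can, you'd need to escape or choose a different field terminator.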

See also: Using bcp and BULK INSERT

If you implement your own IDataReader, you can avoid writing to an intermediate file. See ADO.NET 2.0 Tutorial : SqlBulkCopy Revisited for Transferring Data at High Speeds

Related SO question: how to pass variables like arrays / datatable to SQL server?

Mitch Wheat
Thanks for the answer. Yes, I'm using .NET 2.0, so I'll look into the class you mentioned, as I'd like to perform the operation directly on the database if possible. Do you have any experience of the SqlBulkCopy class? Many thanks, Paul
Paul
@Paul: Sure do. Done correctly, SqlBulkCopy is very fast. I've helped teams achieve a three-orders-of-magnitude speed-up with it.
Mitch Wheat
Thanks Mitch - I'm off to start reading and learning! Cheers
Paul