views: 513
answers: 4
Problem: How to most efficiently move data from an array into a SQL Server table.

Details: I have created an array with many rows (typically about 100,000) and many columns (about 40) in memory in a WinForms application. I need to get this array into a corresponding SQL Server table in the fastest way possible. Right now, I am creating a SqlCommand object, looping through the 100,000 rows in my array, and for each row assigning the 40 parameters of the command object and then calling ExecuteNonQuery. It works, but it is slow, and it surely isn't the most efficient way to do this. Should I put all the data from the array into a DataTable and then somehow send the DataTable all at once (I don't know how to do that)? Or some other technique? Write out to a file and use bcp (that seems like it wouldn't be any faster, but I haven't tried it). Any suggestions appreciated!
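For reference, putting the array into a DataTable is just a nested loop. A minimal sketch, assuming the data is a rectangular object[,]; the table and column names here are placeholders:

using System.Data;

// Sketch: copy a rectangular object[,] array into a DataTable row by row.
static DataTable ToDataTable(object[,] data)
{
    var table = new DataTable("MyTable"); // placeholder name

    // One DataColumn per array column; real code would use the actual names and types.
    for (int c = 0; c < data.GetLength(1); c++)
        table.Columns.Add("Col" + c, typeof(object));

    for (int r = 0; r < data.GetLength(0); r++)
    {
        var row = table.NewRow();
        for (int c = 0; c < data.GetLength(1); c++)
            row[c] = data[r, c];
        table.Rows.Add(row);
    }
    return table;
}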

+4  A: 

SqlBulkCopy. It would be even better if you could store the stuff in memory as a DataTable because one of the overloads of the WriteToServer() method takes one.

EDIT: Here's an example of how to use the API.
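A minimal sketch of the DataTable overload (the connection string, destination table name, and batch settings are illustrative placeholders, not from the original answer):

using System.Data;
using System.Data.SqlClient;

// Streams an in-memory DataTable into SQL Server in one bulk operation.
static void BulkInsert(DataTable table, string connectionString)
{
    using (var bulkCopy = new SqlBulkCopy(connectionString))
    {
        bulkCopy.DestinationTableName = "dbo.MyTable"; // placeholder
        bulkCopy.BatchSize = 5000;       // rows sent to the server per batch
        bulkCopy.BulkCopyTimeout = 60;   // seconds; the default (30) may be tight for 100,000 rows

        // Map columns by name in case the DataTable's column order differs from the table's.
        foreach (DataColumn col in table.Columns)
            bulkCopy.ColumnMappings.Add(col.ColumnName, col.ColumnName);

        bulkCopy.WriteToServer(table);
    }
}

For 100,000 rows this is usually far faster than row-by-row inserts, because the rows are streamed with the bulk-load protocol instead of issued as 100,000 separate commands.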

Robert C. Barth
Thanks, that is EXACTLY what I was looking for!
Rob3C
+1  A: 

We use XML for large data batches. Pass an XML string into a stored procedure and have the sproc shred the XML out into a temp table; then select into your destination table from the temp table. It seems to work pretty well for us.

Edit: As noted in the comments below, the system procedure that parses the XML is sp_xml_preparedocument; OPENXML then exposes the parsed document as a rowset:

declare @idoc int
exec sp_xml_preparedocument @idoc output, @input_xml
-- shred into a temp table; the xpath and column list are placeholders (flag 2 = element-centric XML)
select FirstCol, SecondCol into #staging
from openxml(@idoc, '/rows/row', 2)
     with (FirstCol varchar(50), SecondCol int)
exec sp_xml_removedocument @idoc
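On the client side the call might look like this sketch (the stored procedure name usp_BulkInsertFromXml and the parameter name are assumptions for illustration, not from the original answer):

using System.Data;
using System.Data.SqlClient;
using System.IO;

// Serializes the DataTable as element-centric XML and passes it to the sproc.
static void SendAsXml(DataTable table, string connectionString)
{
    string xml;
    using (var writer = new StringWriter())
    {
        table.WriteXml(writer);   // requires table.TableName to be set
        xml = writer.ToString();
    }

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("usp_BulkInsertFromXml", conn)) // hypothetical sproc name
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddWithValue("@input_xml", xml);
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}

One thing to watch: a 100,000-row, 40-column document makes a very large string, so this approach trades round trips for memory use and parse time on the server.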
Wes P
You know SQL Server can directly read the XML into a table, right? There's no need to have a stored procedure that parses the XML. Look at sp_xml_preparedocument.
Robert C. Barth
sp_xml_preparedocument is what we use. But I thought that the 'sp' prepended on that stood for stored procedure, therefore suggesting that it is indeed a stored procedure. I just couldn't remember the name right off hand. Thanks for the clarification.
Wes P
A: 

Pinal Dave had an odd method of performing a bulk insert that I read about just the other day:

INSERT INTO MyTable (FirstCol, SecondCol)
SELECT 'First', 1
UNION ALL
SELECT 'Second', 2
UNION ALL
SELECT 'Third', 3
...

Considering your large data set, it might be worth trying just to see what happens.

DavGarcia
This is OK for a smaller number of rows, but a bulk load would be faster for 100,000 rows.
ConcernedOfTunbridgeWells
A: 
  • Display the data in a grid in your WinForms app.
  • Copy the data into Excel (or OpenOffice Calc).
  • Save the file.
  • Use the SQL Server Import and Export Wizard to pull the data in and map the fields.

You'd be done inside of 15 minutes.

Edit: ugh, this might not work well for 100k rows. But it's quick, easy and worth a shot before you go to something more complex.

jcollum