tags:
views: 552
answers: 1

I have a DataTable which I want to save to a SQLite database table. Here is my dilemma: I don't know which way to go. At most the DataTable would contain 65,000 rows and probably 12 columns.

So, would it be faster to save the DataTable to a CSV file and then bulk insert it into SQLite (which I have no idea how to do), or would it be faster to loop through all the columns to create parameters, and then loop through each individual row in the DataTable to retrieve the values to insert into the database table?

Is there an even better way than what I have listed?

Thanks, Nathan

A: 

Check this question out.

There is a SqlBulkCopy class in the .NET Framework that provides functionality for bulk inserts. Unfortunately it is supported only for SQL Server databases.

However, tweaking a few parameters on your inserts will make them a lot quicker. The biggest win is wrapping all of the inserts in a single transaction and reusing one parameterized command, instead of letting SQLite commit after every row. From what people are reporting, done that way there is not much of a performance hit compared with a true bulk insert.
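As a rough sketch of that idea (using Python's built-in sqlite3 module rather than .NET, with a made-up `items` table and generated rows standing in for the DataTable), a single transaction around parameterized inserts looks like:

```python
import sqlite3

# Generated sample data standing in for the DataTable's rows
# (two columns for brevity; the same pattern scales to 12).
rows = [(i, f"name{i}") for i in range(65000)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")

# One transaction around all inserts: per-row commits are what make
# naive insert loops slow. "with conn" opens a transaction and
# commits once at the end if no exception was raised.
with conn:
    conn.executemany("INSERT INTO items (id, name) VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM items").fetchone()[0]
```

In System.Data.SQLite the equivalent would be a `SQLiteTransaction` around a loop that reuses one prepared `SQLiteCommand`, updating its parameter values per row.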

Mircea Grelus