We have an in-memory DataTable in a CLR procedure. After significant processing the DataTable holds a lot of data that we need to load into a table in the database. Unfortunately, since we are using a context connection, SqlBulkCopy will not work; it throws the error "The requested operation is not available on the context connection" (see the outline of context connection restrictions).

We are thinking of opening an additional regular connection and using it to perform the SqlBulkCopy. This seems rather silly given the overhead of a regular connection compared to a context connection. We've also thought about iterating over the rows in the DataTable and inserting a record for each, but the table is huge and SqlBulkCopy seems like the better fit. A sketch of the regular-connection approach follows.
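
For what it's worth, here is a minimal sketch of that approach: a regular (non-context) SqlConnection opened inside the CLR procedure and handed to SqlBulkCopy. The connection string, the dbo.TargetTable destination, and the BuildResults helper are placeholders for our actual setup, and as far as we understand a non-context connection requires the assembly to be deployed with PERMISSION_SET = EXTERNAL_ACCESS.

    using System.Data;
    using System.Data.SqlClient;
    using Microsoft.SqlServer.Server;

    public partial class StoredProcedures
    {
        [SqlProcedure]
        public static void BulkLoadResults()
        {
            DataTable results = BuildResults(); // stand-in for the significant processing

            // A regular (non-context) connection; the assembly must be deployed
            // with PERMISSION_SET = EXTERNAL_ACCESS for this to be allowed.
            using (SqlConnection conn = new SqlConnection(
                "Data Source=(local);Initial Catalog=MyDb;Integrated Security=SSPI"))
            {
                conn.Open();
                using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
                {
                    bulk.DestinationTableName = "dbo.TargetTable"; // assumed table name
                    bulk.BatchSize = 5000;                         // tune for the data volume
                    bulk.WriteToServer(results);                   // one bulk load, not row-by-row
                }
            }
        }

        private static DataTable BuildResults()
        {
            // Stand-in for the real processing that fills the DataTable.
            return new DataTable();
        }
    }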

We are not married to the DataTable and are just looking for the best method of inserting a huge amount of data into the database from a CLR procedure. A final note on the DataTable: it mirrors an underlying database table but is not tied to that table through a DataSet or TableAdapter; the means of wiring that up inside a CLR procedure were not immediately obvious. Doing so and using AcceptChanges or Update would be an acceptable solution, roughly along the lines of the sketch below.
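
In case it helps frame answers, this is roughly what we imagine that adapter route looking like: a SqlDataAdapter over the context connection whose Update call inserts every row still in the Added state. The dbo.TargetTable name is a placeholder, and we have not verified this performs any better, since Update still issues one INSERT per row unless UpdateBatchSize is configured.

    using System.Data;
    using System.Data.SqlClient;

    // Push the DataTable through the context connection with a SqlDataAdapter.
    // Update() issues an INSERT for every row in the Added state, so the
    // DataTable must not have had AcceptChanges called on it yet.
    static void InsertViaAdapter(DataTable results)
    {
        using (SqlConnection conn = new SqlConnection("context connection=true"))
        {
            conn.Open();
            SqlDataAdapter adapter = new SqlDataAdapter(
                "SELECT * FROM dbo.TargetTable", conn); // assumed table name
            using (SqlCommandBuilder builder = new SqlCommandBuilder(adapter))
            {
                adapter.InsertCommand = builder.GetInsertCommand();
                adapter.Update(results); // one INSERT per Added row
            }
        }
    }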

+1  A: 

I'd estimate SqlBulkCopy to be 10-100 times faster than row-by-row inserts, depending on table size, so the trade-off is well worth it. For comparison, the row-by-row alternative amounts to something like the sketch below.
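
A rough baseline, with the two columns (Id, Payload) invented purely for illustration: every row pays for its own command execution, which is where that 10-100x gap comes from.

    using System.Data;
    using System.Data.SqlClient;

    // Row-by-row inserts over the context connection: no second connection
    // needed, but each row costs a full command execution.
    static void InsertRowByRow(DataTable results)
    {
        using (SqlConnection conn = new SqlConnection("context connection=true"))
        {
            conn.Open();
            using (SqlCommand cmd = new SqlCommand(
                "INSERT INTO dbo.TargetTable (Id, Payload) VALUES (@id, @payload)", conn))
            {
                SqlParameter id = cmd.Parameters.Add("@id", SqlDbType.Int);
                SqlParameter payload = cmd.Parameters.Add("@payload", SqlDbType.NVarChar, 4000);
                foreach (DataRow row in results.Rows)
                {
                    id.Value = row["Id"];           // hypothetical columns
                    payload.Value = row["Payload"];
                    cmd.ExecuteNonQuery();          // one execution per row
                }
            }
        }
    }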

Seeing as you're doing 'significant processing' on your DataTable, which no doubt takes time and resources, the small additional overhead of using a full connection instead of a context connection, or of opening a completely new connection alongside it, is a price well worth paying.

MrTelly
Really the DataTable is just acting as a convenient container for records after the significant processing. That said, sending the raw DataTable to SqlBulkCopy for record insertion seemed like a slam dunk until we ran into the problem with our connection type. The more I think about it, the more I agree that opening a regular connection is probably the way to go.
ahsteele