I need to constantly merge (upsert/delete) data from an ODBC data source into a SQL Server 2008 database; the number of rows varies from one to about 100,000 per run.

What would you recommend as the most efficient approach (using .NET 3.5):

  1. Use SqlBulkCopy into a temp table, then call a stored procedure that runs a MERGE command with the temp table as the source (see the sketch after this list).
  2. Call a stored procedure that has a table-valued parameter, passing the data as a parameter (SqlDbType.Structured) and using the table parameter as the source of the MERGE command. Is the data sent via the table parameter transmitted to the server as a bulk operation? Is it possible and efficient to use it when there are more than 1,000 rows?
  3. Call a stored procedure whose MERGE command uses OPENROWSET(BULK ...) to get the data from the ODBC source (through a linked server?).
  4. Any other way.
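
To make options 1 and 2 concrete, here is a rough, untested C# sketch of what I have in mind; the object names (dbo.CustomerStaging, dbo.MergeCustomers, dbo.MergeCustomersTvp, dbo.CustomerTableType) are placeholders:

    using System.Data;
    using System.Data.SqlClient;

    static class MergeLoader
    {
        // Option 1: bulk copy into a staging table, then run the MERGE proc.
        public static void BulkCopyThenMerge(IDataReader odbcSource, string connectionString)
        {
            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open();

                // Stream rows from the ODBC reader into the staging table.
                using (var bulk = new SqlBulkCopy(conn))
                {
                    bulk.DestinationTableName = "dbo.CustomerStaging";
                    bulk.BatchSize = 5000;     // commit in chunks
                    bulk.BulkCopyTimeout = 0;  // no timeout for large loads
                    bulk.WriteToServer(odbcSource);
                }

                // The proc runs MERGE with the staging table as its source.
                using (var cmd = new SqlCommand("dbo.MergeCustomers", conn))
                {
                    cmd.CommandType = CommandType.StoredProcedure;
                    cmd.ExecuteNonQuery();
                }
            }
        }

        // Option 2: pass the rows as a table-valued parameter (SqlDbType.Structured).
        public static void MergeViaTvp(DataTable rows, string connectionString)
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand("dbo.MergeCustomersTvp", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;

                SqlParameter p = cmd.Parameters.AddWithValue("@Rows", rows);
                p.SqlDbType = SqlDbType.Structured;
                p.TypeName = "dbo.CustomerTableType";  // user-defined table type on the server

                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }

(With an IEnumerable<SqlDataRecord> instead of a DataTable, the TVP rows can be streamed rather than buffered in memory.)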

Thank you!

A: 

This depends somewhat on what the trigger for the import is. If it is going to run on a schedule, every 10 minutes or so, I would use either option 1 or create an SSIS package that does the same thing as option 1. JNK is correct that it would be better to use a permanent staging table instead of a temp table: it avoids tempdb allocation issues, and it allows you to pick the process up from the middle if needed.

In SSIS your workflow would look like this:

  • truncate the staging table
  • create a data flow task to import data from the source into the staging table, handling errors as needed
  • call a stored procedure to merge the data (a sketch follows this list)
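
As a rough illustration of that last step, the merge procedure could look something like this; dbo.Customer, dbo.CustomerStaging, and the column names are placeholders for your own schema:

    -- Sketch only: table and column names are placeholders.
    CREATE PROCEDURE dbo.MergeCustomers
    AS
    BEGIN
        SET NOCOUNT ON;

        MERGE dbo.Customer AS tgt
        USING dbo.CustomerStaging AS src
            ON tgt.CustomerId = src.CustomerId
        WHEN MATCHED THEN
            UPDATE SET tgt.Name  = src.Name,
                       tgt.Email = src.Email
        WHEN NOT MATCHED BY TARGET THEN
            INSERT (CustomerId, Name, Email)
            VALUES (src.CustomerId, src.Name, src.Email)
        WHEN NOT MATCHED BY SOURCE THEN
            DELETE;  -- covers the delete part of your upsert/delete requirement
    END;

Drop the WHEN NOT MATCHED BY SOURCE clause if a run only carries a subset of the rows; otherwise it will delete every target row that isn't in that batch.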

Once the package is created you can schedule it through SQL Server Agent. Make sure to create the package with its ProtectionLevel property set to DontSaveSensitive so you don't get strange encryption errors.

Jason Cumberland