I've got about 100 million rows that I'm moving in SSIS 2008 via a Data Flow Task. It's pretty much a straight table data copy using a Multicast. My question is this:

Using the OLE DB Destination Editor I have two options: Rows per batch and Maximum insert commit size. What are good settings for these? The only guidance I've found is a recommendation to set Maximum insert commit size to 2147483647 instead of 0, and then tune both settings through testing. I'm curious to find out if anyone has discovered anything useful in their own management of these values.

+4  A: 

There is no single best value; it depends greatly on the design of the database, the number of users, the kind of hardware you are operating on, etc. That is why you need to test for yourself on your system.

HLGEM
It's unfortunate there's no rule of thumb, so I will have to measure it.
Nissan Fan
+2  A: 

I find this useful to guide me: Top 10 SQL Server Integration Services Best Practices

Simply because I don't use SSIS enough. However, as HLGEM said, you'll just have to give it a shot yourself and see what happens...

gbn
+1  A: 

If you leave the settings at their defaults, all 100 million rows will be inserted in a single transaction before the batch is committed, which can cause your transaction log to grow very large. If you want to keep the log in check, pick a commit size lower than the default.
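To make the trade-off concrete, here is a minimal Python sketch (not SSIS itself; the function name and the 1M commit size are illustrative assumptions) showing how Maximum insert commit size bounds the largest single transaction the log must accommodate:

```python
def commit_batches(total_rows, max_insert_commit_size):
    """Split a load into commit-sized batches.

    A commit size of 0 means 'commit everything at the end',
    so the transaction log must hold every row at once.
    """
    if max_insert_commit_size == 0:
        return [total_rows]  # one giant transaction
    full, rem = divmod(total_rows, max_insert_commit_size)
    return [max_insert_commit_size] * full + ([rem] if rem else [])

# Commit size 0: all 100M rows sit in one transaction.
print(commit_batches(100_000_000, 0))          # [100000000]

# A 1M-row commit size caps log headroom at ~1M rows per transaction,
# at the cost of 100 separate commits.
batches = commit_batches(100_000_000, 1_000_000)
print(len(batches), max(batches))              # 100 1000000
```

Smaller commit sizes trade extra commit overhead for a bounded transaction log, which is usually the right trade at this row count.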

Josef Richberg
