I have a raw file that contains 2 million rows: an ID and a text column.

What I want to do is use this raw file to update a table on our live database. The problem is that I want this to happen slowly, in batches, as I don't want to impact the live database too much while the update is running.

The process will need to open the raw file using a Data Flow Task and then update the table one row at a time, matching on the ID. Can I get a Data Flow Task to pause in some way?

A: 

This might help:

http://toddmcdermid.blogspot.com/2009/07/pause-task-for-ssis.html

Cheers, Raj

Raj
That's a custom component :(
Faiz
Thanks, but I need to pause within a DFT
Coolcoder
+1  A: 

I think putting a Script Component (transformation) in the data flow, between the text source and the OLE DB destination, which puts the thread to sleep (say, five minutes after execution starts), might help.
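A rough sketch of that approach (assuming SSIS 2008's C# Script Component and the designer-generated Input0Buffer/UserComponent classes; this variant sleeps after every batch of rows rather than once at a fixed time, which matches the batching the question asks for, and the 10,000-row / 5-second values are purely illustrative):

    using System;
    using System.Threading;
    using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
    using Microsoft.SqlServer.Dts.Runtime.Wrapper;

    [Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
    public class ScriptMain : UserComponent
    {
        private int rowsSeen;

        // Called once per row flowing from the flat file source
        // toward the OLE DB destination.
        public override void Input0_ProcessInputRow(Input0Buffer Row)
        {
            rowsSeen++;

            // After each batch of rows, sleep so the updates trickle
            // into the live table instead of arriving in one sustained
            // burst. Sleeping here blocks the pipeline, which is the
            // intended throttling effect.
            if (rowsSeen % 10000 == 0)
            {
                Thread.Sleep(TimeSpan.FromSeconds(5));
            }
        }
    }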

Faiz
A: 

Have you tested the speed of just doing the import, and load-tested the effect on multiple users while it is running? I've done imports of much larger and more complicated record sets than that in DTS (which is slower than SSIS) without affecting the users at all. SSIS imports tend to be really fast, and the data structure you describe should not take much time at all. And of course you could schedule it during low-usage times.

HLGEM
^^ I agree. SSIS has a much more efficient engine for extracting data from flat file sources. You could just create the package and then schedule it to run when the CPU is idle.
Raj