I am wondering how SSIS deals with transactions with large data sets. I have a 'large' data set of about 150,000 rows, each of which needs to be validated against business rules as part of an ETL from a staging database to the live database.
If any of the records fail their business rules, no records should end up in the live database (i.e. rollback the transaction).
My question is how SSIS handles large transactions - or whether it can at all. Will it insert 149,999 records and then roll the whole lot back if the last record fails its business rules? Or is there a better best practice for performing this type of large data transfer operation?
My current thinking is to process the records within a Sequence Container at the control flow level, with the transaction settings enabled on the container. All validation will be done within the Sequence Container, and the insert will also be done there.
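To clarify the behaviour I'm after, here is a sketch of the equivalent set-based T-SQL I'm comparing the SSIS approach against (the table and column names and the rules are made up for illustration; `THROW` assumes SQL Server 2012 or later):

```sql
BEGIN TRY
    BEGIN TRANSACTION;

    -- Validate first: if ANY staged row breaks a business rule, abort
    -- before a single row reaches the live table.
    IF EXISTS (SELECT 1
               FROM staging.Orders
               WHERE Amount <= 0 OR CustomerId IS NULL)  -- example rules
        THROW 50001, 'Business-rule validation failed; no rows moved.', 1;

    -- All 150,000 rows passed: move the batch in one statement.
    INSERT INTO live.Orders (OrderId, CustomerId, Amount)
    SELECT OrderId, CustomerId, Amount
    FROM staging.Orders;

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;  -- nothing ends up in the live database
    THROW;
END CATCH;
```

Essentially I want to know whether the Sequence Container approach gives me the same all-or-nothing guarantee as the explicit transaction above, or whether it falls over at this row count.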