I have an SSIS package with a task that launches an out-of-process application to bulk-insert data into one table. The problem is that when multiple files arrive at the same time, multiple instances of the out-of-process application run concurrently, and the inserts fail.

Can SQL Server Service Broker queue the data to be inserted? Does SQL Server or SSIS have any mechanism for handling concurrent insertions reliably?

Thanks.

+1  A: 

It sounds like you're getting the timeout issues because the first run of the SSIS package is locking the table and all other running copies of the package are waiting for the lock to be released.

There are a couple of things you can do to confirm this. First, open a query window in SQL Server Management Studio (SSMS) and, while the problem is occurring, execute EXEC sp_who2. The results include a BlkBy column, which contains the SPID of the process that is blocking the selected process. You'll probably see that one instance of your package is blocking all the others.
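If you'd rather query for blocking directly than scan the sp_who2 output, the blocking DMVs give the same information. A sketch, assuming you have VIEW SERVER STATE permission:

```sql
-- List sessions that are currently blocked and who is blocking them.
-- Run this while the packages appear stalled.
SELECT r.session_id,
       r.blocking_session_id,   -- same value sp_who2 shows in its BlkBy column
       r.wait_type,
       r.wait_time,
       t.text AS running_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0;
```

If one package instance holds the table lock, you should see every other instance reporting its SPID in blocking_session_id.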

In the SSIS designer, in the Data Flow task, edit the Destination component. There's a Table Lock checkbox. It is probably checked, which tells the process to lock the table until the data load is complete.

You have a couple of options to address this. First, is it important that one SSIS package must complete loading data before another one can start? If the answer is No, then you can uncheck the Table Lock option in the Destination component. This will allow SQL Server to manage the simultaneous data loads.
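The Table Lock checkbox corresponds to the TABLOCK hint on the underlying bulk load. As a rough T-SQL illustration of the difference (the table and file names here are made up for the example):

```sql
-- With TABLOCK: the whole table is locked for the duration of the load,
-- so a second concurrent load has to wait (and may time out).
BULK INSERT dbo.StagingTable          -- hypothetical table name
FROM 'C:\incoming\file1.dat'          -- hypothetical file path
WITH (TABLOCK);

-- Without TABLOCK: SQL Server takes finer-grained locks, so several
-- loads can proceed at the same time, at some cost in load speed.
BULK INSERT dbo.StagingTable
FROM 'C:\incoming\file2.dat';
```

Unchecking the box in the Destination component is the designer equivalent of dropping the TABLOCK hint.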

If you must let one package complete before other packages can run, then you may want to create an SSIS task that checks to see if the table is available for loading. If the table is being loaded, then stop the SSIS package and recheck later. You could even handle this in your console app.
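One way to implement that "is the table available for loading?" check is an application lock via sp_getapplock, which serializes the loads without depending on table locks. A sketch, assuming the load is wrapped in a stored procedure (the resource name and timeout are arbitrary choices for the example):

```sql
-- Serialize package runs with an application lock.
-- Call this at the start of the load procedure.
DECLARE @result int;

BEGIN TRAN;
EXEC @result = sp_getapplock
        @Resource    = 'StagingTableLoad',  -- hypothetical lock name
        @LockMode    = 'Exclusive',
        @LockOwner   = 'Transaction',
        @LockTimeout = 0;                   -- fail fast instead of waiting

IF @result < 0
BEGIN
    -- Another load is in progress: stop and let the package retry later.
    ROLLBACK TRAN;
    RAISERROR('Table is being loaded; try again later.', 16, 1);
    RETURN;
END

-- ... perform the bulk insert here ...

COMMIT TRAN;  -- committing the transaction releases the application lock
```

The SSIS package (or your console app) can then treat the RAISERROR as the signal to stop and recheck later.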

SQL Server doesn't have a built-in way to do this, and Service Broker sounds like more work than you need.

bobs
@bobs, Thanks! I have checked it. You are right.
Don