views: 305
answers: 5
In SSIS 2005 I used the Import/Export wizard to create a package that drops/recreates and replaces the data in some tables between my production server and development machine. The control flow that the wizard created was extremely complicated, so I created a new package and used the "Transfer SQL Server Objects Task", which is much easier to configure and set up than the crazy thing the wizard produced. The problem is that the package I created takes over 3 minutes to run, while the wizard version takes about 20 seconds. They are basically doing the same thing, so why is there such a difference in execution time, and is there a setting I can change in the package that uses the Transfer Objects task to make it run quicker?

Here is the package that the wizard created. I have created similar packages before using the wizard that I had no problem editing, but I have never seen anything like this before. I cannot figure out where to modify the tables and schema that I drop and create.

[screenshot: the wizard-generated package control flow]

Here are the properties of the transfer task inside that For Loop container:

[screenshot: properties of the TransferTask object]

+1  A: 

What connection type are you using?

When I've wanted to transfer between Oracle and SQL Server here, the ADO.NET provider has been miles slower than the Oracle OLE DB provider.

James Wiseman
The only connection option I have when using that task is an SMO connection, and I am attempting to transfer objects between 2 SQL Server instances. I have OLE DB connections listed in the Connection Manager because that is what I thought I could use at first, but I couldn't choose them.
Breadtruck
+1  A: 

Why not use the wizard-generated package and figure out what it does? It is obviously doing things very efficiently.

Sam
See my updated picture of the package. Tell me where I can dig in and modify the tables I want to drop and re-create, because I cannot find it. The Transfer Objects task that runs slow is easy to configure, but it is slower than a snail.
Breadtruck
+1  A: 

Could be quite a number of things. Are you doing lookups? If so, use joins instead. You can also run a SQL Profiler trace to see what the crazy package does as opposed to your custom package.
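
To illustrate the lookup-versus-join point, a row-by-row Lookup against a reference table can usually be folded into the source query itself. This is only a sketch; the table and column names are hypothetical:

    -- Instead of resolving CustomerKey row by row in an SSIS Lookup,
    -- resolve it once in the source query with a join (names are made up).
    SELECT o.OrderId,
           o.OrderDate,
           c.CustomerKey
    FROM dbo.Orders AS o
    JOIN dbo.DimCustomer AS c
        ON c.CustomerId = o.CustomerId;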

RailRhoad
No lookups; I am just running the Transfer Objects task, which drops and recreates a couple of tables and copies over less than 2000 rows for approximately 10 tables.
Breadtruck
+1  A: 

I don't use the wizard, but could it have created a stored procedure that actually does the work? That would explain why it is faster, since the stored procedure can do all the work within the database.

I am curious what is within TransferTask, as that seems to be where all the work is done.

You could look at exporting the data to a flat file, then using a Bulk Import to do this faster.
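
If you try the flat-file route, a minimal sketch of the idea might look like the following; the server, database, table, and file names are placeholders, and it assumes the destination table already exists:

    -- Export on the source server first with bcp in native format, e.g.:
    --   bcp MyDb.dbo.MyTable out C:\export\MyTable.dat -n -S PRODSERVER -T
    -- Then load the file on the destination in one minimally logged operation:
    BULK INSERT dbo.MyTable
    FROM 'C:\export\MyTable.dat'
    WITH (
        DATAFILETYPE = 'native',  -- matches the -n (native) bcp export
        TABLOCK,                  -- table lock allows minimal logging
        BATCHSIZE = 10000         -- commit in batches instead of per row
    );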

For some more thoughts about how fast things can go, look here; the most important part is some of the comments that were given, such as the one about how he used Bulk Insert incorrectly.

http://weblogs.sqlteam.com/mladenp/articles/10631.aspx

UPDATE: You may also want to look at this: http://blogs.lessthandot.com/index.php/DataMgmt/DBAdmin/title-12 as, toward the end, he shows how long his tests took, but the first comment may be the most useful part for speeding up your import.

James Black
I uploaded the properties window for that object. I can see that it is getting a variable "CurrentTableNode" for the TableMetaDataNodeVariableName property, which I can only guess comes from the smo_pubs connection or the inner package, but when I click on the properties of those connections nothing makes sense to me.
Breadtruck
+1  A: 

This class of performance problem usually stems from "commit" levels and logging.

The illustrated wizard-generated task does a "start transaction" before entering the loop and commits after all the data is transferred, which is the best thing to do if the table is not enormous.

Have you left "autocommit" on in your hand-coded version?
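
To illustrate the difference in plain T-SQL (dbo.MyTable and the values are just placeholders): in autocommit mode every statement is its own transaction and forces its own log flush, whereas one explicit transaction around the whole load commits once, which is roughly what the wizard package is doing.

    -- Autocommit: each INSERT is its own transaction (one log flush per row).
    INSERT INTO dbo.MyTable (Id, Name) VALUES (1, 'a');
    INSERT INTO dbo.MyTable (Id, Name) VALUES (2, 'b');
    -- ... ~2000 more individual commits

    -- One explicit transaction around the whole load: a single commit at the end.
    BEGIN TRANSACTION;
    INSERT INTO dbo.MyTable (Id, Name) VALUES (1, 'a');
    INSERT INTO dbo.MyTable (Id, Name) VALUES (2, 'b');
    -- ... rest of the rows
    COMMIT TRANSACTION;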

James Anderson
I couldn't find any "autocommit" type options for the Transfer SQL Server Objects Task, nor in the SMO connection managers.
Breadtruck