I'm using MySQL Connector/NET to copy data from MySQL servers to SQL Server 2008.

Has anyone experienced better performance using one of the following, versus the other?

  • A DataAdapter, calling Fill into a DataTable in chunks of 500 rows
  • DataReader.Read into a DataTable in a loop, 500 rows at a time

I then use SqlBulkCopy to load the 500-row DataTable, and continue looping until the MySQL record set is completely transferred.
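For reference, the reader-loop variant I'm testing looks roughly like this (connection strings, query, and table names are placeholders):

    using System.Data;
    using System.Data.SqlClient;
    using MySql.Data.MySqlClient;

    static void CopyInChunks(string mySqlConnStr, string sqlConnStr)
    {
        using (var source = new MySqlConnection(mySqlConnStr))
        {
            source.Open();
            var cmd = new MySqlCommand("SELECT id, name FROM source_table", source);

            using (var reader = cmd.ExecuteReader())
            using (var bulk = new SqlBulkCopy(sqlConnStr) { DestinationTableName = "dbo.DestTable" })
            {
                // Build an empty DataTable matching the reader's schema.
                var buffer = new DataTable();
                for (int i = 0; i < reader.FieldCount; i++)
                    buffer.Columns.Add(reader.GetName(i), reader.GetFieldType(i));

                while (reader.Read())
                {
                    var row = buffer.NewRow();
                    for (int i = 0; i < reader.FieldCount; i++)
                        row[i] = reader[i];
                    buffer.Rows.Add(row);

                    if (buffer.Rows.Count == 500)
                    {
                        bulk.WriteToServer(buffer); // push the chunk to SQL Server
                        buffer.Clear();             // drop it before filling the next one
                    }
                }

                if (buffer.Rows.Count > 0)
                    bulk.WriteToServer(buffer);     // flush the final partial chunk
            }
        }
    }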

My main concerns are keeping memory usage reasonable and completing the transfer quickly.

Any help would be appreciated!

+1  A: 

I have not used SqlBulkCopy, but as a general rule of thumb, DataReaders typically offer better performance.

A DataReader can be processed while the underlying query is still returning records, so you don't have to wait for the query to finish before you can start processing data. It returns data as soon as it is available and, I believe, holds only the active record in memory by default (not the complete result set), which keeps memory usage low.

A DataAdapter loads the full result set into a DataTable/DataSet, which carries higher overhead because of how the data is stored in memory and the additional associated state (think RowState, etc.).

If I am only reading data, I will always use a DataReader over a DataAdapter... someone please correct me if I am wrong on any point.

Regardless, SqlBulkCopy appears to only iterate over the records and does not use the DataTable for any optimizations (according to Reflector), so I think DataReader is your best bet.
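In fact, since SqlBulkCopy.WriteToServer has an overload that accepts an IDataReader, you may be able to skip the intermediate DataTable entirely and stream straight from the MySQL reader. An untested sketch (connection strings and names are placeholders):

    using System.Data.SqlClient;
    using MySql.Data.MySqlClient;

    static void StreamCopy(string mySqlConnStr, string sqlConnStr)
    {
        using (var source = new MySqlConnection(mySqlConnStr))
        {
            source.Open();
            var cmd = new MySqlCommand("SELECT id, name FROM source_table", source);

            using (var reader = cmd.ExecuteReader())  // forward-only, one record at a time
            using (var bulk = new SqlBulkCopy(sqlConnStr))
            {
                bulk.DestinationTableName = "dbo.DestTable";
                bulk.WriteToServer(reader);  // pulls rows from the reader as it writes
            }
        }
    }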

Chris
+2  A: 

I've used SqlBulkCopy with a DataReader when processing large amounts of data. I've found the process to be quite efficient in terms of both speed and memory usage, since the entire result set is not retrieved before copying begins. I recommend setting the BatchSize property to some reasonable value, say 500 in your case.
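For example (a sketch; the destination table name is a placeholder, and reader is the source MySqlDataReader from your SELECT):

    using System;
    using System.Data;
    using System.Data.SqlClient;

    static void BulkLoad(IDataReader reader, string sqlConnStr)
    {
        using (var bulk = new SqlBulkCopy(sqlConnStr))
        {
            bulk.DestinationTableName = "dbo.DestTable"; // placeholder
            bulk.BatchSize = 500;      // commit to SQL Server every 500 rows
            bulk.BulkCopyTimeout = 0;  // 0 = no timeout, for long-running copies
            bulk.NotifyAfter = 5000;   // raise SqlRowsCopied every 5000 rows
            bulk.SqlRowsCopied += (s, e) =>
                Console.WriteLine("{0} rows copied so far", e.RowsCopied);
            bulk.WriteToServer(reader);
        }
    }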

Garett