views: 46
answers: 2
Hi,

I have to insert a large amount of data into a table. Does SQL Server 2008 (compared to 2005) have any new features to improve performance in this case?

+1  A: 

SQL Server 2008 contains the MERGE T-SQL statement, which can speed up certain types of combined INSERT, UPDATE and DELETE operations.
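For illustration, here is a minimal sketch of issuing a MERGE from C#; the connection string, table names and columns are hypothetical, not from the question:

using System.Data.SqlClient;

// Upsert rows from a staging table into a target table in one statement.
// dbo.TargetTable, dbo.StagingTable, Id and Value are assumed names.
const string mergeSql = @"
    MERGE dbo.TargetTable AS t
    USING dbo.StagingTable AS s
        ON t.Id = s.Id
    WHEN MATCHED THEN
        UPDATE SET t.Value = s.Value
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (Id, Value) VALUES (s.Id, s.Value);";

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand(mergeSql, connection))
{
    connection.Open();
    command.ExecuteNonQuery();
}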

If you intend to do this through code, I would suggest using the System.Data.SqlClient.SqlBulkCopy class (which also works against SQL Server 2005).
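As a rough sketch of the bulk copy route, assuming you have already filled a DataTable named rows and have a connection string; dbo.TargetTable is a hypothetical destination:

using System.Data;
using System.Data.SqlClient;

// Push an in-memory DataTable to the server in one bulk operation.
using (var bulkCopy = new SqlBulkCopy(connectionString))
{
    bulkCopy.DestinationTableName = "dbo.TargetTable";
    bulkCopy.WriteToServer(rows);
}

SqlBulkCopy also exposes a BatchSize property if you want the copy committed to the server in smaller chunks.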

Mitch Wheat
I am not doing it through code. But is SqlBulkCopy faster than a normal database insert? If so, how much faster is it?
Anish
Check The Data Loading Performance Guide http://bit.ly/9Due9T, the SqlBulkCopy class itself http://bit.ly/8Xxu2R and High performance bulk loading to SQL Server using SqlBulkCopy http://bit.ly/a9KZ2l if you are still concerned. But if you do it with T-SQL, then go with MERGE.
Nick Martyshchenko
+1  A: 

I don't know if this is feasible for your problem, but if you can, I would really try to do this in code.

I had a similar requirement on a big project in the past that needed to import 15 years' worth of production data into a new schema (on SQL Server 2005). System.Data.SqlClient.SqlBulkCopy was by far the fastest option.

If you do go this way, I suggest doing inserts in batches of roughly 1 GB at a time, then manually invoking the .NET GC to free the in-memory table. I was forced to do both of these things to avoid running into memory errors (on a 32-bit system, though).

Edit - Pseudocode for my solution was something like:

using System;
using System.Data;
using System.Data.SqlClient;

// Names like oldConnectionString, newConnectionString, OldTable and
// NewTable are placeholders from memory, not the real schema.
DataTable dataToInsert = new DataTable();
dataToInsert.Columns.Add("Id", typeof(int));
dataToInsert.Columns.Add("Timestamp", typeof(DateTime));
dataToInsert.Columns.Add("Value", typeof(int));

using (var oldConnection = new SqlConnection(oldConnectionString))
using (var newConnection = new SqlConnection(newConnectionString))
{
    oldConnection.Open();
    newConnection.Open();

    var sqlCommand = new SqlCommand("SELECT * FROM OldTable", oldConnection);
    using (SqlDataReader dataFromOldSystem = sqlCommand.ExecuteReader())
    {
        while (dataFromOldSystem.Read())
        {
            // I had to modify/transpose the row from the old table in some way:
            // each old row becomes three rows in the new schema
            dataToInsert.Rows.Add(dataFromOldSystem.GetInt32(0), dataFromOldSystem.GetDateTime(1), dataFromOldSystem.GetInt32(2));
            dataToInsert.Rows.Add(dataFromOldSystem.GetInt32(0), dataFromOldSystem.GetDateTime(1), dataFromOldSystem.GetInt32(3));
            dataToInsert.Rows.Add(dataFromOldSystem.GetInt32(0), dataFromOldSystem.GetDateTime(1), dataFromOldSystem.GetInt32(4));

            // check if the number of rows is over some magic number that is below the memory limit;
            // you can check the private bytes in use by your app to help guess this number
            if (dataToInsert.Rows.Count > 1000000)
            {
                using (var bulkCopier = new SqlBulkCopy(newConnection))
                {
                    bulkCopier.DestinationTableName = "NewTable";
                    bulkCopier.WriteToServer(dataToInsert);
                }

                // drop the batch and ask the GC to reclaim the memory
                dataToInsert.Clear();
                GC.Collect();
            }
        }
    }

    // flush the final partial batch
    if (dataToInsert.Rows.Count > 0)
    {
        using (var bulkCopier = new SqlBulkCopy(newConnection))
        {
            bulkCopier.DestinationTableName = "NewTable";
            bulkCopier.WriteToServer(dataToInsert);
        }
    }
}
}
Coxy
You mean... call SqlBulkCopy in a loop and call the GC to free memory? If so, how can I specify the size of the bulk copy as 1 GB? Do you have any sample code for the same?
Anish
I'm afraid I've moved jobs since then, so I don't have sample code. In my case, I used trial and error and some educated guesses to do it.
Coxy
I've added some pseudocode which is the basic idea behind the app I developed.
Coxy
Thank you very much, Coxy
Anish