views: 95

answers: 4

I am inserting records into the database (100, 1,000, 10,000 and 100,000 rows) using two methods (it is a table with no primary key and no index):

  • using a for loop and inserting the rows one by one
  • using a stored procedure

The times are, of course, better with the stored procedure. My questions are: 1) if I use an index, will the operation go faster, and 2) is there any other way to make the insertion faster?

PS: I am using iBATIS as the ORM, if that makes any difference.
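For reference, this is roughly what the one-by-one loop described above looks like; it's only a sketch using plain ADO.NET rather than the iBATIS mapping layer, and the table and column names are made up:

```csharp
using System.Data.SqlClient;

class RowByRowInsert
{
    static void Insert(int rowCount, string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            for (int i = 0; i < rowCount; i++)
            {
                // One round trip to the server per row -- this is what makes the loop slow.
                using (var command = new SqlCommand(
                    "INSERT INTO dbo.MyTable (Id, Value) VALUES (@id, @value)", connection))
                {
                    command.Parameters.AddWithValue("@id", i);
                    command.Parameters.AddWithValue("@value", "row " + i);
                    command.ExecuteNonQuery();
                }
            }
        }
    }
}
```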

+2  A: 

No, I suspect that, if you use an index, it will actually go slower. That's because it has to update the index as well as insert the data.

If you're reasonably certain that the data won't have duplicate keys, add the index after you've inserted all the rows. That way, it's built once rather than being added to and re-balanced on every insert.

How much that helps is a function of the DBMS. I know it's true for the one I use frequently (which is not SQL Server).
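A minimal sketch of that approach against SQL Server with ADO.NET (the table, column and index names are made up):

```csharp
using System.Data.SqlClient;

class IndexAfterLoad
{
    static void LoadThenIndex(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            // ... run all of the inserts first, while the table has no index ...

            // Then build the index once, over the fully loaded data.
            using (var command = new SqlCommand(
                "CREATE INDEX IX_MyTable_Id ON dbo.MyTable (Id)", connection))
            {
                command.ExecuteNonQuery();
            }
        }
    }
}
```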

paxdiablo
+1  A: 

I know this is slightly off-topic, but it's a shame you're not using SQL Server 2008, as there's been a massive improvement in this area with the advent of the MERGE statement and user-defined table types (which let you pass a 'table' of data to a stored procedure or statement, so you can insert/update many records in one go).

For some more information, have a look at http://www.sql-server-helper.com/sql-server-2008/merge-statement-with-table-valued-parameters.aspx
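To give a rough idea of the client side of that: the sketch below passes a table-valued parameter to a stored procedure, assuming a user-defined table type and a procedure that MERGEs from it already exist on the server; all of the names here (dbo.RecordTableType, dbo.UpsertRecords, @Records) are made up:

```csharp
using System.Data;
using System.Data.SqlClient;

class TvpUpsert
{
    static void UpsertAll(DataTable rows, string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.UpsertRecords", connection)) // hypothetical proc that MERGEs
        {
            command.CommandType = CommandType.StoredProcedure;

            // Pass the whole DataTable as a single table-valued parameter.
            var parameter = command.Parameters.AddWithValue("@Records", rows);
            parameter.SqlDbType = SqlDbType.Structured;
            parameter.TypeName = "dbo.RecordTableType"; // hypothetical user-defined table type

            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
```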

Paul Suart
+4  A: 

Check out SqlBulkCopy.

It's designed for fast insertion of bulk data. I've found it to be fastest when using the TableLock option and setting a BatchSize of around 10,000, but it's best to test the different scenarios with your own data.
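For illustration, a minimal use of it with the TableLock option and a BatchSize of 10,000 as mentioned above (the connection string and table name are placeholders):

```csharp
using System.Data;
using System.Data.SqlClient;

class BulkLoader
{
    static void Load(DataTable rows, string connectionString)
    {
        // TableLock takes a bulk update lock on the target table, which
        // usually gives the best throughput when nothing else is using it.
        using (var bulkCopy = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock))
        {
            bulkCopy.DestinationTableName = "dbo.MyTable"; // placeholder table name
            bulkCopy.BatchSize = 10000;                    // rows sent to the server per batch
            bulkCopy.WriteToServer(rows);
        }
    }
}
```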

You may also find the following useful.

SQLBulkCopy Performance Analysis

Winston Smith
A: 

This was already discussed: Insert data into SQL Server with best performance.

Incognito
Please leave this in the comments if you think it's a dup.
the_drow