views: 139

answers: 4

I have an application which uses the database (SQL Server) intensively.

As it must have high performance, I would like to know the fastest way to insert a record into the DB. Fastest from the standpoint of execution time.

What should I use?

As far as I know, the fastest way is to create a stored procedure and call it from code (ADO.NET). Please let me know if there is any better way, or whether there are other practices to increase performance.

+2  A: 

A bulk insert would be the fastest, since it is minimally logged; perhaps you can use the SqlBulkCopy class.
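A minimal sketch of the batching side of this idea (Python used for illustration; SqlBulkCopy itself is a .NET class, and the actual bulk-insert call is left abstract here):

```python
# Illustrative sketch: group incoming rows into batches before handing each
# batch to a bulk-insert API (in .NET, SqlBulkCopy.WriteToServer would take
# the role of consuming a batch).

def chunked(rows, batch_size):
    """Yield successive batches of at most batch_size rows."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch

# 10 rows in batches of 4 -> batch sizes 4, 4, 2
sizes = [len(b) for b in chunked(range(10), 4)]
```

Each yielded batch would then be written in one round trip rather than one insert per row.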

SQLMenace
I think he does not understand just how fast SQL Server is. He's asking about the fastest inserts, then using ADO.NET and sprocs. I think he's just worried about things like how best to open connections. I say we need more requirements.
drachenstern
+2  A: 

"It depends".

  • How many rows are you talking about inserting?
  • How frequently will they be inserted?
  • What other database operations will be taking place at the same time?
  • Will the rows be inserted because of user action (clicking a button), or because of some external stimulus?

Based on your update, I think you should consider mechanisms other than simple code. Look into SQL Server Integration Services, which is optimized for bulk database operations. It's possible that what you need is a simple SSIS job that runs periodically to do a bulk insert on all "new" data meeting particular criteria. It would allow modification over time to use things like staging tables or intermediate servers if that should prove necessary.
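As a rough sketch of the periodic "drain new rows" pattern such a job would implement (all table and column names below are hypothetical, and SSIS itself is a package/design tool rather than code; Python is used only to frame the idea):

```python
# Hypothetical illustration of a periodic staging-table drain. A real job
# would wrap the SELECT and the marking step in one transaction to avoid
# races; this sketch only shows the shape of the work.
DRAIN_SQL = """
INSERT INTO dbo.Transactions (Id, Payload, CreatedAt)
SELECT Id, Payload, CreatedAt
FROM dbo.TransactionsStaging
WHERE Processed = 0;

UPDATE dbo.TransactionsStaging SET Processed = 1 WHERE Processed = 0;
"""

def run_periodic_drain(execute_sql):
    """Invoked every N minutes by a scheduler; execute_sql runs the batch."""
    execute_sql(DRAIN_SQL)

calls = []
run_periodic_drain(calls.append)
```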

John Saunders
It will be approximately 2 million rows daily, and it will increase over the coming year. I am developing a gateway, so the rows will be inserted in response to external system actions. The table will mainly be used for storing transactions.
Incognito
@Incognito: another question would be - do the inserts need to occur in real-time. In particular, how much of a delay can you accept between the time of the request for a row to be inserted and the actual insertion?
John Saunders
We expect them to be real-time, although a 5-10 minute delay can be acceptable.
Incognito
Of course we could cache some requests and run them in a batch, but in that case there would be a risk of losing data if the service crashed unexpectedly.
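The trade-off being described can be sketched as follows (Python for illustration; the flush function stands in for one batched INSERT):

```python
# Illustrative sketch: an in-memory buffer that flushes batches at a size
# threshold. Anything still buffered when the process dies is lost, which
# is exactly the crash risk mentioned above.

class InsertBuffer:
    def __init__(self, flush_size, flush_fn):
        self.flush_size = flush_size
        self.flush_fn = flush_fn  # e.g. a function performing one batched INSERT
        self.pending = []

    def add(self, row):
        self.pending.append(row)
        if len(self.pending) >= self.flush_size:
            self.flush()

    def flush(self):
        if self.pending:
            self.flush_fn(self.pending)
            self.pending = []

flushed = []
buf = InsertBuffer(3, flushed.append)
for i in range(7):
    buf.add(i)
# two full batches flushed; one row still sitting at risk in memory
```

A real service would also flush on a timer and on shutdown to narrow the loss window.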
Incognito
Sorry John, can you give a little more detail on SSIS? As far as I know, it is very similar to what we previously knew as DTS. You mentioned that a simple SSIS job could do the bulk insert, but where would it get its input? For SSIS to get data for bulk inserts, that data must be stored somewhere first. Please clarify a little.
Incognito
Actually, if you need near-realtime, then SSIS may not be appropriate.
John Saunders
Ah, you probably meant storing the data in files and then loading them using SSIS. That could also be a solution, if we can somehow avoid file write locking and have the SSIS job run every 10 minutes or so.
Incognito
@Incognito: files, or temporary database tables, yes. Also, if you haven't used SSIS since DTS, forget everything you know about DTS. SSIS is about 1000 times better.
John Saunders
OK, I will check it, although as I understand it this solution will not work due to the file-locking problem I mentioned.
Incognito
Sorry, I just want to summarize: which approach is the "overall winner"? Should I stay with stored procedures, taking into consideration the possibility of batching cached requests into one transaction under heavy load? I know this is subjective, but from your experience...
Incognito
@Incognito: my experience isn't that deep. I only call myself "intermediate" at SQL Server, and then only on even-numbered days. ;-)
John Saunders
I appreciate your modesty. We will go with stored procedures and run tests of the other approaches in the lab to see the practical results. Thank you.
Incognito
A: 

Please let me know if there is any better way, or whether there are other practices to increase performance.

  • Do not open one connection per record. Do learn how connection pooling generally stops you from inadvertently opening one connection per record.
  • If possible, do not open one transaction per record. Also do not leave the transaction open for undue periods of time.
  • Consider table design: narrow tables with few indexes/constraints and no triggers.
  • If you need a fast insert because you're a web application and need to return a page to the user NOW, or you're a WinForms app blocking on the UI thread, consider performing the insert asynchronously or on another thread.
  • If you need a fast insert to import a million line file, consider doing a bulk insert.
  • If all you want to do is store the data, and not to query it... consider using a file-based solution instead.
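The "insert on another thread" suggestion above can be sketched like this (Python for illustration; the insert function is a stand-in for the real database call):

```python
# Illustrative sketch: the request thread enqueues the row and returns
# immediately, while a background worker performs the (simulated) insert.
import queue
import threading

inserted = []

def insert_row(row):
    inserted.append(row)  # stands in for the real parameterized INSERT

q = queue.Queue()

def worker():
    while True:
        row = q.get()
        if row is None:  # sentinel: shut down the worker
            break
        insert_row(row)
        q.task_done()

t = threading.Thread(target=worker)
t.start()
for i in range(5):
    q.put(i)       # returns immediately; the caller never blocks on the insert
q.put(None)
t.join()
```

Note this trades latency for durability: rows still in the queue are lost if the process dies, so it only fits where that is acceptable.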
David B
We are using a connection pool. Unfortunately a transaction must be opened for each call, since once the response is sent to the requester it must be guaranteed that the record was inserted. There are minimal indexes and no triggers on the table. It is a TCP server and inserts are called on the thread pool. Bulk insert will not work, as the insert must be done on each request. What can you suggest as a file-based solution? We usually avoid storing data in files due to blocking issues: the file is locked for each insertion.
Incognito
A: 

Have you done the math? 2M/day = 83k/hour = 1388/min = 23/second.

At 23 inserts per second SQL Server won't break a sweat.
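The arithmetic above checks out:

```python
# Breaking 2M rows/day down per hour, minute, and second.
per_day = 2_000_000
per_hour = per_day // 24            # ~83k/hour
per_minute = per_day // (24 * 60)   # ~1388/min
per_second = per_day / 86_400       # ~23/second
```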

egrunin
That's assuming they come in evenly throughout the day... and I think we can assume they don't.
Joe Philllips
Of course, but even at 10x that rate it's no problem. (I know I'm making some assumptions about his application here.)
egrunin
Yes, I believe SQL Server can handle that. But consider that there will be peak hours, and the service load will increase over time... so we want to design for high load in order to avoid future modifications.
Incognito