Edit:
My problem is not a problem anymore: I redid my performance tests and found I had made a fatal, stupid mistake: I had forgotten a ×1000 factor when converting between milliseconds and seconds :/ Sorry about that, guys.
For info:
- I get about 1,900 updates per second from my PC to the database server over the local network.
- 3,200 updates per second if the program runs on the same machine as the DB.
- 3,500 updates per second from my PC to the database server when I do not re-create and re-open a new SqlConnection each time.
- 5,800 updates per second with a batched SQL text. For my 10,000 rows, taking 5 seconds is fine for my programs. Sorry to have worried you.

Currently, I use a SQL stored procedure to create a row in my database, to avoid SQL injection. In C# I have the following method:

public void InsertUser(string userCode)
{
   using (SqlConnection sqlConnection = new SqlConnection(this.connectionString))
   using (SqlCommand sqlCommand = new SqlCommand("InsertUser", sqlConnection))
   {
      sqlCommand.CommandType = System.Data.CommandType.StoredProcedure;
      sqlCommand.Parameters.Add(new SqlParameter("@UserCode", userCode));
      sqlConnection.Open();
      sqlCommand.ExecuteNonQuery(); // 0.2 ms here, NOT 0.2 s! (see edit above)
   }
}

It works great when I have one or two rows to insert. But if I need to create 1,000 users, 10,000 products and 5,000 pets, it is not the best solution: I will lose a huge amount of time in network transport.

I believe, without having checked it, that only a limited number of callbacks can be pending at once. So I do not want to call the following 10,000 times:

sqlCommand.BeginExecuteNonQuery()

Another way would be to build one big batch of SQL text, but there is a SQL injection risk (and it is ugly).

Is there a 'SqlCommandList' object in .NET that manages this? How should I do large writes to the database? What is the good pattern for that?

A: 

Have you considered passing an XML document to a stored procedure, then iterating through that to find the data to insert?
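
In case it helps, a rough sketch of what that could look like on the C# side (the proc name InsertUsersXml, its @UserXml parameter, and the T-SQL in the trailing comment are all assumptions, not from the question):

// Hypothetical sketch: send every user code in one XML parameter, one round trip.
// Needs System.Data, System.Data.SqlClient, System.Security and System.Text.
public void InsertUsers(IEnumerable<string> userCodes)
{
   StringBuilder xml = new StringBuilder("<Users>");
   foreach (string code in userCodes)
      xml.AppendFormat("<User Code=\"{0}\"/>", SecurityElement.Escape(code));
   xml.Append("</Users>");

   using (SqlConnection sqlConnection = new SqlConnection(this.connectionString))
   using (SqlCommand sqlCommand = new SqlCommand("InsertUsersXml", sqlConnection))
   {
      sqlCommand.CommandType = System.Data.CommandType.StoredProcedure;
      sqlCommand.Parameters.Add("@UserXml", SqlDbType.Xml).Value = xml.ToString();
      sqlConnection.Open();
      sqlCommand.ExecuteNonQuery(); // one round trip for all rows
   }
}

// Inside the proc, something like this would shred the XML (SQL 2005+):
//   INSERT INTO Users (UserCode)
//   SELECT x.value('@Code', 'varchar(50)')
//   FROM @UserXml.nodes('/Users/User') AS T(x);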

Danimal
+2  A: 

Personally, if I regularly expect to do fairly large inserts (10,000 rows would definitely qualify...), I might consider having a separate table for incoming data, and use SqlBulkCopy to populate this table. Then you just execute a single stored procedure that moves the data over into the real table.
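
For illustration, a rough sketch of that staging-table idea (the names UserStaging and MoveStagedUsers are invented here):

// Hypothetical sketch: bulk-load a staging table, then one proc moves the rows over.
public void BulkInsertUsers(IEnumerable<string> userCodes)
{
   // Build an in-memory table matching the staging table's single column.
   DataTable table = new DataTable();
   table.Columns.Add("UserCode", typeof(string));
   foreach (string code in userCodes)
      table.Rows.Add(code);

   using (SqlConnection sqlConnection = new SqlConnection(this.connectionString))
   {
      sqlConnection.Open();
      using (SqlBulkCopy bulkCopy = new SqlBulkCopy(sqlConnection))
      {
         bulkCopy.DestinationTableName = "UserStaging";
         bulkCopy.WriteToServer(table);   // one bulk operation instead of N round trips
      }
      using (SqlCommand move = new SqlCommand("MoveStagedUsers", sqlConnection))
      {
         move.CommandType = System.Data.CommandType.StoredProcedure;
         move.ExecuteNonQuery();          // copies staged rows into the real table
      }
   }
}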

Another approach is to send XML down to the database and use SQLXML to parse it (much easier with SQL 2005 and above) - but this puts extra work on the DB server.

Marc Gravell
A: 

I don't think you'll see a big slowdown because of network transport. I imagine all that's being sent over the wire is the SP name and the parameter value: "EXEC InsertUser @UserCode = 'myUserCode'". ADO.NET already does connection pooling, as I found out, so I'm not sure that any other method will give a major increase in speed.

Rob Bell
A: 

"it is not the best solution: I will loose a huge time in netwok transport" Can you live with the loss?

If this is something you don't do often, then does it matter? Measure it first; if it's a problem, then fix it. Personally, I'd probably go with Marc Gravell's staging table for incoming inserts. Another option is to fire the inserts asynchronously, so you're not waiting on each one to finish before you start the next; see the sketch below.
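
For example, a minimal fire-and-forget sketch using the thread pool (no error handling or completion tracking; it reuses the InsertUser method from the question and needs System.Threading):

// Hypothetical sketch: queue each insert as a thread-pool work item.
foreach (string code in userCodes)
{
   string captured = code;      // copy the loop variable for the closure
   ThreadPool.QueueUserWorkItem(delegate
   {
      InsertUser(captured);     // the synchronous method from the question
   });
}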

It took me years, but finally I figured out that I shouldn't waste time optimising code that doesn't need it.

Hope this helps (even though I don't think it will, sorry).

Binary Worrier
+6  A: 

This should run a little faster:

public void InsertUser(IEnumerable<string> userCodes)
{
   using (SqlConnection sqlConnection = new SqlConnection(this.connectionString))
   using (SqlCommand sqlCommand = new SqlCommand("InsertUser", sqlConnection))
   {
      sqlCommand.CommandType = System.Data.CommandType.StoredProcedure;
      SqlParameter param = sqlCommand.Parameters.Add("@UserCode", SqlDbType.VarChar);
      sqlConnection.Open();

      foreach (string code in userCodes)
      {
          param.Value = code;
          sqlCommand.ExecuteNonQuery(); // still one network round trip per insert
      }
   }
}

That will only open one connection and only create one command, even if you pass it 1,000 users. It will still do each insert separately, though. And of course, if userCode isn't a string, you'll want to refactor it appropriately. You may also want to look into SQL Server's BULK INSERT command.

Joel Coehoorn
A: 

If you really were concerned about this, you could (like you said) batch the commands up in strings like so:

var cmd = new SqlCommand();
cmd.Connection = sqlConnection;
for (int i = 0; i < batchSize; i++) {
    cmd.CommandText += String.Format("EXEC InsertUser @UserCode{0};", i);
    cmd.Parameters.AddWithValue("@UserCode" + i.ToString(), XXXXX);
    //... etc ...
}

Because in this scheme you'd be using parameters, you don't have any more risk of SQL injection than if you used a stored proc. But I question whether you'll really save an appreciable amount of time doing this. IMO you should just keep it simple and do it the way you are doing it now.

Dave Markle
There is a LOT of overhead in re-creating the command and constantly opening and closing new connections.
Joel Coehoorn
Also, if you're gonna do it like that, use a StringBuilder to assemble the command.
Joel Coehoorn
Joel, I think you'll find that connection pooling speeds things up considerably. And as for StringBuilder, it's true that it will speed things up and cause less memory usage, but it all depends on how many commands you want in your batch...
Dave Markle
A: 

Based on Joel's answer, this is the fastest solution short of using SqlBulkCopy or building big strings of messy SQL and executing them. (I added a transaction, which improves performance quite a lot.)

public void InsertUser(IEnumerable<string> userCodes)
{
   using (SqlConnection sqlConnection = new SqlConnection(this.connectionString))
   {
      sqlConnection.Open();
      using (SqlTransaction transaction = sqlConnection.BeginTransaction())
      using (SqlCommand sqlCommand = new SqlCommand("InsertUser", sqlConnection))
      {
         sqlCommand.Transaction = transaction;
         sqlCommand.CommandType = System.Data.CommandType.StoredProcedure;
         SqlParameter param = sqlCommand.Parameters.Add("@UserCode", SqlDbType.VarChar);

         foreach (string code in userCodes)
         {
             param.Value = code;
             sqlCommand.ExecuteNonQuery();
         }
         transaction.Commit(); // all inserts committed in one transaction
      }
   }
}
Sam Saffron
Transactions involve locking... if this runs in isolation it will be faster, but if there is other activity on the server it can actually slow things down quite badly. Though I'll admit that doing only inserts is unlikely to result in a conflict.
Joel Coehoorn
A: 

As per some of the answers above, the most noticeable performance increase for the least effort comes from two changes to your existing code:

  1. Wrapping the updates in a transaction
  2. Opening only one connection and calling the procedure multiple times with different parameters

BULK INSERT is an option, but probably overkill for what you want to do.

+1  A: 

What about the UpdateBatchSize property of SqlDataAdapter?

Our front-end guys use this to batch tens of thousands of proc calls into chunks.

Article

MSDN

Our environment disallows "bulkadmin" rights, so we can't use BULK INSERT/bcp etc.
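
A rough sketch of how that can look (the DataTable shape and the batch size of 500 are assumptions; InsertUser is the proc from the question):

// Hypothetical sketch: let SqlDataAdapter send the proc calls in batches.
public void BatchInsertUsers(IEnumerable<string> userCodes)
{
   DataTable table = new DataTable();
   table.Columns.Add("UserCode", typeof(string));
   foreach (string code in userCodes)
      table.Rows.Add(code);           // each row gets RowState Added

   using (SqlConnection sqlConnection = new SqlConnection(this.connectionString))
   using (SqlCommand insert = new SqlCommand("InsertUser", sqlConnection))
   using (SqlDataAdapter adapter = new SqlDataAdapter())
   {
      insert.CommandType = System.Data.CommandType.StoredProcedure;
      insert.Parameters.Add("@UserCode", SqlDbType.VarChar, 50, "UserCode");
      insert.UpdatedRowSource = UpdateRowSource.None; // required for batching
      adapter.InsertCommand = insert;
      adapter.UpdateBatchSize = 500;  // rows per round trip (0 = one single batch)
      adapter.Update(table);          // fires InsertCommand for every Added row, in batches
   }
}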

gbn