I need to create multiple records in SQL Server, each with the same value in column A but a unique value in column B. I have the values for column B in an array.

I am using VS2008, ASP.NET, C# 3.5, and SQL Server 2005.

Am I better off

Option 1.

Making one call to a stored procedure in SQL Server from the C# code, and then doing all the processing work in the stored procedure in T-SQL?

This would involve combining all the values in the C# array into one comma-delimited string, passing the string to T-SQL as a parameter, then looping to break the string apart into individual values and inserting a record for each one, all within the stored procedure.

From what I can see, this would make rollback easy if necessary, but involves very clumsy string processing in T-SQL.
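Something like this is what I have in mind for the C# side (a rough sketch; the proc name InsertRecordsFromCsv and the variables myarray and columnAValue are made up):

string csv = string.Join(",", Array.ConvertAll(myarray, x => x.ToString()));
// e.g. {1, 2, 3} becomes "1,2,3"

using (SqlConnection conn = new SqlConnection("connection string"))
using (SqlCommand cmd = new SqlCommand("InsertRecordsFromCsv", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add("@a", SqlDbType.Int).Value = columnAValue;
    cmd.Parameters.Add("@bList", SqlDbType.VarChar, -1).Value = csv; // -1 = varchar(MAX)

    conn.Open();
    cmd.ExecuteNonQuery(); // the proc splits @bList and inserts one row per value
}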

Or

Option 2.

Doing the looping in C# and passing the data as SqlParameters, one record at a time, to a stored proc that inserts each record.

I.e., foreach (int key in myarray) … insert a record

I could do this code in my sleep, but how would I be able to roll back if something happened in the middle of processing? And should I do the looping within a single Connection.Open and Connection.Close?

Anyone have any other options for doing this?

+7  A: 

This topic is extensively covered in Erland Sommarskog's article: Arrays and lists in SQL 2005

Remus Rusanu
I was sort of hoping for an easy answer. Oh well. Thanks for the link.
Lill Lansey
Erland's article has been THE resource on the topic for years now, and is a must-read if you venture into that space.
Remus Rusanu
The *real* easy answer would be: upgrade to SQL Server 2008 and use table-valued parameters :-)
marc_s
+2  A: 

Both options have their advantages (option 1 is a single round-trip; option 2 doesn't use hokey string processing), but I would likely end up going with option 2. Option 1 suffers from the size limits of varchar (8,000 characters unless you use varchar(MAX), and I have no idea how a terribly long comma-delimited varchar(MAX) string would perform).

As far as rollback, yes, just do all of the operations on a single open connection and use a SqlTransaction object.

For example...

using(SqlConnection conn = new SqlConnection("connection string"))
{
    conn.Open();

    using(SqlTransaction trans = conn.BeginTransaction())
    {
        try
        {
            using(SqlCommand cmd = new SqlCommand("command text", conn, trans))
            {
                SqlParameter aParam = new SqlParameter("a", SqlDbType.Int);
                SqlParameter bParam = new SqlParameter("b", SqlDbType.Int);

                cmd.Parameters.Add(aParam);
                cmd.Parameters.Add(bParam);

                aParam.Value = 1;

                foreach(int value in bValues)
                {
                    bParam.Value = value;

                    cmd.ExecuteNonQuery();
                }
            }

            trans.Commit();
        }
        catch
        {
            trans.Rollback();

            throw; // so the exception can propagate up
        }
    }
}
Adam Robinson
+1 I would use option 2 as well.
Jab
@Jab: Thanks, but you didn't actually +1 me ;)
Adam Robinson
That starts to get expensive if the array "B" (as mentioned in the question) is very big. Not that I know of a better solution, but it's probably something to keep in mind.
AllenG
+2  A: 

The easiest way to implement this is Option 1: passing the array as a delimited string. I used to do this in the pre-SQL 2005 days in conjunction with this TSQL Split Function, passing the array using "|" as the delimiter.

These days, I serialize the array into XML and then insert the contents into a table variable for processing, using the sp_xml_preparedocument stored procedure.
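For example, building and passing the XML from C# might look something like this (a rough sketch; InsertRecordsFromXml is a made-up proc name that would shred @bXml with sp_xml_preparedocument/OPENXML, and myArray and columnAValue are assumed variables):

StringBuilder sb = new StringBuilder("<ids>");
foreach (int value in myArray)
{
    sb.AppendFormat("<id value=\"{0}\"/>", value); // e.g. <id value="42"/>
}
sb.Append("</ids>");

using (SqlConnection conn = new SqlConnection("connection string"))
using (SqlCommand cmd = new SqlCommand("InsertRecordsFromXml", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add("@a", SqlDbType.Int).Value = columnAValue;
    cmd.Parameters.Add("@bXml", SqlDbType.Xml).Value = sb.ToString();

    conn.Open();
    cmd.ExecuteNonQuery();
}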

I wouldn't use option 2, since it makes multiple calls to the database.

Jose Basilio
Passing a delimited string can be troublesome if the upper limit on the size of the string can't easily be defined.
Yoopergeek
+1 - I've used the split function to pass arrays to my stored procedures. Although, I can't wait to upgrade to SQL2008 and use XML instead.
ichiban
You're still using sp_xml_preparedocument on SQL Server 2005, when it supports XQuery etc.?
gbn
sp_xml_preparedocument is available in SQL 2005 as well as 2008.
Scott Ivey
That looping split function is dog slow; use the numbers-table method from: http://www.sommarskog.se/arrays-in-sql-2005.html#tblnum
KM
+2  A: 

Not sure if this fits your situation perfectly, but many times, when we need to pass an N-sized array of data into a stored procedure, we'll use a temp-table trick. Something along the lines of:

using (SqlConnection connection = new SqlConnection(connectionstring)) {
   connection.Open();

   using (SqlCommand command = connection.CreateCommand()) {
      // #foo is scoped to this connection, so the stored proc invoked
      // below (on the same connection) can see and use it.
      command.CommandText = "CREATE TABLE #foo (myvalue [INT]) ";
      command.CommandType = CommandType.Text;

      command.ExecuteNonQuery(); // create the temp table

      // Parameterize the insert rather than concatenating values.
      command.CommandText = "INSERT INTO #foo ([myvalue]) VALUES (@value) ";
      SqlParameter valueParam = command.Parameters.Add("@value", SqlDbType.Int);

      foreach (int value in myValuesList) {
         valueParam.Value = value;

         command.ExecuteNonQuery();
      }

      command.Parameters.Clear(); // @value isn't a parameter of the proc
      command.CommandType = CommandType.StoredProcedure;
      command.CommandText = "StoredProcThatUsesFoo";

      // fill in any other parameters

      command.ExecuteNonQuery();
   }
}
Yoopergeek
+1  A: 

If you want to do multiple inserts in a loop in C#, look at TransactionScope. It lets you wrap multiple calls to the stored proc in a transaction with rollback capability. Another option is to pass your array as XML, and in the stored proc shred that XML into a temp table to use in your proc.
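A minimal TransactionScope sketch (requires a reference to System.Transactions; the proc name InsertRecord and the variables myArray and columnAValue are made up):

using (TransactionScope scope = new TransactionScope())
using (SqlConnection conn = new SqlConnection("connection string"))
{
    conn.Open(); // the connection auto-enlists in the ambient transaction

    using (SqlCommand cmd = new SqlCommand("InsertRecord", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        SqlParameter aParam = cmd.Parameters.Add("@a", SqlDbType.Int);
        SqlParameter bParam = cmd.Parameters.Add("@b", SqlDbType.Int);
        aParam.Value = columnAValue;

        foreach (int value in myArray)
        {
            bParam.Value = value;
            cmd.ExecuteNonQuery();
        }
    }

    // If Complete() is never reached (say, an exception is thrown),
    // the transaction rolls back when the scope is disposed.
    scope.Complete();
}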

One last thing you should do is add table-valued parameters to your wish-list of reasons to upgrade to the next version of SQL Server. As that wish-list grows, your justification for spending the money to upgrade gets a bit easier to make.
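If and when you do upgrade, the table-valued-parameter version would look something like this (a sketch, assuming you've created a table type such as dbo.IntList and a proc InsertRecords that takes it READONLY):

// Assumes on the server: CREATE TYPE dbo.IntList AS TABLE (Value INT);
DataTable table = new DataTable();
table.Columns.Add("Value", typeof(int));
foreach (int value in myArray)
{
    table.Rows.Add(value);
}

using (SqlConnection conn = new SqlConnection("connection string"))
using (SqlCommand cmd = new SqlCommand("InsertRecords", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add("@a", SqlDbType.Int).Value = columnAValue;

    SqlParameter tvp = cmd.Parameters.Add("@bValues", SqlDbType.Structured);
    tvp.TypeName = "dbo.IntList";
    tvp.Value = table;

    conn.Open();
    cmd.ExecuteNonQuery(); // one round-trip; the proc does a set-based INSERT ... SELECT
}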

Scott Ivey
+1 my thought exactly - upgrade to SQL Server 2008 and be done with this problem once and for all! :-)
marc_s