views: 162
answers: 4
I have a customized data import executable in .NET 3.5 that uses SqlBulkCopy to do faster inserts on large amounts of data. The app takes an input file, massages the data, and bulk uploads it into a SQL Server 2000 database. It was written by a consultant who built it against a SQL Server 2008 database environment. Could that environment difference be causing this? SQL Server 2000 does have the bcp utility, which is what bulk copy is based on. When we ran the app, it triggered a deadlock error.

Error details: Transaction (Process ID 58) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.

I've tried numerous ways to resolve it, like temporarily setting the connection string variable MultipleActiveResultSets=true, which wasn't ideal, but it still gives a deadlock error. I also made sure it wasn't a connection timeout problem.
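For reference, the connection string tweak looked roughly like this (server and database names are placeholders):

    // Placeholder names; roughly what was tried.
    var connectionString =
        "Data Source=myServer;Initial Catalog=myDatabase;" +
        "Integrated Security=SSPI;MultipleActiveResultSets=True";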

Here's the function. Any advice?

    /// <summary>
    /// Bulk inserts the contents of a DataTable into the destination table.
    /// </summary>
    public void BulkInsert(string destinationTableName, DataTable dataTable)
    {
        SqlBulkCopy bulkCopy;

        // Reuse the ambient transaction when one exists; otherwise let
        // SqlBulkCopy wrap the copy in its own internal transaction.
        if (this.Transaction != null)
        {
            bulkCopy = new SqlBulkCopy(
                this.Connection,
                SqlBulkCopyOptions.TableLock,
                this.Transaction);
        }
        else
        {
            bulkCopy = new SqlBulkCopy(
                this.Connection.ConnectionString,
                SqlBulkCopyOptions.TableLock | SqlBulkCopyOptions.UseInternalTransaction);
        }

        // Source and destination columns share the same names.
        bulkCopy.ColumnMappings.Add("FeeScheduleID", "FeeScheduleID");
        bulkCopy.ColumnMappings.Add("ProcedureID", "ProcedureID");
        bulkCopy.ColumnMappings.Add("AltCode", "AltCode");
        bulkCopy.ColumnMappings.Add("AltDescription", "AltDescription");
        bulkCopy.ColumnMappings.Add("Fee", "Fee");
        bulkCopy.ColumnMappings.Add("Discount", "Discount");
        bulkCopy.ColumnMappings.Add("Comment", "Comment");
        bulkCopy.ColumnMappings.Add("Description", "Description");

        // The entire table is sent to the server as a single batch.
        bulkCopy.BatchSize = dataTable.Rows.Count;
        bulkCopy.DestinationTableName = destinationTableName;
        bulkCopy.WriteToServer(dataTable);

        bulkCopy.Close();
    }
A: 

I use BCP quite regularly, and I haven't ever seen a case where the BatchSize was set to anything other than a typical value of 1000.

This field is not intended to represent the entire row count as shown in your code but to represent manageable data chunks to be sent to the server during the copy, sort of like an IP packet size.

You might try changing this value to 1000 instead of the entire table.
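
Something along these lines, against the bulkCopy object from your code (BulkCopyTimeout is just an optional extra):

    // Send the rows in fixed-size chunks instead of one giant batch.
    bulkCopy.BatchSize = 1000;         // commit every 1,000 rows
    bulkCopy.BulkCopyTimeout = 0;      // optional: lift the 30-second default timeout
    bulkCopy.WriteToServer(dataTable);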

You might also want to look at the process/lock manager panes in the SQL Enterprise Manager or SQL Management Studio (depending on your client tool version) and see what the process is doing in terms of locks.

Bill
That BatchSize is actually set by a parameter in the app.config. We massage the flat file data into a temp table because we needed corresponding values from other tables for each record; the size is determined by a variable in the config (see the sketch below). I've tested the batch size at varying values, from 50 to 50000 or so.
stevenjmyu
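
A minimal sketch of the config-driven wiring described above; the "BulkCopyBatchSize" key name is an assumption, not something from the post:

    using System.Configuration;

    // Read the batch size from app.config, falling back to 1000
    // when the key is missing or malformed.
    static int GetConfiguredBatchSize()
    {
        int batchSize;
        string raw = ConfigurationManager.AppSettings["BulkCopyBatchSize"];
        return int.TryParse(raw, out batchSize) ? batchSize : 1000;
    }
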
But your code segment above shows it being explicitly set in code to the DataTable.Rows.Count result, so where does this config file value come into play? I don't see it in your code.
Bill
A: 

Are these inserts being done concurrently? There is a known issue with SqlBulkCopy when doing concurrent inserts with the TABLOCK hint on a table with a clustered index; it can cause a deadlock. See the following:

http://msdn.microsoft.com/en-us/library/ms186341(SQL.90).aspx
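
If concurrent loads ever do come into play, the workaround is to drop the TableLock option. A sketch, where connection and transaction stand in for whatever the caller holds:

    // Row-level locks avoid the documented TABLOCK deadlock that can hit
    // parallel bulk loads into a table with a clustered index.
    var bulkCopy = new SqlBulkCopy(
        connection,
        SqlBulkCopyOptions.Default,   // no TableLock
        transaction);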

Garett
No, it's not doing a parallel import. The guy who wrote it was able to run it locally against a made-up database, except that it was 2008 and didn't have a lot of data. We tried setting SqlBulkCopyOptions.Default, which takes row locks instead, but we're still seeing the deadlock.
stevenjmyu
A: 

Multiple Active Result Sets is irrelevant for inserts - I don't even think SQL Server 2000 supports it, since it was added in SQL Server 2005.

SQL Server 2000 does not have as sophisticated lock escalation as later versions - I expect that's what you are seeing. I assume the consultant had no workload other than the bulk copy on the destination table, while your application has other activity on the destination table besides the bulk insert.

I would consider doing your bulk insert into a staging table first (so there's no chance of deadlocks there) and then running as efficient an insert/update query as possible (possibly in many small batches) in a native SQL stored procedure.
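
A minimal sketch of that approach; the staging table and stored procedure names (dbo.FeeSchedule_Staging, dbo.usp_MergeFeeSchedule) are hypothetical:

    using System.Data;
    using System.Data.SqlClient;

    public void BulkInsertViaStaging(string connectionString, DataTable dataTable)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            // Step 1: load into a staging table nothing else touches,
            // so the bulk copy itself cannot deadlock.
            using (var bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = "dbo.FeeSchedule_Staging";
                bulkCopy.BatchSize = 1000;
                bulkCopy.WriteToServer(dataTable);
            }

            // Step 2: a native SQL stored procedure merges the staging
            // rows into the live table, ideally in small batches.
            using (var merge = new SqlCommand("dbo.usp_MergeFeeSchedule", connection))
            {
                merge.CommandType = CommandType.StoredProcedure;
                merge.ExecuteNonQuery();
            }
        }
    }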

Cade Roux
A: 

I was finally able to get a local copy of our production database (~50 gigs) to test the application against. Turns out the deadlocking was strictly an environment issue. Thanks, fellas.

stevenjmyu
Congratulations! Glad to hear you were able to get it working.
Garett
So, what was the environment issue? How did you resolve this?
skimania