sqlbulkcopy

SqlBulkCopy causes Deadlock on SQL Server 2000.

I have a customized data import executable in .NET 3.5 which uses SqlBulkCopy to do faster inserts on large amounts of data. The app takes an input file, massages the data and bulk uploads it into a SQL Server 2000 database. It was written by a consultant who built it against a SQL Server 2008 database environment. Would that env ...

Questions About SqlBulkCopy

Hi, I am wondering how I can do a mass insert and bulk copy at the same time? I have 2 tables that should be affected by the bulk copy, as they both depend on each other. So I want it that if, while inserting into table 1, a record fails, it gets rolled back and table 2 never gets updated. Also if table 1 inserts fine and in table 2 an update fails tab...
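The usual answer here is to run both bulk copies inside one SqlTransaction, so a failure in either table rolls back both. A minimal sketch, assuming pre-built DataTables and hypothetical table names `dbo.Table1` / `dbo.Table2`:

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch: both bulk loads share one transaction, so a failure in either
// table leaves the database untouched. Table names are hypothetical.
static void BulkLoadBoth(string connectionString, DataTable dt1, DataTable dt2)
{
    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();
        using (var tx = conn.BeginTransaction())
        {
            try
            {
                using (var bcp1 = new SqlBulkCopy(conn, SqlBulkCopyOptions.Default, tx))
                {
                    bcp1.DestinationTableName = "dbo.Table1";
                    bcp1.WriteToServer(dt1);
                }
                using (var bcp2 = new SqlBulkCopy(conn, SqlBulkCopyOptions.Default, tx))
                {
                    bcp2.DestinationTableName = "dbo.Table2";
                    bcp2.WriteToServer(dt2);
                }
                tx.Commit();   // both loads succeeded
            }
            catch
            {
                tx.Rollback(); // either load failed: neither table changes
                throw;
            }
        }
    }
}
```

Note the `SqlBulkCopy(SqlConnection, SqlBulkCopyOptions, SqlTransaction)` constructor: the external transaction must be passed explicitly, otherwise each bulk copy commits on its own.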

How to insert a null value for a numeric field of a DataTable in C#?

Consider my dynamically generated DataTable contains the following fields: Id, Name, Mob1, Mob2. If my DataTable has this, it gets inserted successfully: Id Name Mob1 Mob2 1 acp 9994564564 9568848526 But when it is like this, it fails saying: Id Name Mob1 Mob2 1 acp 9994564564 The given value of type String from...
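This error usually means the missing cell was filled with an empty string instead of `DBNull.Value`. A minimal sketch (column names taken from the question, types assumed):

```csharp
using System;
using System.Data;

// Sketch: a numeric DataTable column must carry DBNull.Value, not "",
// for SqlBulkCopy to map it to a NULL numeric column on the server.
var table = new DataTable();
table.Columns.Add("Id", typeof(int));
table.Columns.Add("Name", typeof(string));
table.Columns.Add("Mob1", typeof(long));
table.Columns.Add("Mob2", typeof(long)); // assumed nullable on the server

// Missing value: pass DBNull.Value instead of an empty string, so no
// string-to-number conversion is attempted during WriteToServer.
string rawMob2 = "";
object mob2 = string.IsNullOrEmpty(rawMob2)
    ? (object)DBNull.Value
    : long.Parse(rawMob2);
table.Rows.Add(1, "acp", 9994564564L, mob2);

Console.WriteLine(table.Rows[0].IsNull("Mob2")); // True
```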

Why this strange behavior of SqlBulkCopy in an ASP.NET website running under IIS?

I'm using SqlClient.SqlBulkCopy to try and bulk copy a CSV file into a database. I am getting the following error after calling the WriteToServer method: "The given value of type String from the data source cannot be converted to type decimal of the specified target column." Here is my code, dt.Columns.Add("IsDeleted", t...

Possible to get primary key IDs back after a SqlBulkCopy?

Hi, I am using C# with SqlBulkCopy. I have a problem though. I need to do a mass insert into one table, then another mass insert into another table. These 2 have a PK/FK relationship. Table A Field1 - PK, auto incrementing (easy to do with SqlBulkCopy, as it's straightforward) Table B Field1 - PK/FK - This field makes the relationship and is a...
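SqlBulkCopy does not return the generated identities. One common workaround, assuming Table A has a unique column that is also present in the source data (here called `NaturalKey`; all names hypothetical), is to bulk load the parent table, read the identities back keyed on that column, then stamp them into the child rows before bulk loading Table B:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// Sketch: recover identities after a bulk load by joining back on a
// unique business key. Table and column names are hypothetical.
static Dictionary<string, int> LoadParentsAndMapIds(SqlConnection conn, DataTable parents)
{
    using (var bcp = new SqlBulkCopy(conn))
    {
        bcp.DestinationTableName = "dbo.TableA";
        bcp.WriteToServer(parents);
    }

    var idByKey = new Dictionary<string, int>();
    // In practice you would filter this to the batch just loaded.
    using (var cmd = new SqlCommand("SELECT Field1, NaturalKey FROM dbo.TableA", conn))
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
            idByKey[reader.GetString(1)] = reader.GetInt32(0);
    }
    return idByKey; // fill Table B's FK column from this before its bulk load
}
```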

How to find the offending column? Can't convert from String to Int32

Hi, I am using SqlBulkCopy. So I made a DataTable and specified its columns, then added rows to the DataTable and then tried to insert it. System.InvalidOperationException was unhandled by user code Message=The given value of type String from the data source cannot be converted to type int of the specified target I keep gett...
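Since the exception does not name the column, one option is to pre-scan the DataTable yourself and report the first cell that cannot convert to the destination type. A sketch, where the expected destination types are supplied by the caller (the helper name is hypothetical):

```csharp
using System;
using System.Collections.Generic;
using System.Data;

// Sketch: check each cell against the type the destination column expects
// and collect a message for every cell that fails to convert.
static List<string> FindBadCells(DataTable table, IDictionary<string, Type> destinationTypes)
{
    var problems = new List<string>();
    for (int r = 0; r < table.Rows.Count; r++)
    {
        foreach (DataColumn col in table.Columns)
        {
            if (!destinationTypes.TryGetValue(col.ColumnName, out Type destType))
                continue; // column not checked
            object value = table.Rows[r][col];
            if (value == DBNull.Value)
                continue;
            try
            {
                Convert.ChangeType(value, destType);
            }
            catch
            {
                problems.Add($"Row {r}, column '{col.ColumnName}': " +
                             $"'{value}' does not convert to {destType.Name}");
            }
        }
    }
    return problems;
}
```

Running this before `WriteToServer` names the row and column, which the SqlBulkCopy exception does not.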

How to use SqlBulkCopyColumnMappingCollection?

Hi, I want to make one SqlBulkCopy method that I can use for all my bulk inserts by passing specific data in through the parameters. Now I need to do mapping on some of them. I don't know how to make a SqlBulkCopyColumnMappingCollection, since my plan was to pass the mapping collection in and use it. However I don't know how to m...
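The catch is that SqlBulkCopyColumnMappingCollection has no public constructor; the only usable instance is the one exposed by `SqlBulkCopy.ColumnMappings`. So instead of passing the collection in, pass plain (source, destination) pairs and apply them inside the shared helper. A sketch, with hypothetical names:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// Sketch: a reusable bulk-insert helper that takes mappings as simple
// string pairs and copies them into the SqlBulkCopy instance's own
// ColumnMappings collection.
static void BulkInsert(string connectionString, DataTable data,
                       string destinationTable,
                       IEnumerable<KeyValuePair<string, string>> mappings)
{
    using (var bcp = new SqlBulkCopy(connectionString))
    {
        bcp.DestinationTableName = destinationTable;
        foreach (var m in mappings)
            bcp.ColumnMappings.Add(m.Key, m.Value); // source -> destination
        bcp.WriteToServer(data);
    }
}
```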

Fast insert of relational (normalized) data tables into SQL Server 2008

Hello all, I'm trying to find a better and faster way to insert a pretty massive amount of data (~50K rows) than the LINQ I'm using now. The data I'm trying to write to a local db is a list of ORM-mapped data serialized and received from WCF. I'm keen on using SqlBulkCopy, but the problem is that the tables are normalized and are a...

Effective way to keep SqlDecimal precision compatible for SqlBulkCopy

I've encountered the following InvalidOperationException when trying to insert a value of 91 into a Numeric(19,4) column of an MS SQL table. {"The given value of type SqlDecimal from the data source cannot be converted to type decimal of the specified target column."} Inner Exception: {"Parameter value '91.0000' is out of range."} I've dis...
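One remedy is to re-scale each SqlDecimal to the destination column's declared precision and scale before it goes into the DataTable, using the built-in `SqlDecimal.ConvertToPrecScale`. A sketch for the Numeric(19,4) case from the question:

```csharp
using System.Data.SqlTypes;

// Sketch: SqlBulkCopy can reject a SqlDecimal whose precision/scale don't
// match the destination column. Adjusting the value to the column's
// declared Numeric(19,4) shape before adding it to the DataTable avoids
// the conversion error.
SqlDecimal raw = new SqlDecimal(91.0000m);
SqlDecimal fitted = SqlDecimal.ConvertToPrecScale(raw, 19, 4);
// `fitted` now carries the destination's precision and scale and can be
// placed in the DataTable cell instead of `raw`.
```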

Profile selected functions with the VS profiler

I am profiling my code using instrumentation, but it takes a lot of time: about 6-7 minutes to run and then a further 10-20 minutes to analyse. One of the major bottlenecks in performance is SqlBulkCopy, and I wish to optimize the parameters I use for it. The data I run it on can't be generated easily and I wish to test it only in real en...

Import CSV into SQL Server with bulk insert - with child entities?

I need to import 100,000 rows from a CSV file into a SQL Server database. I use CsvReader to read the file. The file contains flat data and child entity data like so: No Name Age ... Address1 Address2 ... AddressN 1 Alex 20 London Paris 2 Brian 30 New York Records will be inserted into the main Clients table and child...

Error handling with SqlBulkCopy - could it be any harder?

Running very low on ideas here. I've got a case where I'm using SqlBulkCopy to pump data into a DB, and about halfway through I run into different exceptions (primary key violations, index violations, etc.). I've confirmed that the violations are in fact real and need to be corrected in the data. What's infuriating, though, is that if I ...
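One way to make failures less all-or-nothing is to commit in small batches: with `SqlBulkCopyOptions.UseInternalTransaction` each batch commits or rolls back on its own, so a violation only aborts the batch containing it and the earlier batches survive. A sketch with hypothetical names:

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch: small, independently committed batches isolate failures to the
// batch containing the bad rows, narrowing where to look for them.
static void BulkLoadInBatches(string connectionString, DataTable dataTable)
{
    using (var bcp = new SqlBulkCopy(connectionString,
                                     SqlBulkCopyOptions.UseInternalTransaction))
    {
        bcp.DestinationTableName = "dbo.Target"; // hypothetical
        bcp.BatchSize = 1000; // smaller batches = finer failure isolation
        bcp.WriteToServer(dataTable);
    }
}
```

From there you can re-run the failing slice with a smaller BatchSize (down to 1) to pinpoint the offending rows.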

SqlBulkCopy with Byte[] DataTable column error

Hello, I have a strongly typed DataSet containing a DataTable with one of the columns as a byte[] column that I'm trying to insert into a binary(4) database table field. I'm able to set the byte[] column values without any problems, but I receive the following exception when I run SqlBulkCopy on the DataTable: "The given value of type ...

How do I make SqlBulkCopy work with MS Enterprise Library?

I've got some code which uses SqlBulkCopy. And now we're refactoring our code to use Enterprise Library database functions instead of the standard ones. The question is how I can instantiate SqlBulkCopy: it accepts a SqlConnection, and I only have a DbConnection. var bulkCopy = new SqlBulkCopy(connection) // here connection is SqlConnection { ...
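When the configured Enterprise Library provider is System.Data.SqlClient, the DbConnection it returns is in fact a SqlConnection underneath, so a guarded downcast is enough. A sketch (the `database.CreateConnection()` call is stubbed out here):

```csharp
using System;
using System.Data.Common;
using System.Data.SqlClient;

// Sketch: downcast the provider-neutral DbConnection to SqlConnection,
// guarding against a non-SQL-Server provider being configured.
DbConnection dbConnection = /* database.CreateConnection() */ new SqlConnection();
var sqlConnection = dbConnection as SqlConnection;
if (sqlConnection == null)
    throw new InvalidOperationException(
        "SqlBulkCopy requires a SQL Server connection.");

using (var bulkCopy = new SqlBulkCopy(sqlConnection))
{
    // configure DestinationTableName, ColumnMappings, etc.
}
```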

SqlBulkCopy memory management

I'm using SqlBulkCopy to copy some data tables into a database table; however, because the size of the files I'm copying sometimes runs in excess of 600 MB, I keep running out of memory. I'm hoping to get some advice about managing the table size before I commit it to the database, so I can free up some memory to continue writing. Here ar...
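Rather than accumulating one giant DataTable, one approach is to buffer a fixed number of rows, write them, and `Clear()` the table so the written rows can be garbage collected. A sketch with hypothetical names and chunk size:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// Sketch: flush the DataTable in fixed-size chunks instead of building the
// whole 600 MB in memory. ChunkSize and table names are hypothetical.
static void BulkLoadInChunks(SqlConnection conn, DataTable buffer,
                             IEnumerable<object[]> sourceRows)
{
    const int ChunkSize = 50000;
    using (var bcp = new SqlBulkCopy(conn))
    {
        bcp.DestinationTableName = "dbo.Target";
        foreach (object[] row in sourceRows)
        {
            buffer.Rows.Add(row);
            if (buffer.Rows.Count >= ChunkSize)
            {
                bcp.WriteToServer(buffer);
                buffer.Clear(); // free the rows already sent
            }
        }
        if (buffer.Rows.Count > 0)
            bcp.WriteToServer(buffer); // remainder
    }
}
```

If the source already exposes an IDataReader, passing it straight to `WriteToServer(reader)` avoids buffering altogether.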

How to fix this stored procedure problem?

Hi, I have 2 tables. The following are just stripped-down versions of these tables. TableA Id <pk> incrementing Name varchar(50) TableB TableAId <pk> non-incrementing Name varchar(50) Now these tables have a relationship to each other. Scenario: User 1 comes to my site and does some actions (in this case adds rows to Table A). So I...

XSD: type of column x in table y is too small to hold data

I am generating an XSD file based on the columns in my XML. I give them all the type "xs:string". Then I try to import the file into my database using .NET with SQL bulk import, but some fields are too small. I get the message, "type of column x in table y is too small to hold data". What type should I use for a large amount of text (s...

Skip some columns in SqlBulkCopy

I'm using SqlBulkCopy against two SQL Server 2008 instances with different sets of columns (going to move some data from a prod server to dev). So I want to skip some columns that don't exist yet / haven't been removed yet. How can I do that? Some trick with ColumnMappings? Edit: currently I do this: DataTable table = new DataTable(); using (var adapter = new SqlDataAdap...
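Yes, ColumnMappings is the trick: as soon as you add any explicit mappings, SqlBulkCopy copies only the mapped columns, so mapping just the intersection of the two schemas skips everything else. A sketch (`sharedColumns` and the table name are hypothetical):

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch: explicit mappings for the shared columns only; unmapped columns
// on either side are simply ignored by the bulk copy.
static void CopySharedColumns(string destConnectionString, DataTable source,
                              string destinationTable, string[] sharedColumns)
{
    using (var bcp = new SqlBulkCopy(destConnectionString))
    {
        bcp.DestinationTableName = destinationTable;
        foreach (var name in sharedColumns)
            bcp.ColumnMappings.Add(name, name); // same name on both sides
        bcp.WriteToServer(source);
    }
}
```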

Validating data type before importing data from Excel through SqlBulkCopy

In my ASP.NET website I have functionality where the user can import data from Excel. I am using SqlBulkCopy to implement it. I have an INSTEAD OF INSERT trigger (to check for duplicates) on the table into which the data is being imported. Following are 2 issues I have. Question 1: When the Excel file contains only one record which is duplic...
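One detail worth checking first: by default SqlBulkCopy does not fire triggers on the destination table at all, so an INSTEAD OF INSERT trigger is silently bypassed unless you opt in. A sketch with hypothetical names:

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch: SqlBulkCopyOptions.FireTriggers makes the server honour the
// destination table's triggers during the bulk load.
static void ImportExcelData(string connectionString, DataTable excelData)
{
    using (var bcp = new SqlBulkCopy(connectionString,
                                     SqlBulkCopyOptions.FireTriggers))
    {
        bcp.DestinationTableName = "dbo.ImportTarget"; // hypothetical
        bcp.WriteToServer(excelData);
    }
}
```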

Can I use SqlBulkCopy to copy data within the same server?

Can I use SqlBulkCopy to copy data within the same server? ...
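Yes: nothing requires the source and destination to be different servers (or even different databases). A sketch that reads with a plain SqlDataReader and streams straight into SqlBulkCopy (connection string and table names are hypothetical):

```csharp
using System.Data.SqlClient;

// Sketch: same-server copy by streaming a reader into a bulk copy.
string connectionString =
    "Server=.;Database=MyDb;Integrated Security=true;"; // hypothetical

using (var source = new SqlConnection(connectionString))
using (var dest = new SqlConnection(connectionString)) // same server
{
    source.Open();
    dest.Open();
    using (var cmd = new SqlCommand("SELECT * FROM dbo.SourceTable", source))
    using (var reader = cmd.ExecuteReader())
    using (var bcp = new SqlBulkCopy(dest))
    {
        bcp.DestinationTableName = "dbo.DestinationTable";
        bcp.WriteToServer(reader); // streams rows without buffering them all
    }
}
```

For a simple same-database copy, a plain `INSERT INTO ... SELECT ...` on the server is usually faster still, since the data never leaves the server.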