bulkinsert

Generate XSD from SQL Server database in order to import XML data

Is there a tool to generate an XSD schema from a SQL Server database? The XSD would be used for importing XML data into the database with BULK INSERT or bcp...
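
One option is to have SQL Server emit the schema itself via the XMLSCHEMA directive of FOR XML (SQL Server 2005 and later). A minimal sketch, with dbo.Customers standing in for a real table:

-- Emits an inline XSD describing the rowset, followed by one sample row.
SELECT TOP (1) *
FROM dbo.Customers
FOR XML AUTO, ELEMENTS, XMLSCHEMA('urn:example:customers');

Note that BULK INSERT and bcp themselves consume format files rather than XSDs, so if the plan is SQLXML Bulk Load, the generated schema will typically still need sql:relation / sql:field mapping annotations added by hand.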

Is there a way to bulk insert into two tables with FK from one to the other?

I'll give a pseudocode example of my current method, and if anyone knows of a method that doesn't work one row at a time, I'd be quite appreciative. I'm using MS SQL Server 2008. Define a cursor for the data to be inserted (about 3 million records), then loop ( insert record into table 1, use scope_identity() to get the key, insert record into tab...
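
If the parent table carries a natural/business key that is also present in the staged source rows, the whole load can be done set-based in two statements: insert the parents, then join back on that key to pick up the generated identities. A sketch with hypothetical names (dbo.Staging, dbo.Table1, dbo.Table2, BusinessKey):

-- Parent rows first; Table1.Id is assumed to be the IDENTITY column.
INSERT INTO dbo.Table1 (BusinessKey, Col1)
SELECT DISTINCT s.BusinessKey, s.Col1
FROM dbo.Staging AS s;

-- Child rows pick up the new identities by joining back on the natural key.
INSERT INTO dbo.Table2 (Table1Id, Col2)
SELECT t1.Id, s.Col2
FROM dbo.Staging AS s
JOIN dbo.Table1 AS t1
    ON t1.BusinessKey = s.BusinessKey;

If there is no natural key at all, MERGE with an OUTPUT clause is one of the few constructs that can write source columns alongside the generated identity into a mapping table.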

SharePoint Batch Insert not working

Hello all, I have a requirement in my web part wherein I have to insert a large set of items into a SharePoint list. I made use of the batch insert methodology in order to achieve this. However, it never inserts the items into the list. Here is my XML which contains the items to be inserted in the batch: <?xml version="1.0" encoding="UTF-8"?...

Custom SSIS Destination Adapter for Bulk Insert

Hi, I am trying to build a custom SSIS component which combines the functionality of the Conditional Split transformation and the OLE DB Destination with Fast Load. In theory, I want to pass a variable to my component, and that variable will control whether the component should run or not, according to its value. I am reading http://media.techtarget....

In a SQL BULK INSERT statement, can we use a relative path (files\a.txt) instead of an absolute path (c:\abc\a.txt) or a UNC path (\\abc\a.txt)?

I want to insert data from a text file, stored in CSV format, into a SQL Server table. For that, I am using the BULK INSERT statement. Now I need to specify the file name in the FROM clause. I don't want to use networked or local locations there; I want to upload my text file into the same directory as my exec...
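
As far as I know, BULK INSERT resolves the path on the database server under the SQL Server service account, so a relative path would be interpreted against the SQL Server process rather than the calling application's folder. A hedged workaround sketch: keep the base directory in a parameter or config table and build the absolute path with dynamic SQL (@BasePath and dbo.TargetTable are hypothetical):

DECLARE @BasePath NVARCHAR(260) = N'C:\jobs\imports\';  -- assumed server-side folder
DECLARE @Sql NVARCHAR(MAX) =
    N'BULK INSERT dbo.TargetTable FROM ''' + @BasePath + N'a.txt'' ' +
    N'WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'');';
EXEC sys.sp_executesql @Sql;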

Why is NHibernate refusing to batch inserts?

Using NHibernate 2.1.2.4000 against SQL Server 2008. The target table has no triggers or extraneous indexes. It is simply: create table LogEntries ( Id INT IDENTITY NOT NULL, HostName NVARCHAR(32) not null, UserName NVARCHAR(64) not null, LogName NVARCHAR(512) not null, Timestamp DATETIME not null, Level INT not null,...

Bulk Insert expected performance

Currently we're doing some bulk loading using IRowsetFastLoad and we're getting about 50,000 rows per second. My gut tells me that's low, and given the upper bound of the data sizes (around a billion rows), it would be really nice to push that 50K as high as possible. Does anyone have metrics on what we should expect from IRowsetFastLoad? ...

SQL Server Bulk Insert from Fixed-Width File - How do I get NULLs for strings?

I'm doing a bulk insert from a fixed width text file using INSERT INTO blah SELECT blah1, blah2 FROM OPENROWSET(BULK 'filename.txt', FORMATFILE='format.xml'); It's working fine, except that I want NULLs for the empty (all spaces) fields in the file. This is no problem for fields marked in the format file as SQLINT, SQLDATETIME, et...
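
One common workaround (a sketch, reusing the placeholder names from the question): leave the string columns as SQLCHAR in the format file and turn the all-spaces values into NULL in the SELECT itself:

INSERT INTO blah (blah1, blah2)
SELECT NULLIF(RTRIM(t.blah1), ''),
       NULLIF(RTRIM(t.blah2), '')
FROM OPENROWSET(BULK 'filename.txt', FORMATFILE = 'format.xml') AS t;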

GTK treeview insert/update performance problem because of sorting

I'm having performance problems when inserting many rows into a GTK treeview (using PyGTK) - or when modifying many rows. The problem is that the model seems to get resorted after each change (insert/modification). This causes the GUI to hang for multiple seconds. Leaving the model unsorted by commenting out model.set_sort_column_id(SOME...

Bulk inserts and duplicate records with LINQ to SQL

Is there a "best practice" way of handling bulk inserts (via LINQ) but discard records that may already be in the table? Or I am going to have to either do a bulk insert into an import table then delete duplicates, or insert one record at a time? 08/26/2010 - EDIT #1: I am looking at the Intersect and Except methods right now. I am ...

Script restarts executing after a while

Hi, I'm busy with a project in CakePHP where I need to parse a couple of XML files and insert the relevant data into my MySQL database. The script inserts what it should insert; that's not the problem. For example, if I parse one or two files (approx. 7,000-8,000 records), nothing goes wrong. Problems start when I parse the third or fourth XML...

Is it possible to use SQL Server bulk insert without a file?

Curious if this is possible: the app server and DB server live in different places (obviously). The app server currently generates a file for use with SQL Server bulk insert. This requires both the DB server and the app server to be able to see the location, and it makes configuration more difficult in different environments. What I'd like to...
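
If the server is SQL Server 2008 or later, one file-free option is a table-valued parameter: the app passes the rows as a parameter and no shared path is needed. A sketch with hypothetical names (dbo.RowImportType, dbo.ImportRows, dbo.TargetTable):

CREATE TYPE dbo.RowImportType AS TABLE
(
    Value1 INT           NOT NULL,
    Value2 NVARCHAR(100) NOT NULL
);
GO
CREATE PROCEDURE dbo.ImportRows
    @Rows dbo.RowImportType READONLY
AS
BEGIN
    -- Copy the streamed rows into the real table in one set-based statement.
    INSERT INTO dbo.TargetTable (Value1, Value2)
    SELECT Value1, Value2
    FROM @Rows;
END;

The other usual answer is to skip the file and the T-SQL entirely and let the application push rows over the wire with the client bulk-copy API (SqlBulkCopy in .NET), which also needs no shared location.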

Faster SQL Inserts?

Hi, I'm dealing with chunks of data that are 50k rows each. I'm inserting them into an SQL database using LINQ: for(int i=0;i<50000;i++) { DB.TableName.InsertOnSubmit ( new TableName { Value1 = Array[i,0], Value2 = Array[i,1] } ); } DB.SubmitChanges(); This takes about 6 m...

Efficient way to alter 100GB table

We have a number of databases which store tens to hundreds of gigabytes of data in one of the tables. It contains image data. The problem is that a lot of these databases were created improperly. Basically, the primary key isn't actually a primary key: they were created with a unique index on a nullable column. And some of them have an int...
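
A rough sketch of the rebuild-and-swap approach, assuming the unique (nullable) column is usable as a key and has no NULLs in practice; all names are hypothetical, and copying in keyed batches keeps each transaction and the log small:

CREATE TABLE dbo.Images_New
(
    ImageId   BIGINT         NOT NULL PRIMARY KEY,
    ImageData VARBINARY(MAX) NULL
);

-- Assumes positive key values; adjust the seed otherwise.
DECLARE @BatchSize INT = 10000, @LastId BIGINT = 0, @Rows INT = 1;
WHILE @Rows > 0
BEGIN
    INSERT INTO dbo.Images_New (ImageId, ImageData)
    SELECT TOP (@BatchSize) s.ImageId, s.ImageData
    FROM dbo.Images AS s
    WHERE s.ImageId > @LastId
    ORDER BY s.ImageId;

    SET @Rows = @@ROWCOUNT;
    SELECT @LastId = MAX(ImageId) FROM dbo.Images_New;
END;

EXEC sp_rename 'dbo.Images', 'Images_Old';
EXEC sp_rename 'dbo.Images_New', 'Images';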

Optimize massive MySQL INSERTs

Hi! I've got an application which needs to run a daily script; the daily script consists of downloading a CSV file with 1,000,000 rows and inserting those rows into a table. I host my application on DreamHost. I created a while loop that goes through all the CSV's rows and performs an INSERT query for each one. The thing is that I get...
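
The usual fix is to replace the per-row INSERT loop with a single LOAD DATA statement (or, failing that, multi-row INSERTs of a few hundred rows each). A sketch with placeholder paths and names; LOCAL is needed when the file sits on the web host rather than on the MySQL server, and shared hosts such as DreamHost may or may not allow it:

LOAD DATA LOCAL INFILE '/home/user/daily.csv'
INTO TABLE daily_rows
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(col1, col2, col3);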

What's the most efficient way to bulk-copy to SQL Server from Java?

I have data that is streamed from disk and processed in memory by a Java application, and that finally needs to be copied into SQL Server. The data can be fairly large (hence the streaming) and can require up to several hundred thousand rows to be inserted. The fastest solution seems to be using SQL Server's bulk-copy feature. However, I haven't f...

How to Bulk Insert a CSV with double quotes around all values?

I am trying to insert a .csv file into SQL Server 2008 R2. The .csv is 300+ MB, from http://ipinfodb.com/ip_database.php Complete (City), 4.0M records. Here are the first five lines, with the first line being the column headers: "ip_start";"country_code";"country_name";"region_code";"region_name";"city";"zipcode";"latitude";"longitude";"metrocode" "0";"RD"...
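
SQL Server 2008 R2 has no FIELDQUOTE option, so one common trick is to treat the quote-semicolon-quote sequence as the field terminator and quote-plus-newline as the row terminator, then strip the stray leading quote left on the first column. A sketch with an assumed staging table dbo.IpStaging whose columns mirror the header row; use '"\r\n' instead if the file has Windows line endings:

BULK INSERT dbo.IpStaging
FROM 'C:\import\ip_database.csv'
WITH (
    FIELDTERMINATOR = '";"',
    ROWTERMINATOR   = '"\n',
    FIRSTROW        = 2
);

-- The first field of each row keeps its opening quote; clean it up afterwards.
UPDATE dbo.IpStaging
SET ip_start = REPLACE(ip_start, '"', '');

The cleaner (but more work) alternative is an XML format file that spells out each field's quote-aware terminator explicitly.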

MySQL Bulk Insert of Geometry fields

I have a MySQL database that I'm trying to populate from a text file. The contents of my file look like this (just a couple of examples; there are thousands of rows): 1:GeomFromText('Polygon(0 0, 1 1, 2 2, 0 0)') 2:GeomFromText('Polygon(0 0, 1 2, 2 2, 0 0)') In my schema, the first field is an integer and the second is GEOMETRY. I try to load t...
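
LOAD DATA won't evaluate function calls that it reads out of the file, so the usual approach is to put only the WKT text in the file and apply GeomFromText in a SET clause via a user variable. A sketch assuming the file can be regenerated as plain WKT (e.g. 1:POLYGON((0 0,1 1,2 2,0 0))) and a hypothetical table shapes(id, geom):

LOAD DATA LOCAL INFILE '/path/to/polygons.txt'
INTO TABLE shapes
FIELDS TERMINATED BY ':'
LINES TERMINATED BY '\n'
(id, @wkt)
SET geom = GeomFromText(@wkt);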

Multiple rows for insert command - Apostrophe problem

I'm trying to insert multiple rows using SqlCommand from C# to SQL Server. I'm forming a simple query as below: Insert into temp(field1, field2) values (1, 'test'), (2, 'test1'), (3, 'test2') and so on, up to 100 rows. For the purposes of the example I only gave a couple of fields here, but it actually contains 25 fields, and 20 of them are strin...
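
In a T-SQL string literal an apostrophe is escaped by doubling it, so a concatenated batch has to emit values like the sketch below; building the command with SqlParameter values (or a table-valued parameter) sidesteps the escaping entirely and is also safer against SQL injection:

-- Doubled apostrophes inside the literals are what the generated SQL must contain.
INSERT INTO temp (field1, field2)
VALUES (1, 'test'),
       (2, 'O''Brien'),
       (3, 'it''s a test');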

Cannot bulk load because the file could not be opened. Operating system error code 1326 (Logon failure: unknown user name or bad password.).

Bulk upload from a CSV test file "\servername\wwwroot\Upload\LDSAgentsMap.txt" SET QUOTED_IDENTIFIER ON SET ANSI_NULLS ON GO CREATE PROCEDURE [dbo].[sp_CSVTest_BulkInsert] ( @Path NVARCHAR(128) ) AS DECLARE @Sql NVARCHAR(256) SET @Sql = 'BULK INSERT CSVTest FROM ''' + @Path + ''' WITH ( FIELDTERMINATOR = '','', ROWTERMIN...