bulkinsert

Converting data in SQL Server

I am loading in a bunch of data that contains dates formatted as shown below: 00-JAN-0000 00-FEB-0000 00-MAR-0000 00-APR-0000 00-MAY-0000 00-JUN-0000 00-JUL-0000 00-AUG-0000 00-SEP-0000 00-OCT-0000 00-NOV-0000 00-DEC-0000 SQL Server cannot automatically convert these dates because it does not recognize the month part of the date. I have appro...
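One common way to handle a `DD-MON-YYYY` layout that the database will not parse is to convert it to ISO `YYYY-MM-DD` before (or during) the load. A minimal sketch in Python, using a made-up date value since the question only shows placeholder zeros:

```python
from datetime import datetime

def to_iso(raw: str) -> str:
    """Convert a 'DD-MON-YYYY' string (e.g. '15-MAR-2021') to ISO 'YYYY-MM-DD'.

    %b matches abbreviated month names case-insensitively, so 'MAR' and
    'mar' both work in the C locale.
    """
    return datetime.strptime(raw.strip(), "%d-%b-%Y").date().isoformat()

print(to_iso("15-MAR-2021"))  # 2021-03-15
```

Once the values are ISO-formatted, SQL Server converts them unambiguously; the alternative is to bulk-load into a varchar staging column and convert inside the database.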

How to bulk insert a CSV file into SQLite C#

I have seen similar questions (1, 2), but none of them discuss how to insert CSV files into SQLite. About the only thing I could think of doing is to use a CSVDataAdapter and fill the SQLiteDataSet, then use the SQLiteDataSet to update the tables in the database: The only DataAdapter for CSV files I found is not actually available: CSV...
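The question is about C#, but the pattern that replaces a CSV DataAdapter is the same in any language: read the CSV row by row and feed the rows to a parameterized batch insert inside a single transaction. A runnable Python sketch with hypothetical table and column names (in C#, the analog is a CSV reader plus `SQLiteCommand` inside an `SQLiteTransaction`):

```python
import csv
import io
import sqlite3

# Hypothetical CSV content; in practice you would open() a file instead.
csv_text = "id,name,price\n1,apple,0.5\n2,banana,0.25\n"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER, name TEXT, price REAL)")

reader = csv.reader(io.StringIO(csv_text))
next(reader)                          # skip the header row
with conn:                            # one transaction for the whole batch
    conn.executemany(
        "INSERT INTO products (id, name, price) VALUES (?, ?, ?)", reader
    )

print(conn.execute("SELECT COUNT(*) FROM products").fetchone()[0])  # 2
```

Wrapping the whole load in one transaction matters more than anything else for SQLite: without it, every row pays its own commit.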

Bulkinsert from CSV into db (C#) -> max number of rows in a web application?

Web application - C#, .Net, SQL 2k5. I recently used bulkinsert on another application and I thought I would like to give it a try. I am going to receive a CSV file with 1000 rows, which will most likely add 500 000 (that is five hundred thousand) records in the database. I don't have any idea yet about this huge amount if it's going t...

How to compare two TXT files before sending them to SQL

I have to handle TXT data files coming from an embedded device. My problem is that the device always sends all captured data, but I want to take only the differences between two transmissions and do calculations on them. After the calculation I send the data to SQL using a bulk insert. I want to extract data which is different according to the first fil...
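If each record occupies one line, the "only what changed" extraction reduces to a set difference between the previous dump and the current one. A minimal sketch with invented sample records (assumes line-level comparison is appropriate, i.e. a record never spans lines):

```python
# Hypothetical previous capture, kept from the last transmission.
previous = {"A;100", "B;200"}

# Hypothetical current dump: the device resends everything.
current_dump = """A;100
B;200
C;300
"""

# Keep only lines not seen before, preserving file order.
new_records = [
    line for line in current_dump.splitlines() if line and line not in previous
]
print(new_records)  # ['C;300']
```

The resulting `new_records` list is what would go through the calculation step and then to the bulk insert; afterwards the current dump's lines become the next `previous` set.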

Optimize master-detail insert statements

Question: After a day of running (against nearly 1 GB of data), a set of statements is tumbling down to 40 inserts per second. I am looking to increase that by an order of magnitude or two. SQL code: The code to insert the information comes in two parts: a master record and detail records. The master record: INSERT INTO MONTH_REF (DISTRI...
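At 40 inserts per second the bottleneck is almost always per-statement round trips and per-row commits, not the inserts themselves. The usual fix for a master-detail load is: insert the master, grab its generated key, batch-insert all its details, and commit once per master (or once per N masters). A runnable analog in Python/SQLite with table names loosely modeled on the MONTH_REF snippet (all names and values are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE month_ref (id INTEGER PRIMARY KEY, district TEXT)")
conn.execute("CREATE TABLE month_detail (month_ref_id INTEGER, amount REAL)")

details = [(1.5,), (2.5,), (3.0,)]  # hypothetical detail rows for one master

with conn:  # a single transaction covers the master and all its details
    cur = conn.execute("INSERT INTO month_ref (district) VALUES (?)", ("D1",))
    master_id = cur.lastrowid          # generated key for the detail FK
    conn.executemany(
        "INSERT INTO month_detail (month_ref_id, amount) VALUES (?, ?)",
        [(master_id, amt) for (amt,) in details],
    )

print(conn.execute("SELECT COUNT(*) FROM month_detail").fetchone()[0])  # 3
```

The same shape with prepared statements and explicit transaction boundaries applies to any client library; batching the detail inserts is typically where the order-of-magnitude gain comes from.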

mongodb: insert if not exists

Hello, Every day, I receive a batch of documents (an update). What I want to do is insert each of them if it does not exist. I also want to keep track of the first time I inserted them, and the last time I saw them in an update. I don't want to have duplicate documents. I don't want to remove a document which has previously bee...
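MongoDB's native answer is an upsert: `update_one(filter, {"$set": {"last_seen": ...}, "$setOnInsert": {"first_seen": ...}}, upsert=True)` inserts the document the first time and only bumps `last_seen` afterwards. Since that needs a live server, here is the same first-seen/last-seen pattern as a runnable analog using SQLite's `ON CONFLICT` upsert (schema and dates are invented for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE docs (doc_id TEXT PRIMARY KEY, first_seen TEXT, last_seen TEXT)"
)

def seen(doc_id: str, today: str) -> None:
    # Insert on first sight; on later sights only bump last_seen.
    conn.execute(
        """INSERT INTO docs (doc_id, first_seen, last_seen) VALUES (?, ?, ?)
           ON CONFLICT(doc_id) DO UPDATE SET last_seen = excluded.last_seen""",
        (doc_id, today, today),
    )

seen("a", "2024-01-01")
seen("a", "2024-02-01")
row = conn.execute(
    "SELECT first_seen, last_seen FROM docs WHERE doc_id = 'a'"
).fetchone()
print(row)  # ('2024-01-01', '2024-02-01')
```

Nothing is ever deleted, duplicates are impossible (the key is the primary key), and `first_seen` is written exactly once.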

BULK INSERT problem in MySQL

I get an error with the following SQL command for bulk insert. BULK INSERT libra.faculty FROM 'd\:faculty.csv' WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' ); Here's the error message: ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your ...

Bulk Insert of hundreds of millions of records

What is the fastest way to insert 237 million records into a table that has rules (for distributing the data across 84 child tables)? First I tried inserts. No go. Then I tried inserts with BEGIN/COMMIT. Not nearly fast enough. Next, I tried COPY FROM, but then noticed the documentation states that the rules are ignored. (And it was hav...

Cannot bulk load. The file "c:\data.txt" does not exist.

Hi, I'm having a problem reading data from a text file into MS SQL. I created a text file in my c:\ called data.txt, but for some reason MS SQL Server cannot find the file. I get the error "Cannot bulk load. The file "c:\data.txt" does not exist." Any ideas? The data file (yes I know the data looks crappy, but in the real world that's ho...

How to use SQLAlchemy to dump an SQL file from query expressions to bulk-insert into a DBMS?

Please bear with me as I explain the problem and how I tried to solve it; my question on how to improve it is at the end. I have a 100,000 line csv file from an offline batch job and I needed to insert it into the database as its proper models. Ordinarily, if this is a fairly straightforward load, this can be trivially loaded by just ...
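The "dump an SQL file" part can be done without SQLAlchemy at all: render each row batch as a multi-row `INSERT ... VALUES` statement and write the text out for the DBMS's bulk loader. A minimal sketch with invented rows and a deliberately simplistic quoting helper (real dumps need the dialect's own literal rules, e.g. SQLAlchemy's `literal_binds` compilation):

```python
# Hypothetical rows produced by the offline batch job.
rows = [(1, "alice"), (2, "bo'b")]

def quote(v):
    # Minimal SQL literal quoting for the sketch: numbers pass through,
    # strings get single quotes with '' escaping.
    if isinstance(v, str):
        return "'" + v.replace("'", "''") + "'"
    return str(v)

values = ",\n".join(
    "(%s)" % ", ".join(quote(v) for v in row) for row in rows
)
dump = "INSERT INTO users (id, name) VALUES\n%s;\n" % values
print(dump)
```

One multi-row statement per few hundred rows keeps the file compact and imports far faster than 100,000 single-row INSERTs.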

What is the quickest way to import 60m records into SQL

I have 5-6 tables in my database that I need to populate with test data to test performance in my app. I can write code and do a bulk insert, but my prediction is that it would take nearly 3 days to run, so I assume there must be a quicker way. Any ideas? ...

Disable Primary Key and Re-Enable After SQL Bulk Insert

I am about to run a massive data insert into my DB. I have managed to work out how to enable and rebuild non-clustered indexes on my tables but I also want to disable/enable primary keys as I believe this will speed up the insertion process. NOTE: This is over numerous tables and so I assume I need some loop to get the primary key info...

Bulk Insert Multiple XML files with SSIS 2008

I have a folder with multiple XML files. I need to bulk insert each one into a table in SQL Server. I am at a complete loss as to how to get this to work, as I am new to SSIS. Currently, my SSIS package pulls the files off an FTP server and uses a command line to unzip the XML (they come as .xml.gz). This all works great, but now I'm ...

Loading large amounts of data to an Oracle SQL Database

Hey all, I was wondering if anyone had any experience with what I am about to embark on. I have several csv files which are all around a GB or so in size and I need to load them into an Oracle database. While most of my work after loading will be read-only, I will need to load updates from time to time. Basically I just need a good ...

Does anybody have any suggestions on which of these two approaches is better for large delete?

Approach #1:

    DECLARE @count int
    SET @count = 2000
    DECLARE @rowcount int
    SET @rowcount = @count
    WHILE @rowcount = @count
    BEGIN
        DELETE TOP (@count) FROM ProductOrderInfo
        WHERE ProductId = @product_id
          AND bCopied = 1
          AND FileNameCRC = @localNameCrc
        SELECT @rowcount = @@ROWCOUNT
        WAITFOR DELAY '000:00:00.400'

Approach #2:

    DECLARE @c...
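The point of Approach #1 is that each batch is its own statement, so locks are released (and the log can truncate) between batches, and the loop stops when a batch comes back short. The same loop shape, as a runnable analog in Python/SQLite with an invented table (SQLite has no `DELETE TOP`, so the batch limit goes through a rowid subquery):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (product_id INTEGER, copied INTEGER)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)", [(7, 1)] * 5000 + [(7, 0)] * 10
)
conn.commit()

BATCH = 2000
deleted = BATCH
while deleted == BATCH:  # same termination test as WHILE @rowcount = @count
    cur = conn.execute(
        """DELETE FROM orders WHERE rowid IN (
               SELECT rowid FROM orders
               WHERE product_id = ? AND copied = 1
               LIMIT ?)""",
        (7, BATCH),
    )
    deleted = cur.rowcount
    conn.commit()  # release locks between batches; a delay could go here

print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # 10
```

The per-batch commit plus optional `WAITFOR`-style delay is what lets concurrent readers and writers through during a large delete; a single monolithic DELETE would hold its locks for the whole run.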

Add data in bulk.

Hi all, I need your suggestion on this. I need to add data to a MySQL database through the admin interface. Initially I need to add data in bulk, so I thought of using a CSV upload, but how do I add images with CSV? When doing a single add I insert a name, description and an image via a form, but how do I do the same in bulk? Thanks in adva...

What is the fastest way to get a DataTable into SQL Server?

I have a DataTable in memory that I need to dump straight into a SQL Server temp table. After the data has been inserted, I transform it a little bit, and then insert a subset of those records into a permanent table. The most time-consuming part of this operation is getting the data into the temp table. Now, I have to use temp tables,...
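In C# the standard fast path is `SqlBulkCopy` with `DestinationTableName` pointed at the temp table, which streams the DataTable in one bulk operation instead of row-by-row inserts. The overall stage-then-transform flow, sketched as a runnable Python/SQLite analog with invented tables:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE perm (id INTEGER, val REAL)")
conn.execute("CREATE TEMP TABLE staging (id INTEGER, val REAL)")

# Stand-in for the in-memory DataTable.
rows = [(i, i * 0.5) for i in range(10_000)]

with conn:
    # Bulk-load the staging (temp) table in one transaction.
    conn.executemany("INSERT INTO staging VALUES (?, ?)", rows)
    # Transform/filter, then copy only the wanted subset to the permanent table.
    conn.execute("INSERT INTO perm SELECT id, val FROM staging WHERE id % 2 = 0")

print(conn.execute("SELECT COUNT(*) FROM perm").fetchone()[0])  # 5000
```

The key property is that both the staging load and the `INSERT ... SELECT` are set-based operations; nothing round-trips one row at a time.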

import bulk data into MySQL

So I'm trying to import some sales data into my MySQL database. The data is originally in the form of a raw CSV file, which my PHP application needs to first process, then save the processed sales data to the database. Initially I was doing individual INSERT queries, which I realized was incredibly inefficient (~6000 queries taking almo...
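The usual cure for thousands of single-row INSERTs is a multi-row `INSERT ... VALUES (...), (...), ...` per batch (in PHP, built the same way with PDO placeholders). A runnable Python/SQLite sketch with an invented table and a tiny batch size for the demo:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (sku TEXT, qty INTEGER)")

rows = [("a", 1), ("b", 2), ("c", 3), ("d", 4), ("e", 5)]
BATCH = 2  # tiny for the demo; hundreds or thousands per statement in practice

with conn:
    for i in range(0, len(rows), BATCH):
        chunk = rows[i:i + BATCH]
        # One placeholder tuple per row: "(?, ?), (?, ?), ..."
        placeholders = ", ".join(["(?, ?)"] * len(chunk))
        flat = [v for row in chunk for v in row]
        conn.execute("INSERT INTO sales (sku, qty) VALUES " + placeholders, flat)

print(conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0])  # 5
```

Turning ~6000 statements into a few dozen batched ones removes almost all of the per-query overhead; for raw CSV that needs no preprocessing, MySQL's `LOAD DATA INFILE` is faster still.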

How to import entities and its dependent objects via excel

We have a simple data model reflecting the following... User contains many @OneToMany addresses User contains many @OneToMany phones User contains many @OneToMany emails User belongs to an @Organization via @ManyToOne We would like the customers to capture all user information in an Excel/CSV sheet and provide it to our upload tool. ...

Bulk insert efficiency in NoSQL databases

I am developing a service that needs to perform bulk insert of tens of thousands of rows/items/objects at a time, about 20 times per second. (The NoSQL storage can be sharded, so inserts can work in parallel. The sharding strategy, and sharding in general, do not matter for this discussion.) The question is: which NoSQL products in yo...