Hi, my Windows app reads a text file and inserts it into the database. The problem is that the text file is extremely big (at least for our low-end machines). It has 100 thousand rows, and it takes a long time to write it into the database.

Can you guys suggest how I should read and write the data efficiently so that it does not hog machine memory? FYI: column delimiter: '|'; row delimiter: newline.

It has approximately 10 columns (client information: first name, last name, address, phones, emails, etc.).

CONSIDER THAT... I AM RESTRICTED FROM USING BULK COMMANDS.

+3  A: 

You don't say what kind of database you're using, but if it is SQL Server, then you should look into the BULK INSERT command or the BCP utility.
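
For example, such a BULK INSERT call for this file format might look like the sketch below (assuming C#/.NET; the table name, file path, and connection string are placeholders, and the path must be readable by the SQL Server service account):

    using System.Data.SqlClient;

    // Sketch: run BULK INSERT from client code against a staging table.
    using (var conn = new SqlConnection("Server=.;Database=Clients;Integrated Security=true"))
    {
        conn.Open();
        var cmd = new SqlCommand(
            @"BULK INSERT dbo.ClientStaging
              FROM '\\fileserver\imports\clients.txt'
              WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n')", conn);
        cmd.CommandTimeout = 0;  // a 100,000-row load can exceed the default 30 s
        cmd.ExecuteNonQuery();
    }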

Jeffrey L Whitledge
Hi, that's another problem. Earlier I was using an SP with BULK INSERT; the problem is that now I can't dump the text file on our database server because of security reasons. The whole plan is to read the text file, write it into a temp table, and call my existing SP, which will finally write it into the database using our business logic.
Novice
@Crawling - Perhaps you could talk the security folks into letting the database server access a network share that can be used for the purpose of bulk loading data. That would (likely) be way faster than individual inserts.
Jeffrey L Whitledge
A: 

Given that there is absolutely no chance of getting help from your security folks or of using BULK commands, here is the approach I would take:

  1. Make sure you read the entire text file first, before inserting into the database, to reduce the I/O.

  2. Check what indexes you have on the destination table. Can you insert into a temporary table with no indexes or dependencies, so that the individual inserts are fast? (See the sketch after this list.)

  3. Does this data need to be visible immediately after the insert? If not, you can have a scheduled job read from the temp table in step 2 and insert into the destination table (the one with indexes, foreign keys, etc.).
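
Something along these lines, for example (a C# sketch; the table, column, and procedure names are placeholders, and error handling is omitted):

    using System.Data.SqlClient;
    using System.IO;

    static void LoadFile(string path, string connStr)
    {
        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();
            using (var reader = new StreamReader(path))
            {
                const int batchSize = 1000;       // commit in chunks, not per row
                int inBatch = 0;
                SqlTransaction tx = conn.BeginTransaction();
                string line;
                while ((line = reader.ReadLine()) != null)  // one row in memory at a time
                {
                    string[] f = line.Split('|');
                    var cmd = new SqlCommand(
                        "INSERT INTO dbo.ClientStaging (FirstName, LastName, Address) " +
                        "VALUES (@fn, @ln, @addr)", conn, tx);
                    cmd.Parameters.AddWithValue("@fn", f[0]);
                    cmd.Parameters.AddWithValue("@ln", f[1]);
                    cmd.Parameters.AddWithValue("@addr", f[2]);
                    cmd.ExecuteNonQuery();

                    if (++inBatch == batchSize)
                    {
                        tx.Commit();
                        tx = conn.BeginTransaction();
                        inBatch = 0;
                    }
                }
                tx.Commit();
            }
            // Hand off to the existing business-logic SP (name is a placeholder).
            new SqlCommand("EXEC dbo.ProcessClientStaging", conn).ExecuteNonQuery();
        }
    }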

Ralph Wiggum
Who says the file will fit in memory? Attempting to make it fit could either crash the program or bog it down due to swapping.
Steven Sudit
@Steven Sudit - That is a good point! Then the approach would be to chunk the file into something that will work on the 'low-end' machines.
Ralph Wiggum
@Ralph: The only problem I am facing is reading the text file; if I read it all at once, it causes problems on low-end machines. What do you suggest I do here?
Novice
If you're reading a stream, it's already "chunked".
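For example (a minimal C# sketch; the path is a placeholder):

    using System.IO;

    // Only one line of the file is held in memory at a time.
    using (var reader = new StreamReader(@"C:\imports\clients.txt"))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            string[] fields = line.Split('|');
            // insert this one row (or buffer a small batch) here
        }
    }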
Steven Sudit
A: 

Is it possible for you to register a custom assembly in SQL Server? (I'm assuming it's SQL Server because you've already said you used BULK INSERT earlier.)

Then you can call your assembly to do (mostly) whatever you need, like getting the file from some service (or whatever your option is), parsing it, and inserting directly into the tables.

This is not an option I like, but it can be a lifesaver sometimes.
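
A rough sketch of such a CLR procedure (names are placeholders; the assembly would need EXTERNAL_ACCESS permission to read files, which is its own security conversation):

    using System.Data.SqlClient;
    using System.IO;
    using Microsoft.SqlServer.Server;

    public partial class StoredProcedures
    {
        [SqlProcedure]
        public static void ImportClients(string path)
        {
            // The context connection runs inside the caller's session.
            using (var conn = new SqlConnection("context connection=true"))
            {
                conn.Open();
                using (var reader = new StreamReader(path))
                {
                    string line;
                    while ((line = reader.ReadLine()) != null)
                    {
                        string[] f = line.Split('|');
                        var cmd = new SqlCommand(
                            "INSERT INTO dbo.ClientStaging (FirstName, LastName) " +
                            "VALUES (@fn, @ln)", conn);
                        cmd.Parameters.AddWithValue("@fn", f[0]);
                        cmd.Parameters.AddWithValue("@ln", f[1]);
                        cmd.ExecuteNonQuery();
                    }
                }
            }
        }
    }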

veljkoz
@Veljkoz: It's a long story. My app was on a web server, working as a file watcher: a file comes in, the app reads it, checks the integrity of the file, and calls a stored procedure, passing it the FTP and file-location information. The stored procedure then comes back to the web server, grabs the file using an FTP call, writes the file onto the DB server, and does the bulk insert, etc. The problem now is that the SP uses Windows FTP, which is not SSL; because of security reasons, the SP's FTP call no longer completes. So I can't do much on the DB server.
Novice
You can run FTP over SSH.
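For example, with an SFTP client library such as SSH.NET (an assumption on my part; the host, credentials, and paths are placeholders):

    using System.IO;
    using Renci.SshNet;

    // Upload the file over SFTP instead of plain FTP; the transfer is encrypted.
    using (var sftp = new SftpClient("dbserver.example.com", "user", "password"))
    {
        sftp.Connect();
        using (var file = File.OpenRead(@"C:\imports\clients.txt"))
        {
            sftp.UploadFile(file, "/imports/clients.txt");
        }
        sftp.Disconnect();
    }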
Steven Sudit