I currently have a daily process that loads a large amount of data from a TXT file into an Oracle database, using a shell script that calls SQL*Loader. I want to migrate that to a .NET service, but I don't want to rely on executing SQL*Loader from my service.

What is the best (and fastest) way to accomplish that?

+1  A: 

The ADO.NET provider for SQL Server has a SqlBulkCopy class that mimics SQL Server's BCP utility.

I don't know anything about the Oracle provider, but I would start by checking whether it offers something similar to SQL*Loader.

TGnat
A: 

I'd look at the 3rd party dotConnect libraries from DevArt (formerly CoreLab). Although I've not used their OracleLoader component specifically, I use their connection, command, datareader, and dataadapter objects daily, and have found them to be very fast indeed.

Hope that helps :o)

Andrew
+3  A: 

Load the data into a DataTable and use the OracleBulkCopy class (from the Oracle Data Provider for .NET) to load it into the database all at once. Note that this only works for inserts; you can't do updates with OracleBulkCopy.
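A rough sketch of what that might look like (untested; the connection string, delimiter, and the STAGING_TABLE/COL1/COL2 names are placeholders to replace with your own):

    // Assumes ODP.NET (Oracle.DataAccess.Client) is installed and the target
    // staging table already exists with matching columns.
    using System.Data;
    using System.IO;
    using Oracle.DataAccess.Client;

    class BulkLoad
    {
        static void Main()
        {
            // Build a DataTable whose columns match the target table.
            var table = new DataTable();
            table.Columns.Add("COL1", typeof(string));
            table.Columns.Add("COL2", typeof(string));

            foreach (var line in File.ReadAllLines(@"C:\data\daily.txt"))
                table.Rows.Add(line.Split('\t'));   // assumes tab-delimited input

            using (var conn = new OracleConnection("User Id=scott;Password=tiger;Data Source=orcl"))
            {
                conn.Open();
                var bulk = new OracleBulkCopy(conn);
                bulk.DestinationTableName = "STAGING_TABLE";
                bulk.BatchSize = 5000;         // rows sent per round trip
                bulk.WriteToServer(table);     // bulk insert, no row-by-row SQL
                bulk.Close();
            }
        }
    }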

Adam Hughes
A: 

If you have SSIS available, that would be a good tool. On the .NET side I would recommend the OracleBulkCopy class mentioned above, but SSIS or any other ETL tool would be a better choice.

Nathan
+2  A: 

I assume you don't like SQL*Loader because of its command-line interface and (somewhat) clunky control files. But that's not the only way to get at SQL*Loader's functionality.

Oracle now has something called External Tables. You can see an example here.

Simply put: you place the file in a DIRECTORY (a database object that maps to a file system directory), you define the table's columns and file format, and so on; just look at the example. All of a sudden your flat file looks like a table to Oracle.

Then you execute "INSERT INTO perm_Table SELECT * FROM external_table".

So now your .NET app just renames the file to whatever the external table points to, executes the INSERT, and then renames the file back.
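A minimal sketch of that flow from .NET, assuming the external table was created once up front (the directory, file, and table names here are invented; adapt them to your schema):

    // One-time setup on the database side, roughly:
    //   CREATE DIRECTORY data_dir AS '/u01/app/loads';
    //   CREATE TABLE ext_daily (col1 VARCHAR2(100), col2 VARCHAR2(100))
    //     ORGANIZATION EXTERNAL (
    //       TYPE ORACLE_LOADER
    //       DEFAULT DIRECTORY data_dir
    //       ACCESS PARAMETERS (RECORDS DELIMITED BY NEWLINE
    //                          FIELDS TERMINATED BY ',')
    //       LOCATION ('daily.txt'));
    using System.IO;
    using Oracle.DataAccess.Client;

    class ExternalTableLoad
    {
        static void Main()
        {
            // Rename today's file to the name the external table points to.
            File.Move("/u01/app/loads/daily_new.txt", "/u01/app/loads/daily.txt");

            using (var conn = new OracleConnection("User Id=scott;Password=tiger;Data Source=orcl"))
            {
                conn.Open();
                // The external table reads the flat file on the fly, so this one
                // statement copies every row into the permanent table.
                using (var cmd = new OracleCommand(
                    "INSERT INTO perm_table SELECT * FROM ext_daily", conn))
                {
                    cmd.ExecuteNonQuery();
                }
            }

            // Rename (or archive) the file once the load is done.
            File.Move("/u01/app/loads/daily.txt", "/u01/app/loads/daily_loaded.txt");
        }
    }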

Voila.

You have your data loaded. It's all done with SQL, much, much faster than pushing rows through ADO.NET or any other client library. No more clunky command-line interface.

A: 

I actually solved my own problem using an alternative method: I wrote a stored procedure that loads the data using the UTL_FILE package. I don't know if this is the fastest method, but it is quite fast and very flexible (I can manipulate the text data however I want while loading).
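A sketch of how such a procedure could be invoked from the .NET side; the LOAD_DAILY_FILE name and its parameter are invented placeholders, not the actual procedure, which is not shown here:

    using System.Data;
    using Oracle.DataAccess.Client;

    class UtlFileLoad
    {
        static void Main()
        {
            using (var conn = new OracleConnection("User Id=scott;Password=tiger;Data Source=orcl"))
            {
                conn.Open();
                // LOAD_DAILY_FILE stands in for a PL/SQL procedure that opens the
                // file with UTL_FILE, reads it line by line, and inserts the rows.
                using (var cmd = new OracleCommand("LOAD_DAILY_FILE", conn))
                {
                    cmd.CommandType = CommandType.StoredProcedure;
                    // UTL_FILE reads from a DIRECTORY object on the server, so only
                    // the file name (not a client-side path) is passed in.
                    cmd.Parameters.Add("p_filename", OracleDbType.Varchar2).Value = "daily.txt";
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }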

Thanks for all the replies; I just posted this to show one more alternative for people who run into the same problem I had.

Rodrigo Sieiro