views: 364
answers: 1

Hi,

I have a requirement to read an Excel sheet using ASP.NET/C# and insert all the records into a MySQL table. The sheet has around 2000 rows and 50 columns. Currently, after reading the Excel records, I insert them one by one into the MySQL table using a prepared statement, but because of the volume of data this takes around 70 seconds.

I've also thought of creating a new DataRow, assigning a value to each cell, adding the resulting DataRow to a DataTable, and finally calling DataAdapter.Update(...). But that seems complex, because with around 50 columns I'd have to assign 50 values to each DataRow.
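For what it's worth, the DataRow step doesn't have to be 50 separate assignments: DataRow.ItemArray takes a whole row of values at once. A minimal sketch, assuming the values come back from the Excel read in column order (the table and column names here are made up):

```csharp
using System;
using System.Data;

// Build the DataTable once, mirroring the target MySQL table's columns.
// (Three columns shown; yours would declare all 50.)
var table = new DataTable("my_table");
table.Columns.Add("col1", typeof(string));
table.Columns.Add("col2", typeof(int));
table.Columns.Add("col3", typeof(double));

// One Excel row's cell values, in the same order as the columns above.
object[] cells = { "abc", 42, 3.14 };

// ItemArray assigns every column in one shot instead of 50 statements.
DataRow row = table.NewRow();
row.ItemArray = cells;
table.Rows.Add(row);

Console.WriteLine(table.Rows.Count); // 1
```

That said, filling the DataTable doesn't by itself change how many round trips the DataAdapter makes to MySQL, so it mainly reduces the code, not the 70 seconds.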

Could someone please suggest an alternative that would improve the insertion performance?

Thanks

+2  A: 

MySQL's LOAD DATA INFILE is akin to MS SQL Server's BULK INSERT, and it is going to give you the best performance.

You can build the text file yourself while looping through the Excel cells, but for a bigger performance gain, try exporting the file from Excel in one shot. I'm not sure how you'd need to finesse the export file (delimiters, etc.)
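As a sketch of the first route (building the file yourself), the CSV can be written in one pass and then loaded with a single statement. Everything schema-specific here is an assumption: my_table, the file path, and the column layout are placeholders, and the commented load step assumes MySQL Connector/NET with local infile enabled on both the client connection string and the server:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

// Quote one cell the way FIELDS TERMINATED BY ',' ENCLOSED BY '"' expects;
// an embedded quote is doubled so LOAD DATA reads it back as a single quote.
static string ToCsvField(object value)
{
    string s = value?.ToString() ?? "";
    return "\"" + s.Replace("\"", "\"\"") + "\"";
}

// Dump all the Excel rows to one CSV file in a single pass.
static void WriteCsv(string path, IEnumerable<object[]> rows)
{
    File.WriteAllLines(path, rows.Select(r => string.Join(",", r.Select(ToCsvField))));
}

// Then one round trip loads the whole file (connection setup omitted; this
// assumes Connector/NET with AllowLoadLocalInfile=true in the connection string):
//
// new MySqlCommand(
//     "LOAD DATA LOCAL INFILE 'c:/temp/rows.csv' INTO TABLE my_table " +
//     "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\\n'",
//     conn).ExecuteNonQuery();
```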

EDIT:

Or, if you don't have the stomach/time to try something like this, at least pass as many VALUES rows into a single INSERT statement as you can:

INSERT INTO tbl_name (a,b,c) VALUES (1,2,3),(4,5,6),(7,8,9);

(The example is also from the MySQL site.)

I don't know whether all 2000 rows would fit in one statement (the practical limit is set by max_allowed_packet), so just assemble maybe 100 rows per INSERT and then execute it.
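A hypothetical helper for that batching, still using parameters rather than concatenating values into the SQL (the table and column names are placeholders):

```csharp
using System;
using System.Linq;

// Build one multi-row INSERT with parameter placeholders for rowCount rows.
// Table and column names are placeholders for the real schema.
static string BuildBatchInsert(string table, string[] cols, int rowCount)
{
    var valueRows = Enumerable.Range(0, rowCount).Select(r =>
        "(" + string.Join(",", cols.Select((_, i) => $"@p{r}_{i}")) + ")");
    return $"INSERT INTO {table} ({string.Join(",", cols)}) " +
           $"VALUES {string.Join(",", valueRows)}";
}

Console.WriteLine(BuildBatchInsert("tbl_name", new[] { "a", "b", "c" }, 2));
// INSERT INTO tbl_name (a,b,c) VALUES (@p0_0,@p0_1,@p0_2),(@p1_0,@p1_1,@p1_2)
```

For each 100-row batch you'd bind cmd.Parameters.AddWithValue($"@p{r}_{i}", cellValue) for every cell, then call ExecuteNonQuery() once, so you make ~20 round trips instead of 2000.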

At any rate, the one-row-per-INSERT approach is what's costing you.

LesterDove