Web application - C#, .NET, SQL Server 2005. I recently used BULK INSERT in another application and thought I would give it a try here.

I am going to receive a CSV file with 1,000 rows, which will most likely add 500,000 (that is five hundred thousand) records to the database. I have no idea yet whether an amount this large will work out well; I am afraid it will time out.

I haven't done any testing yet, but I am pretty sure it would time out eventually.

Is there a way to keep it from timing out (I don't know ... split the bulk insert into 1,000 pieces :D), or should I try something like BCP with a SQL job ...?

A: 

Actually, splitting the insert seems like a good idea. I would use bigger chunks, say 5,000 rows or maybe more.
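
For illustration, a minimal sketch of what chunked loading could look like with SqlBulkCopy, where BatchSize is the chunk size; the target table name and the already-parsed DataTable are assumptions, not anything from the asker's setup:

using System.Data;
using System.Data.SqlClient;

public static class ChunkedBulkLoader
{
    // Loads already-parsed CSV rows, sending them to the server in batches of
    // 5,000 instead of one huge batch.
    public static void Load(DataTable rows, string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (var bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = "dbo.ImportTarget"; // hypothetical target table
                bulkCopy.BatchSize = 5000;      // chunk size; tune as needed
                bulkCopy.BulkCopyTimeout = 0;   // 0 = no timeout on the copy command itself
                bulkCopy.WriteToServer(rows);
            }
        }
    }
}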

Daniel Dolz
This is what I was thinking about, but I'm not sure what that would look like in the browser. "1 down, 150 to go" :) I was thinking about splitting the file into 50 pieces of 10,000 records each: make 50 files, import one, refresh the whole screen, and show something like a "progress" view from a temp table.
Swoosh
A: 

I suggest you don't try to process the file in a single web service call.

  1. Create an upload operation that accepts the file and just saves it somewhere, plus a GetProgress operation that lets the client check the load progress and retry if necessary.
  2. Create a background process, such as a Windows service, that does the actual work (see the sketch below).

-or-

You may be able to do both in one fell swoop if you create a WF/WCF Workflow Service, which you can host directly in IIS like a web service, but which can perform multiple long-running operations.
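
To make the first option a bit more concrete, here is a rough sketch; the UploadQueue table, the drop folder, and the method names are all assumptions about one possible shape, not a prescribed API:

using System;
using System.Data.SqlClient;
using System.IO;

public class UploadService
{
    private const string ConnectionString = "..."; // placeholder connection string

    // Accept the file, save it to disk, and record a 'Pending' row that the
    // background Windows service polls for and processes.
    public Guid Upload(byte[] csvContent)
    {
        var jobId = Guid.NewGuid();
        var path = Path.Combine(@"C:\Uploads", jobId + ".csv"); // hypothetical drop folder
        File.WriteAllBytes(path, csvContent);

        using (var connection = new SqlConnection(ConnectionString))
        using (var command = new SqlCommand(
            "INSERT INTO dbo.UploadQueue (JobId, FilePath, Status) VALUES (@id, @path, 'Pending')",
            connection))
        {
            command.Parameters.AddWithValue("@id", jobId);
            command.Parameters.AddWithValue("@path", path);
            connection.Open();
            command.ExecuteNonQuery();
        }
        return jobId;
    }

    // The client polls this to check how far the background service has got.
    public string GetProgress(Guid jobId)
    {
        using (var connection = new SqlConnection(ConnectionString))
        using (var command = new SqlCommand(
            "SELECT Status FROM dbo.UploadQueue WHERE JobId = @id", connection))
        {
            command.Parameters.AddWithValue("@id", jobId);
            connection.Open();
            return (string)(command.ExecuteScalar() ?? "Unknown");
        }
    }
}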

Doobi
+1  A: 

I recently developed similar bulk-insert functionality using C# and the SqlBulkCopy class. To avoid page timeouts I did asynchronous processing using the ThreadPool (ThreadPool.QueueUserWorkItem method). The upload status is written to a log table using a separate connection object so the log entries are not lost to a transaction rollback. This status is then reported in the website on an Upload History page.
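
A minimal sketch of that approach, assuming the CSV has already been parsed into a DataTable; the target table, log table, and connection string are placeholders:

using System;
using System.Data;
using System.Data.SqlClient;
using System.Threading;

public class AsyncBulkUploader
{
    private const string ConnectionString = "..."; // placeholder connection string

    public void StartUpload(DataTable rows)
    {
        // Queue the heavy work on the ThreadPool so the page request returns
        // immediately instead of waiting (and timing out) on the insert.
        ThreadPool.QueueUserWorkItem(_ => RunBulkCopy(rows));
    }

    private void RunBulkCopy(DataTable rows)
    {
        try
        {
            using (var connection = new SqlConnection(ConnectionString))
            {
                connection.Open();
                using (var bulkCopy = new SqlBulkCopy(connection))
                {
                    bulkCopy.DestinationTableName = "dbo.ImportTarget"; // hypothetical target table
                    bulkCopy.WriteToServer(rows);
                }
            }
            WriteStatus("Completed");
        }
        catch (Exception ex)
        {
            WriteStatus("Failed: " + ex.Message);
        }
    }

    // Status rows go through their own connection so they are not rolled back
    // with the bulk load; an Upload History page can read this table.
    private void WriteStatus(string status)
    {
        using (var connection = new SqlConnection(ConnectionString))
        using (var command = new SqlCommand(
            "INSERT INTO dbo.UploadLog (LoggedAt, Status) VALUES (GETDATE(), @status)",
            connection))
        {
            command.Parameters.AddWithValue("@status", status);
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}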

The best solution would be to create a procedure that uses the BULK INSERT command. Here is sample code for the proc:

CREATE PROCEDURE [dbo].[usp_ExecuteBulkInsertTask]
(
    @dataFile              VARCHAR(255),
    @bulkInsertFormatFile  VARCHAR(255),
    @tableName             VARCHAR(255)
)
AS
BEGIN
    BEGIN TRY
        DECLARE @SQL VARCHAR(4000)

        -- Build and run a dynamic BULK INSERT against the requested table,
        -- using the supplied format file and skipping the CSV header row.
        SET @SQL = 'BULK INSERT ' + @tableName
                 + ' FROM ''' + @dataFile + ''''
                 + ' WITH (FORMATFILE=''' + @bulkInsertFormatFile + ''', FIRSTROW=2)'
        EXEC sp_sqlexec @SQL
    END TRY
    BEGIN CATCH
        -- error handling
    END CATCH
END
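
Calling the proc from C# is then a single parameterised command, something like the sketch below; the paths and target table are placeholders, and keep in mind that BULK INSERT resolves the file path on the SQL Server machine, not on the web server:

using System.Data;
using System.Data.SqlClient;

public static class BulkInsertCaller
{
    public static void Run(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.usp_ExecuteBulkInsertTask", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.CommandTimeout = 0; // no command timeout for the long-running load
            command.Parameters.AddWithValue("@dataFile", @"C:\Uploads\import.csv");             // placeholder path on the SQL Server box
            command.Parameters.AddWithValue("@bulkInsertFormatFile", @"C:\Uploads\import.fmt"); // placeholder format file
            command.Parameters.AddWithValue("@tableName", "dbo.ImportTarget");                  // placeholder target table
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}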
ARS