views: 44, answers: 3

+1  Q:

TSQL Bulk Insert

Hi all,

I have a CSV file whose field delimiter is the comma (,). My CSV files are very big, and I need to import them into a SQL Server table. The process must be automated, and it is not a one-time job.

So I use BULK INSERT to load these CSV files. But today I received a CSV file that contains a row like this:

1,12312312,HOME   ,"House, Gregory",P,NULL,NULL,NULL,NULL

The problem is that BULK INSERT splits this row incorrectly: the field "House, Gregory" ends up as two fields, '"House' and ' Gregory"'.

Is there some way to make BULK INSERT understand that the double quotes override the behaviour of the comma? When I open this CSV with Excel, it reads the field correctly as 'House, Gregory'.
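
For reference, this is roughly the kind of statement in use (the table name and file path below are placeholders, not the real ones):

    -- simplified sketch; dbo.ImportTable and the path are made up
    BULK INSERT dbo.ImportTable
    FROM '\\server\share\data.csv'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n'
    );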

+2  A: 

You need to preprocess your file; take a look at this answer:

http://stackoverflow.com/questions/782353/sql-server-bulk-insert-of-csv-file-with-inconsistent-quotes
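
If changing the file itself is not practical, a variation on the same idea is to do the preprocessing inside SQL Server: pull each raw line into a single-column staging table and split it with T-SQL string functions afterwards. A rough sketch, with made-up table and path names:

    -- Staging table that receives each CSV line unsplit (hypothetical name).
    CREATE TABLE dbo.RawCsvLine (Line NVARCHAR(4000));

    -- '\0' (the null character) is used as a field terminator that should never
    -- occur in the data, so each whole line lands in the single Line column.
    BULK INSERT dbo.RawCsvLine
    FROM '\\server\share\data.csv'
    WITH (
        FIELDTERMINATOR = '\0',
        ROWTERMINATOR = '\n'
    );

    -- The quoted "House, Gregory" field can then be handled with
    -- REPLACE/SUBSTRING before inserting into the real table.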

Michael Pakhantsov
The files are 200 MB or more each, and they are at a network location...
anderhil
So what does size have to do with it? Preprocessing is the answer if you can't change the file. Or get your provider to give you the data properly, with all the columns surrounded by quotes or with something other than a comma as the delimiter (I send back any file where they try to use that; it is simply unacceptable to send a comma-delimited file.)
HLGEM
A: 

Take a look at this article: http://www.sqlteam.com/article/using-bulk-insert-to-load-a-text-file

coderboy
+1  A: 

If every row in the file has the double quotes around that column, you can specify ," and ", as the column separators for that column using a format file, as in the sketch below.
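
For the sample row in the question, a non-XML format file along these lines is the sort of thing meant (the column names and lengths are invented, and it assumes the quoted name field is always the fourth column and is quoted in every row):

    9.0
    9
    1   SQLCHAR   0   12    ","       1   RecordId     ""
    2   SQLCHAR   0   20    ","       2   AccountNo    ""
    3   SQLCHAR   0   20    ",\""     3   RecordType   ""
    4   SQLCHAR   0   100   "\","     4   Name         ""
    5   SQLCHAR   0   5     ","       5   Flag         ""
    6   SQLCHAR   0   20    ","       6   Col6         ""
    7   SQLCHAR   0   20    ","       7   Col7         ""
    8   SQLCHAR   0   20    ","       8   Col8         ""
    9   SQLCHAR   0   20    "\r\n"    9   Col9         ""

The terminators on fields 3 and 4 consume the quotes around "House, Gregory" along with the commas, and the file is then loaded with something like BULK INSERT dbo.ImportTable FROM '\\server\share\data.csv' WITH (FORMATFILE = '\\server\share\people.fmt').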

If not, get it changed or you'll have to write some clever pre-processing routines somewhere.

The file format needs to be consistent for any of the SQL Server tools to work.

gbn