I have an 18 GB flat file (40,000,000 records) with fixed column widths (no field terminators) that I would like to read into a SQL Server 2008 R2 table. In addition to the text file with the data, I was given an Excel document with the field names and lengths. There are 270 fields with 465 total characters per record (per row). Using bcp, I have created an fmt file, which looks fine to me:
10.0
270
1    SQLCHAR   2   1    ""   1    TitleCode     SQL_Latin1_General_CP1_CI_AS
2    SQLCHAR   2   12   ""   2    FamilyID      SQL_Latin1_General_CP1_CI_AS
3    SQLCHAR   2   12   ""   3    LocationID    SQL_Latin1_General_CP1_CI_AS
etc.
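For reference, I generated the format file skeleton with a bcp command along these lines (the database and server names below are placeholders, and the exact switches are only a sketch of what I used):
REM MyDatabase and MyServer are placeholders; -T assumes a trusted connection
bcp MyDatabase.dbo.Customer2_noId format nul -n -f C:\Users\kriss\SqlScripts\Customer2_noId-n.fmt -S MyServer -T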
In SQL Server I want to use this fmt file to read the data into the table:
BULK INSERT dbo.Customer2_noId
FROM 'C:\Uploads\dataFile_MICX\dataFile_MICX_Copy.txt'
WITH (FORMATFILE = 'C:\Users\kriss\SqlScripts\Customer2_noId-n.fmt');
GO
Error Messages from SQL Server:
Msg 4866, Level 16, State 7, Line 1
The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
I have tried changing the field terminator for the last field to "\r" and to "\r\n". I have also tried adding an extra field at the bottom:
271 SQLCHAR 0 0 "\r\n" 271 dummy SQL_Latin1_General_CP1_CI_AS
Neither change makes any difference.
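To be concrete, in the terminator attempt the line for the last real field looked roughly like this (the name and length of field 270 are placeholders here, since I can't paste all 270 lines):
270  SQLCHAR   2   8    "\r\n"   270  LastFieldName   SQL_Latin1_General_CP1_CI_AS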
I have not been able to find anything online that helps. (Adding an extra blank line at the end of the fmt file is sometimes suggested, but that doesn't fix it.) I think the data file does have line terminators, because when I use the Excel data import tool I see consistent-length lines.
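One further check I can think of is to pull the first few hundred bytes of the data file into SQL Server as binary and look at what actually follows byte 465 (0x0D0A, 0x0A, or nothing), for example:
-- SINGLE_BLOB returns the whole file as one varbinary(max) column named BulkColumn;
-- the first 500 bytes are enough to see what sits at position 466 onward.
SELECT SUBSTRING(BulkColumn, 1, 500) AS FirstBytes
FROM OPENROWSET(BULK 'C:\Uploads\dataFile_MICX\dataFile_MICX_Copy.txt', SINGLE_BLOB) AS src;
GO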
Can someone help? Thanks, Kriss