Obviously I can use BCP, but here is the issue. If one of the records in a batch has an invalid date, I want to redirect it to a separate table/file/whatever, but keep the batch processing running. I don't think SSIS can be installed on the server, which would have helped.
Create a trigger that processes on INSERT. This trigger will do a validation check on your date field. If a row fails the validation, insert it into your separate table; you can then choose whether or not to let the original insert go through.
An important note: by default, triggers do not fire on bulk inserts (BCP and SSIS included). To get this to work, you'll need to specify that you want the trigger to fire, using something like:
BULK INSERT your_database.your_schema.your_table FROM your_file WITH (FIRE_TRIGGERS)
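For example, here's a minimal sketch of such a trigger. The table and column names are hypothetical, and it assumes the incoming date lands in a VARCHAR column so the row can reach the trigger before any type conversion can fail (whether an INSTEAD OF trigger fires under FIRE_TRIGGERS is worth verifying on your version):
-- dbo.Incoming is the bulk-load target; dbo.BadRows catches the rejects
CREATE TRIGGER trg_Incoming_ValidateDate
ON dbo.Incoming
INSTEAD OF INSERT
AS
BEGIN
    SET NOCOUNT ON
    -- rows with a valid date go through as normal
    INSERT INTO dbo.Incoming (theDate, somethingElse)
    SELECT theDate, somethingElse FROM inserted WHERE ISDATE(theDate) = 1
    -- rows with a bad date are redirected so the batch keeps running
    INSERT INTO dbo.BadRows (theDate, somethingElse)
    SELECT theDate, somethingElse FROM inserted WHERE ISDATE(theDate) = 0
END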
You're saying there's a column full of dates in the file, and you want that data to go into a column of type "datetime" in a table in a SQL database? And it'll blow up if one of the values from the file isn't a valid date? I just wanted to make sure I understand this right.
You could create another, temporary table in the SQL database with the same structure as the table you want the file data to end up in, but with every column of type varchar(255) or something. Sucking the data out of the file and into that table shouldn't fail, whether or not the dates are valid.
Then, in SQL, you could massage the data however you want. You could use a cursor to select all of the records from the temporary table and loop through them. For each record, you could use the T-SQL ISDATE function to conditionally insert the values from the current record into one table or another.
I'm saying, get the data into the database and then run a script like this:
-- **this is untested, there could be syntax errors**
-- if we have tables like this:
CREATE TABLE temporary (id VARCHAR(255), theDate VARCHAR(255), somethingElse VARCHAR(255))
CREATE TABLE theGood (id INT, theDate DATETIME, somethingElse VARCHAR(255))
CREATE TABLE theBad (id INT, theDate VARCHAR(255))
-- then after getting the data into [temporary], do this:
DECLARE @id VARCHAR(255)
DECLARE @theDate VARCHAR(255)
DECLARE @somethingElse VARCHAR(255)
DECLARE tempCursor CURSOR
FOR SELECT id, theDate, somethingElse FROM temporary
OPEN tempCursor
FETCH NEXT FROM tempCursor INTO @id, @theDate, @somethingElse
WHILE @@FETCH_STATUS = 0
BEGIN
    -- ISDATE returns 1 when the value converts to a valid date
    IF ISDATE(@theDate) = 1
    BEGIN
        INSERT INTO theGood (id, theDate, somethingElse)
        VALUES (CONVERT(INT, @id), CONVERT(DATETIME, @theDate), @somethingElse)
    END
    ELSE
    BEGIN
        INSERT INTO theBad (id, theDate)
        VALUES (CONVERT(INT, @id), @theDate)
    END
    FETCH NEXT FROM tempCursor INTO @id, @theDate, @somethingElse
END
CLOSE tempCursor
DEALLOCATE tempCursor
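Getting the file into [temporary] in the first place can be a plain BULK INSERT; since every column is varchar, a bad date won't make the load fail. A minimal sketch, where the file path and delimiters are placeholders you'd adjust to match your file:
BULK INSERT temporary
FROM 'C:\data\input.dat' -- hypothetical path
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')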
Yeah, if you are using DTS, you should just import into a staging table that uses varchar instead of dates and then massage the data into the proper tables afterwards.
The problem with what Matt said is that you should not use a cursor to manipulate the data afterwards, especially if you have millions of records. Cursors are extremely inefficient and should be avoided.
Use batch (set-based) processing instead, as in the sketch below.
But by all means use his idea of a staging table. I wouldn't ever consider importing directly into a production table, as too many things can happen over time to change the data in the input file and cause problems.
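For example, Matt's cursor above collapses into two set-based statements over the same hypothetical tables, which will handle millions of rows far better than row-by-row fetches:
-- route the valid rows in one pass
INSERT INTO theGood (id, theDate, somethingElse)
SELECT CONVERT(INT, id), CONVERT(DATETIME, theDate), somethingElse
FROM temporary
WHERE ISDATE(theDate) = 1
-- and the invalid rows in another
INSERT INTO theBad (id, theDate)
SELECT CONVERT(INT, id), theDate
FROM temporary
WHERE ISDATE(theDate) = 0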