This is a SQL 2000 database that I am working with.

I have what I call a staging table that is a raw data dump of data, so everything is ntext or nvarchar(255).

I need to cast/convert all of this data into the appropriate data types (i.e. int, decimal, nvarchar, etc.).

The way I was going to do this was to iterate through all records using a WHILE loop and attempt a CAST on each column of a single record during each iteration. After I visit a particular record, I flag it as processed (bit field).
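For reference, the loop described above might look something like this (the table, column, and target type names here are assumptions for illustration; a failed CAST will raise an error, which is exactly the problem being asked about):

```sql
-- Sketch of the per-record WHILE loop (names and types assumed)
DECLARE @id int
SELECT @id = MIN(id) FROM staging_table WHERE processed = 0

WHILE @id IS NOT NULL
BEGIN
    -- attempt the conversions for this one record
    INSERT INTO target_table (qty, price, name)
    SELECT CAST(qty_raw AS int),
           CAST(price_raw AS decimal(10, 2)),
           CAST(name_raw AS nvarchar(50))
    FROM staging_table
    WHERE id = @id

    -- flag the record as processed
    UPDATE staging_table SET processed = 1 WHERE id = @id

    -- move to the next unprocessed record
    SELECT @id = MIN(id) FROM staging_table
    WHERE processed = 0 AND id > @id
END
```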

But how can I log the error when/if it occurs while allowing the WHILE loop to continue?

At first I implemented this using a TRY CATCH in a local SQL 2005 instance (to get the project going) and all was working well, but I learned today that the dev & production databases that the international DBAs have set up are SQL 2000 instances, so I have to conform.

EDIT: I am using an SSIS package to populate the staging table. I see now that I must revisit that package and implement a script component to handle the conversions. Thanks guys.

EDIT: I am doing this on a record-by-record basis, not a batch insert, so the transaction idea seems like it would be feasible, but I'm not sure how to trap @@ERROR and allow the stored procedure to continue.

EDIT: I really like Guy's approach, I am going to implement it this way.

+2  A: 

What are you using to import the file? DTS has scripting abilities that can be used for data validation. If you're not using DTS, are you using a custom tool? If so, do your validation there.

But I think this is what you're looking for.
http://www.sqlteam.com/article/using-dts-to-automate-a-data-import-process

IF @@Error <> 0
 GOTO LABEL

@op
In SSIS, the "red line" (error output) from a data flow component can redirect bad rows to a separate destination or transform. I haven't played with it in a while, but I hope it helps.

jms
+1  A: 

It looks like you are doomed. See this document.

TL/DR: A data conversion error always causes the whole batch to be aborted - your sql script will not continue to execute no matter what you do. Transactions won't help. You can't check @@ERROR because execution will already have aborted.

I would first reexamine why you need a staging database full of varchar(255) columns - can whatever fills that database do the conversion?

If not, I guess you'll need to write a program/script to select from the varchar columns, convert, and insert into the prod db.

Blorgbeard
+1  A: 

Run each cast in a transaction; after each cast, check @@ERROR. If it's clear, commit and move on.
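A minimal sketch of that pattern, with assumed table and column names (note the caveat from another answer: in SQL 2000 a conversion error of sufficient severity aborts the whole batch, so the @@ERROR check may never be reached for those errors):

```sql
-- Per-record transaction with an @@ERROR check (names assumed)
BEGIN TRANSACTION

INSERT INTO target_table (price)
SELECT CAST(price_raw AS decimal(10, 2))
FROM staging_table
WHERE id = @id

IF @@ERROR <> 0
BEGIN
    ROLLBACK TRANSACTION
    -- log the failure and carry on with the next record
    INSERT INTO error_log (staging_id, message)
    VALUES (@id, 'conversion failed')
END
ELSE
    COMMIT TRANSACTION
```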

FlySwat
A: 

You could try checking for the data type before casting and actually avoid throwing errors.

You could use functions like:

ISNUMERIC - to check if the data is of a numeric type
ISDATE - to check if it can be cast to DATETIME
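For example, a guarded conversion along these lines (column names assumed; note that ISDATE and ISNUMERIC do not accept ntext, so ntext columns need converting to nvarchar first, and ISNUMERIC returns 1 for values like '$1,000' that can still fail a cast to a specific numeric type):

```sql
-- Guard each cast with ISDATE / ISNUMERIC to avoid raising errors
SELECT CASE WHEN ISDATE(order_date_raw) = 1
            THEN CAST(order_date_raw AS datetime)
            ELSE NULL
       END AS order_date,
       CASE WHEN ISNUMERIC(price_raw) = 1
            THEN CAST(price_raw AS decimal(10, 2))
            ELSE NULL
       END AS price
FROM staging_table
```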

kristof
+2  A: 

Generally I don't like "loop through the records" solutions, as they tend to be slow and you end up writing a lot of custom code.

So...

Depending on how many records are in your staging table, you could post process the data with a series of SQL statements that test the columns for correctness and mark any records that fail the test.

i.e.

-- ISDATE/ISNUMERIC do not accept ntext, so convert to nvarchar first
UPDATE staging_table
SET status_code = 'FAIL_TEST_1'
WHERE status_code IS NULL
AND ISDATE(CAST(ntext_column1 AS nvarchar(255))) = 0;

UPDATE staging_table
SET status_code = 'FAIL_TEST_2'
WHERE status_code IS NULL
AND ISNUMERIC(CAST(ntext_column2 AS nvarchar(255))) = 0;

etc...

Finally

INSERT INTO results_table ( mydate, myprice )
SELECT CAST(ntext_column1 AS nvarchar(255)) AS mydate,
       CAST(ntext_column2 AS nvarchar(255)) AS myprice
FROM staging_table
WHERE status_code IS NULL;

DELETE FROM staging_table
WHERE status_code IS NULL;

And the staging table retains all the rows that failed, which you can export and report on.

Guy
A: 

kristof, thanks for your post. I was unaware of that ISDATE function. Made my day, thanks. :)