views: 27
answers: 2
I have an ETL that imports tables from Oracle to SQL Server 2008 using the OLE DB destination with FastLoad. The data in Oracle is non-unicode, but when the table is created in SQL Server it is created with unicode datatypes. For some reason the datatypes are being forced from non-unicode to unicode. Does anyone know of a way to stop this from happening? Could it be an Oracle driver problem?

A: 

This is not an answer but it is something you might want to try. Check the value of the NLS_LANG variable in the Oracle Database you are importing to. Changing this variable before running the ETL could help you.

Check the NLS_LANG faq here: http://www.oracle.com/technology/tech/globalization/htdocs/nls_lang%20faq.htm
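As a rough sketch of the suggestion above (the character set `WE8MSWIN1252`, the helper names, and the `dtexec` command line are all illustrative assumptions, not details from this thread), setting NLS_LANG in the environment before launching the ETL process might look like:

```python
import os
import subprocess


def etl_env(nls_lang="AMERICAN_AMERICA.WE8MSWIN1252", base=None):
    """Build an environment for the ETL process with NLS_LANG set.

    WE8MSWIN1252 is only an example single-byte client character set;
    use the one that actually matches your Oracle database.
    """
    env = dict(os.environ if base is None else base)
    env["NLS_LANG"] = nls_lang
    return env


def run_etl(cmd=("dtexec", "/f", "package.dtsx")):
    # Hypothetical launcher: "dtexec" and the package path are
    # placeholders for whatever command starts your ETL.
    return subprocess.run(cmd, env=etl_env())
```

The point is that NLS_LANG is read by the Oracle client libraries at connect time, so it must be in the environment of the process that runs the extraction, not set afterwards.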

Diego
Importing TO SQL FROM Oracle.
EWizard
Still a valid answer. Oracle will convert the characters in the database to the character set of the client querying the data (because back in the old days, some clients weren't up to displaying fancy characters).
Gary
Yes, still a valid answer. I think, though, that I am going to have to take care of this on the SQL side inside SSIS. Thank you for the answer.
EWizard
A: 

I'm presuming you are using SSIS?

Guess what: SSIS wants everything to be unicode, so it assumes that all incoming data is unicode. If you don't want it to be unicode, you will need to convert each field using a Data Conversion transformation.
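To illustrate what that per-column conversion does, here is a minimal sketch (the function name and the choice of code page `cp1252` are assumptions for illustration): it takes a unicode string (SSIS type DT_WSTR) and encodes it into a single-byte code page (DT_STR), failing on characters the code page cannot represent, much like the transformation raising a conversion error.

```python
def to_non_unicode(value: str, codepage: str = "cp1252") -> bytes:
    """Emulate an SSIS DT_WSTR -> DT_STR conversion: encode unicode
    text into a single-byte code page. Characters outside the code
    page raise UnicodeEncodeError, analogous to an SSIS conversion
    error on that row."""
    return value.encode(codepage)


# Example: 'é' exists in Windows-1252, so this round-trips cleanly.
print(to_non_unicode("café"))  # b'caf\xe9'
```

In SSIS itself this is configured in the Data Conversion editor by picking an output data type of DT_STR with the destination's code page for each affected column.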

HLGEM
Yup, SSIS. The more I read, the more I am realizing that this is the case. Thanks for the info.
EWizard
Just implemented a data conversion between the sources and destinations and it performed beautifully. Thanks for the heads up.
EWizard
FYI - we let our ETL operation run last night and the performance killed us! Our old process was to move data from Oracle into unicode SQL tables, then pump the data into non-unicode SQL tables (using implicit conversion). That way only takes about 2 hours to run. The new way, using the Data Conversion transformation between the source and destination, ran for over 8 hours. Fail. Geez, SSIS is painful.
EWizard