Say the source data comes in Excel format; below is how I import it.

  1. Convert to CSV format via MS Excel
  2. Roughly find bad rows/columns by inspection
  3. Back up the table that needs to be updated in SQL Query Analyzer
  4. Truncate the table (may need to drop foreign key constraints as well)
  5. Import data from the revised CSV file in SQL Server Enterprise Manager
  6. If there's an error such as duplicates, check the original CSV and remove them
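Steps 2 and 6 above can be scripted instead of done by eye. A minimal sketch in Python (the file path, expected column count, and key column are assumptions you would adjust for your data):

```python
import csv

def find_bad_rows(path, expected_cols, key_index=0):
    """Scan a CSV file and report problem rows.

    Returns two lists of (line_number, row) pairs:
    - rows whose column count differs from expected_cols
    - rows whose key column repeats an earlier value (likely duplicates)
    """
    bad, dupes, seen = [], [], set()
    with open(path, newline="") as f:
        for lineno, row in enumerate(csv.reader(f), start=1):
            if len(row) != expected_cols:
                bad.append((lineno, row))
                continue
            key = row[key_index]
            if key in seen:
                dupes.append((lineno, row))
            seen.add(key)
    return bad, dupes
```

Running this before the import and printing both lists lets you fix every bad row in one pass through the CSV, rather than re-importing after each error.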

I was wondering how to make this procedure more efficient at every step. I have some ideas, but they're incomplete. For steps 2 & 6, scripts could check the file automatically and print out all bad rows/columns, making it easier to remove every error at once. For steps 3 & 5, is there any way to update the table automatically without manually going through the import steps?
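For steps 3-5, the backup/truncate/import sequence can be driven from a single T-SQL batch, with no wizard involved. A sketch that generates such a batch (table and file names are placeholders; `BULK INSERT` does exist in SQL Server 2000, though the field/row terminators may need adjusting for your files, and the backup table must not already exist):

```python
def build_reload_batch(table, csv_path):
    """Build a T-SQL batch that backs up, truncates, and bulk-loads a table.

    SELECT INTO creates the backup copy, and BULK INSERT replaces the manual
    Enterprise Manager import. Run the result with Query Analyzer, osql, or
    a database driver.
    """
    return "\n".join([
        # Step 3: back up the current contents into a new table.
        f"SELECT * INTO {table}_backup FROM {table};",
        # Step 4: empty the target table.
        f"TRUNCATE TABLE {table};",
        # Step 5: load the revised CSV (skip the header row).
        f"BULK INSERT {table} FROM '{csv_path}'",
        "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n', FIRSTROW = 2);",
    ])
```

If the table has foreign keys referencing it, you would still need to drop or disable those constraints around the `TRUNCATE`, as noted in step 4.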

Could the community advise, please? Thanks.

+1  A: 

I believe in SQL Server 2000 you still have DTS (Data Transformation Services) as part of Enterprise Manager. Using that, you should be able to create a workflow that performs all of these steps in sequence. I believe it can natively import Excel as well. You can run everything from SQL queries to VBScript, so there's pretty much nothing you can't do.

I used to use it for these kinds of bucket-brigade jobs all the time.

zenWeasel
Can you please suggest some keywords for me to search? Thanks.
Stan
This MSDN link has a pretty good overview of what DTS can do and how to create a package using the Enterprise Manager GUI: http://msdn.microsoft.com/en-us/library/aa176528%28v=SQL.80%29.aspx. Using the Import Wizard, you can directly import Excel as a data source. It can do almost everything you need except the manual inspection. The only thing I ever found counter-intuitive is that these jobs need to be saved on a SQL Server, not to a file on your local machine or anywhere else, so you can't check them into source control. Or at least I couldn't.
zenWeasel
