I am writing a stored procedure to populate a database from CSV files. There are about 150 tables, and there will be one CSV file per table.
There is some added complication in that data needs to be validated against a validation table on import.
I am currently writing a single stored procedure that will take the CSV file name, table name, and validation table name as parameters, and then handle the import from any CSV file to any table. Obviously this saves having 150 stored procedures, one for each table.
But my question is: is this the right way to do it? This generic procedure involves quite a bit of dynamic SQL, and from what I have been reading, writing stored procedures where you don't know until run time which tables you will be using, what their schemas are, and so on, is considered bad practice. It will also require a global temporary table. Then again, writing 150 stored procedures that all do a similar thing is not a great solution either, and it will only confuse the person doing the import, who would have to know which stored procedure to call for which table.
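To make the question concrete, here is a rough sketch of the kind of generic procedure I mean (SQL Server T-SQL; the procedure name, staging table, and the `Id` column used for the validation join are placeholders, not my actual schema):

```sql
-- Sketch of a single generic CSV import procedure.
-- All object names here are illustrative placeholders.
CREATE PROCEDURE dbo.usp_ImportCsv
    @FileName        NVARCHAR(260),  -- full path to the CSV file
    @TableName       SYSNAME,        -- destination table
    @ValidationTable SYSNAME         -- table to validate against
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @sql NVARCHAR(MAX);

    -- Stage the file into a global temp table via dynamic SQL, since
    -- BULK INSERT does not accept a variable as its target or source.
    -- (Creating ##ImportStaging with the right columns for each table
    -- is itself more dynamic SQL, omitted here.)
    SET @sql = N'BULK INSERT ##ImportStaging
                 FROM ''' + @FileName + N'''
                 WITH (FIELDTERMINATOR = '','',
                       ROWTERMINATOR   = ''\n'',
                       FIRSTROW        = 2);';
    EXEC sys.sp_executesql @sql;

    -- Copy across only the rows that pass validation; QUOTENAME guards
    -- the object names used in the dynamic statement.
    SET @sql = N'INSERT INTO ' + QUOTENAME(@TableName) + N'
                 SELECT s.*
                 FROM ##ImportStaging AS s
                 WHERE EXISTS (SELECT 1
                               FROM ' + QUOTENAME(@ValidationTable) + N' AS v
                               WHERE v.Id = s.Id);';
    EXEC sys.sp_executesql @sql;
END;
```

This is the shape of the thing, not working code: the real version would also need per-table column lists and validation rules, which is exactly where the dynamic SQL starts to pile up.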
This import will be a one-time thing, so performance is not much of an issue, and the only people running it will have full permissions on the DB, so security is not much of an issue either. I'd like to hear your thoughts on what the best method is here.