I want to store test results from a .CSV file in a central database. The database has a schema that does not match the CSV file. Ideally, I would just pass the contents of the CSV file as a string directly to the server and have a function that transforms the CSV into a table that can be joined and used in an INSERT.
Here's a simplified example:
INSERT INTO MyTable
SELECT *
FROM parse_csv(csv_data) t
INNER JOIN Categories c
ON t.CatID = c.CatID
Note: csv_data is a varchar containing the contents of a CSV file. Also note that the INNER JOIN is done directly on the output of parse_csv. What's a quick way to write the parse_csv() function above?
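To make the intent concrete, here's a naive sketch of what I imagine parse_csv could look like, in case there's a better or built-in way. The column layout is made up for illustration, and it doesn't handle quoted fields or embedded commas:
-- Hypothetical sketch only: assumes each line is "CatID,Result", no quoting,
-- no embedded commas, no header row (and PostgreSQL 9.2+ for named parameters).
CREATE OR REPLACE FUNCTION parse_csv(csv_data text)
RETURNS TABLE (CatID integer, Result text)
LANGUAGE sql
AS $$
    SELECT split_part(line, ',', 1)::integer,   -- first field  -> CatID
           split_part(line, ',', 2)             -- second field -> Result
    FROM regexp_split_to_table(csv_data, E'\r?\n') AS line
    WHERE line <> '';                           -- skip trailing blank lines
$$;
With something along those lines the INSERT above would work as written, but I suspect there's a more robust way to do the CSV parsing itself.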
I'm also thinking about passing an XML string and using something like OPENXML, but I can't find that function in Postgres. See this question.
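If the XML route is viable, I gather xmltable (PostgreSQL 10+) could play roughly the same role as OPENXML; here's a rough sketch with made-up element and column names:
-- Rough sketch only; the element names and inline document are invented.
SELECT t.CatID, t.Result
FROM xmltable('/rows/row'
              PASSING '<rows>
                         <row><CatID>1</CatID><Result>pass</Result></row>
                         <row><CatID>2</CatID><Result>fail</Result></row>
                       </rows>'::xml
              COLUMNS CatID  integer PATH 'CatID',
                      Result text    PATH 'Result') AS t;
That would still mean converting the CSV to XML on the client first, though.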
I'd rather not parse the CSV file in application code and call INSERT a thousand times; that would be a lot of unnecessary round trips. I know about COPY, but I have several CSV files of the same format to load, and I don't want those loads to collide.
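For reference, the COPY-based workaround I know of would be a per-session temporary staging table (all names below are illustrative), which keeps parallel loads from colliding but adds a staging step per file:
-- Illustrative only: a temporary table is private to the session, so
-- concurrent loads of same-format files cannot collide with each other.
CREATE TEMP TABLE staging_results (CatID integer, Result text);

-- Load the file from the client, e.g. with psql:
--   \copy staging_results FROM 'results.csv' CSV
-- or COPY staging_results FROM STDIN CSV through the driver.

INSERT INTO MyTable
SELECT *                      -- column list simplified, as in the example above
FROM staging_results s
INNER JOIN Categories c
    ON s.CatID = c.CatID;
That works, but I'd still prefer a function that takes the CSV string directly.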
I'm open to any suggestions or tips.