Hi -- I have a set of tables where each table's ID key is auto-incrementing. Now my users want to import a bunch of external data, which uses its own keys, from a single CSV flat file.

I think the best way to do it is to lock the tables, grab the next value of the auto_increment column for each table, and then, for each row I insert, swap out the existing primary key for the new auto-increment value. This is going to get really ugly, since the data to import spans multiple tables with multiple key relationships.

Any other ideas on how to handle a problem like this? Stored procedure maybe? Thanks for any ideas.

+1  A: 

Add a new field called old_key to the table you want to import into, load the CSVs into temporary tables, and then do something like:

insert into mytable (firstname, lastname, old_key)
select fname, lname, old_key from temp_table;

The auto-increment mytable_id field will work as normal, and you will now have both the old_key and a new mytable_id that you can use to map the other foreign key relations. You can drop the temporary tables after the work is done.
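For the child tables you can then rebuild the foreign keys with a join on old_key. A rough sketch, assuming a hypothetical child table myorders whose staging copy temp_orders still carries the old parent key in a column called old_parent_key:

insert into myorders (order_date, mytable_id)
select t.order_date, m.mytable_id
from temp_orders t
join mytable m on m.old_key = t.old_parent_key;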

Andrew
A: 

If you are not worried about which primary key values the imported rows end up with, just import all columns except the primary key column from the CSV file. MySQL will assign values starting from the current auto_increment value.
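A minimal sketch of that, assuming a file data.csv whose first column is the old primary key (read into a user variable and discarded) and whose remaining columns are first and last name:

load data infile 'data.csv'
into table mytable
fields terminated by ','
(@old_id, firstname, lastname);
-- mytable_id is not listed, so MySQL fills it from the auto_increment counter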

If the import is going to break foreign key relationships, then there is something to worry about. In that case it's better to import the items row by row, capture the generated insert ID, and update the dependent records as necessary.
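A minimal sketch of that pattern, assuming hypothetical parent/child tables mytable and myorders; LAST_INSERT_ID() returns the auto_increment value generated by the previous insert on the current connection:

insert into mytable (firstname, lastname) values ('Ann', 'Smith');
set @new_parent_id = last_insert_id();

insert into myorders (order_date, mytable_id)
values ('2010-01-01', @new_parent_id);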

Alternatively, you can tweak the values in the data before import. For example, if your existing table's auto_increment is at 5019 and the primary keys in your CSV data start at 1, add 5018 to all PK values before importing, and add the same number to the FK values in your CSV. Hope you can figure out the necessary arithmetic.
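A rough sketch of that offset approach, assuming the CSV has first been loaded into hypothetical staging tables temp_table and temp_orders and that 5018 is the chosen offset:

update temp_table set old_key = old_key + 5018;
update temp_orders set old_parent_key = old_parent_key + 5018;

-- import with explicit primary keys; auto_increment moves past the highest inserted value
insert into mytable (mytable_id, firstname, lastname)
select old_key, fname, lname from temp_table;

insert into myorders (order_date, mytable_id)
select order_date, old_parent_key from temp_orders;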

Salman A