views: 19
answers: 1

Hi, I run a Python program that inserts many new entries into a database, and these new entries are spread across multiple tables.

What is the quickest way to load them into the database? Right now I'm using something like this:

LOAD DATA LOCAL INFILE 'file.csv'
INTO TABLE tbl
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(col1, col2, col3, ...);

But this only loads one table at a time, and I don't want to repeat it by hand for every table.
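One thing I could do is drive the per-table loads from Python in a loop; something like this rough sketch (it assumes mysql-connector-python with local_infile enabled on the server, and the file/table/column names are only placeholders):

import mysql.connector

# placeholder (csv file, table, column list) entries, one per target table
jobs = [
    ("users.csv",  "users",  "col1, col2, col3"),
    ("orders.csv", "orders", "col1, col2"),
]

conn = mysql.connector.connect(
    host="localhost", user="app", password="secret",
    database="mydb", allow_local_infile=True,
)
cur = conn.cursor()
for path, table, cols in jobs:
    # one LOAD DATA statement per table, all over the same connection
    cur.execute(
        f"LOAD DATA LOCAL INFILE '{path}' INTO TABLE {table} "
        f"FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n' ({cols})"
    )
conn.commit()
cur.close()
conn.close()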

I found http://forge.mysql.com/worklog/task.php?id=875, but I'm not quite sure whether it has already been implemented or not.

Either way, is this the best solution, or would you suggest a different approach? My goal is speed, speed and speed; I can't afford to wait a couple of minutes for the program to finish.

A: 

I don't think LOAD DATA can do that, but why not duplicate the table after importing?
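For example, something along these lines (just a rough sketch; all file, table and column names are placeholders): import the file into one table as you already do, then duplicate that table.

import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="app", password="secret",
    database="mydb", allow_local_infile=True,
)
cur = conn.cursor()
# import the file into the first table, as before
cur.execute(
    "LOAD DATA LOCAL INFILE 'file.csv' INTO TABLE table_a "
    "FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n' (col1, col2, col3)"
)
# then duplicate that table: copy its structure and its rows
cur.execute("CREATE TABLE table_b LIKE table_a")
cur.execute("INSERT INTO table_b SELECT * FROM table_a")
conn.commit()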


Pekka

nabizan: The new data I'm trying to write into the MySQL tables isn't already in any table, so duplicating a table won't work.