I have about 50 txt files, each with around 8 columns and 80,000 records. I have written a script in MATLAB that reads the files one by one and then inserts each into MySQL using a single INSERT statement per file. However, doing so takes a huge amount of time (even for a single file!). I have also tried using phpMyAdmin, which says the file is too big to upload (around 8 MB). So please suggest a workaround: what is the usual, ideal way to import large txt files? Also, how much time should it normally take to import one such file?
views: 118
answers: 4
+1
A:
Try mysqlimport.
Also, a MyISAM table type will import faster; it depends on whether you need transactional support (InnoDB).
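For example, something like this (a minimal sketch; the database name, credentials, and tab delimiters are assumptions, and mysqlimport derives the table name from the file's base name):

    # loads /path/to/mytable.txt into table `mytable` of database `mydb`
    mysqlimport --local --user=user --password \
        --fields-terminated-by='\t' mydb /path/to/mytable.txt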
Joelio
2010-08-19 14:17:54
Start a transaction before inserting and InnoDB catches up with MyISAM (don't forget to commit afterwards ;) )
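Something like this (a minimal sketch; table and values are placeholders):

    -- one transaction around all the inserts avoids a log flush per statement
    START TRANSACTION;
    INSERT INTO mytable VALUES (1, 2, 3);
    INSERT INTO mytable VALUES (4, 5, 6);
    -- ... remaining rows ...
    COMMIT;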
Mchl
2010-08-19 14:28:49
A:
Use multiple INSERTs. It isn't that expensive, since you still connect to MySQL only once. You can then use a loop to update your values (-> your SQL) and execute the query (I think it's "fetch" in MATLAB). A sketch is below.
(see http://spx.arizona.edu/Projects/Database/Accessing%20mySQL%20through%20Matlab.pdf for an example)
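Roughly like this (a minimal sketch, assuming the Database Toolbox with a JDBC MySQL driver; the database, table, and file names are placeholders):

    % connect once, then send batched multi-row INSERTs in a loop
    conn = database('mydb', 'user', 'password', ...
        'com.mysql.jdbc.Driver', 'jdbc:mysql://localhost/mydb');
    data  = dlmread('datafile.txt');        % 80,000 x 8 numeric matrix
    batch = 1000;                           % rows per INSERT statement
    for i = 1:batch:size(data, 1)
        rows = data(i:min(i+batch-1, end), :);
        % build "(v1,...,v8),(v1,...,v8),..." for this batch of rows
        vals = sprintf('(%g,%g,%g,%g,%g,%g,%g,%g),', rows');
        exec(conn, ['INSERT INTO mytable VALUES ' vals(1:end-1)]);
    end
    close(conn);

Batching roughly 1000 rows per statement keeps each query well under the default packet size while cutting the per-statement overhead by three orders of magnitude.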
Fabian Fritz
2010-08-19 14:25:57
Yeah, LOAD DATA seems to work pretty quickly. I wonder why a single INSERT for multiple records takes so much time!
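For reference, something like this (a minimal sketch; file path, table name, and tab delimiters are assumptions):

    LOAD DATA LOCAL INFILE '/path/to/datafile.txt'
    INTO TABLE mytable
    FIELDS TERMINATED BY '\t'
    LINES TERMINATED BY '\n';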
Gaurav
2010-08-19 14:37:49
+1
A:
Everything you need is in this manual: http://dev.mysql.com/doc/refman/5.0/en/insert-speed.html
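The manual's main tip, besides LOAD DATA, is to batch many rows into one INSERT statement, e.g. (table and values are placeholders):

    -- one statement, many rows: much less parsing and network overhead
    INSERT INTO mytable (col1, col2) VALUES
        (1, 'a'),
        (2, 'b'),
        (3, 'c');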
tszming
2010-08-19 14:42:16