views: 118
answers: 4

I have about 50 txt files, each with around 8 columns and 80,000 records. I have written a MATLAB script that reads the files one by one and inserts each into MySQL using a single INSERT statement per file. However, this takes a huge amount of time (even for a single file!). I have also tried phpMyAdmin, which says the file is too big to upload (around 8 MB). So please suggest a workaround: what is the ideal way to import large txt files, and how long should it normally take to import a single such file?

+1  A: 

Try mysqlimport

Also, the MyISAM table type will import faster; it depends on whether you need transactional support (InnoDB).
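
mysqlimport loads each file into the table whose name matches the file name (so data.txt goes into a table called data, which must already exist). As a rough sketch, assuming tab-separated columns and placeholder names for the user, database and file path, an invocation could look like:

    mysqlimport --local --fields-terminated-by='\t' -u username -p dbname /path/to/data.txt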

Joelio
Start a transaction before inserting and InnoDB catches up with MyISAM (don't forget to commit afterwards ;) )
Mchl
A: 

Use multiple INSERTs. It isn't that expensive, since you still connect to MySQL only once. You can then use a loop to build your values (-> your SQL) and execute the query (I think it's "fetch" in MATLAB); see the sketch after the link below.

(see http://spx.arizona.edu/Projects/Database/Accessing%20mySQL%20through%20Matlab.pdf for example)
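
As a rough illustration of what such a loop can build up (the table and column names here are invented), each INSERT can also carry a batch of rows instead of a single row, which cuts down on round trips and statement parsing:

    INSERT INTO measurements (c1, c2, c3, c4, c5, c6, c7, c8) VALUES
      (1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0),
      (1.1, 2.1, 3.1, 4.1, 5.1, 6.1, 7.1, 8.1);
    -- ... repeat with the next batch of (say) 1000 rows per statement ...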

Fabian Fritz
+1  A: 

Use LOAD DATA INFILE syntax.

http://dev.mysql.com/doc/refman/5.1/en/load-data.html
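
A minimal sketch, assuming tab-separated columns and a hypothetical table named measurements (adjust the delimiters and column list to match your files):

    LOAD DATA LOCAL INFILE '/path/to/file01.txt'
    INTO TABLE measurements
    FIELDS TERMINATED BY '\t'
    LINES TERMINATED BY '\n'
    (c1, c2, c3, c4, c5, c6, c7, c8);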

Mchl
Yeah, LOAD DATA seems to work pretty quickly. I wonder why a single INSERT for multiple records takes so much time!
Gaurav
+1  A: 

Everything you need is in this manual: http://dev.mysql.com/doc/refman/5.0/en/insert-speed.html
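
For instance, two widely used techniques, sketched here against a hypothetical table named measurements: defer MyISAM's non-unique index maintenance until the load is done, and for InnoDB avoid committing (and flushing the log) after every statement:

    -- MyISAM: rebuild non-unique indexes once, after the bulk load
    ALTER TABLE measurements DISABLE KEYS;
    -- ... bulk INSERTs / LOAD DATA here ...
    ALTER TABLE measurements ENABLE KEYS;

    -- InnoDB: group the inserts into a single transaction
    SET autocommit = 0;
    -- ... bulk INSERTs here ...
    COMMIT;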

tszming