views: 59

answers: 2
I've got the following table schema:

CREATE TABLE `alexa` (
  `id` int(10) unsigned NOT NULL,
  `rank` int(10) unsigned NOT NULL,
  `domain` varchar(63) NOT NULL,
  `domainStatus` varchar(6) DEFAULT NULL,
  PRIMARY KEY (`rank`),
  KEY `domain` (`domain`),
  KEY `id` (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=latin1

It takes several minutes to import the data. To me that seems rather slow, as we're only talking about a million rows of data.

What can I do to optimize the insertion of this data? (I'm already using DISABLE KEYS, as sketched below.)
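By "disable keys" I mean the usual MyISAM pattern, roughly like this (the bulk inserts in between are omitted here):

ALTER TABLE alexa DISABLE KEYS;
-- ... bulk inserts run here ...
ALTER TABLE alexa ENABLE KEYS;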

G-Man

A: 

Use LOAD DATA INFILE or the equivalent command-line tool mysqlimport. This can be around 20 times faster than doing the same load with individual INSERT statements.
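For example, assuming your data sits in a tab-separated file (the file name, delimiters, and column order below are assumptions; adjust them to match your actual dump):

-- LOCAL requires local_infile to be enabled; drop it if the file lives on the server.
LOAD DATA LOCAL INFILE '/tmp/alexa.tsv'
INTO TABLE alexa
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
(`id`, `rank`, `domain`, `domainStatus`);

mysqlimport is essentially a command-line wrapper around the same statement; it takes the table name from the file name, so a file called alexa.tsv would be loaded into the alexa table.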

You can also read Speed of INSERT Statements in the MySQL manual. This has a lot of tips for improving bulk insert performance.
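One of the simplest tips from that page, if you stick with plain INSERTs, is to send many rows per statement instead of one statement per row. The rows below are made-up values just to show the shape:

-- Batching many rows per INSERT reduces per-statement overhead;
-- keep each statement under max_allowed_packet.
INSERT INTO alexa (`id`, `rank`, `domain`, `domainStatus`) VALUES
  (1, 1, 'google.com', 'active'),
  (2, 2, 'yahoo.com', 'active'),
  (3, 3, 'msn.com', NULL);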

You don't say much about how you are currently inserting data (besides disabling keys), so it's hard to recommend anything more specific about how you can improve that. For example, what coding language are you using? Are you using prepared queries?
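For what it's worth, a prepared statement at the SQL level looks roughly like the sketch below; from client code you would prepare once and execute it in a loop over your data. The values here are only placeholders:

PREPARE ins FROM
  'INSERT INTO alexa (`id`, `rank`, `domain`, `domainStatus`) VALUES (?, ?, ?, ?)';
SET @id = 1, @rank = 1, @domain = 'example.com', @status = NULL;
EXECUTE ins USING @id, @rank, @domain, @status;
DEALLOCATE PREPARE ins;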

Bill Karwin
I'm using PHP and no, not using prepared queries. Dunno why I didn't think of that LOL.
GeoffreyF67
I have a script that loads the Stack Overflow XML data dump into a MySQL database, using PHP and PDO with prepared queries. It does take a few minutes to load. I do use mysqlimport for the normalized tags-per-post data, and that part runs incredibly quickly.
Bill Karwin
A: 

Ensure the binary log is off for your session if you don't use replication (setting sql_log_bin requires the SUPER privilege and only affects the current connection):

SET SESSION sql_log_bin = OFF;
Jürgen Hötzel