Executing thousands of independent INSERTs is going to run very slowly. Since MySQL is a multi-user, transactional database, there is a lot more going on during each query than there is in Access. Each INSERT operation on the server goes through the following steps:
1. Decode and parse the query.
2. Open the table for writing, establishing locks if necessary.
3. Insert the new row.
4. Update the indexes, if necessary.
5. Save the table to disk.
Step 3 is the only one that must happen once per row; ideally, you want to perform steps 1, 2, 4, and 5 as few times as possible. MySQL has some features that will help you.
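For comparison, here is the slow pattern to avoid: with autocommit on (the MySQL default), every one of these statements pays the full cost of all five steps. The table and values are just placeholders reused from the examples below.
INSERT INTO mytable VALUES (100, 'Joe', 34);
INSERT INTO mytable VALUES (101, 'Fran', 23);
# ...thousands more, each one parsed, locked, indexed, and flushed on its own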
PREPARE your queries
By preparing a query that you are going to use repeatedly, you perform step 1 just once. Here's how:
PREPARE myinsert FROM 'INSERT INTO mytable VALUES (?, ?, ?)';
SET @id = 100;
SET @name = 'Joe';
SET @age = 34;
EXECUTE myinsert USING @id, @name, @age;
SET @id = 101;
SET @name = 'Fran';
SET @age = 23;
EXECUTE myinsert USING @id, @name, @age;
# Repeat until done
DEALLOCATE PREPARE myinsert;
Read more about PREPARE at the mysql.com site.
Use transactions
Combine several (or several hundred) INSERTs into a single transaction. The server then has to do steps 2, 4, and 5 only once per transaction instead of once per INSERT.
PREPARE myinsert FROM 'INSERT INTO mytable VALUES (?, ?, ?)';
START TRANSACTION;
SET @id = 100;
SET @name = 'Joe';
SET @age = 34;
EXECUTE myinsert USING @id, @name, @age;
SET @id = 101;
SET @name = 'Fran';
SET @age = 23;
EXECUTE myinsert USING @id, @name, @age;
# Repeat a hundred times
COMMIT;
START TRANSACTION;
SET ...
SET ...
EXECUTE ...;
# Repeat a hundred times
COMMIT;
# Repeat transactions until done
DEALLOCATE PREPARE myinsert;
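A related trick: if you turn autocommit off for the session, nothing is made permanent until you explicitly COMMIT, which has the same batching effect as wrapping the INSERTs in START TRANSACTION ... COMMIT. A minimal sketch, reusing the prepared statement from above:
SET autocommit = 0;
EXECUTE myinsert USING @id, @name, @age;
# Repeat for each row
COMMIT; # all the inserts become permanent together here
SET autocommit = 1; # restore the default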
Read more about transactions.
Load your table from a file
Instead of doing thousands of INSERTs, do one batch upload of your data. If your data is in a delimited file, such as a CSV file, use the LOAD DATA statement.
LOAD DATA LOCAL INFILE '/full/path/to/file/mydata.csv'
INTO TABLE `mytable`
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n';
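That statement expects mydata.csv to hold one row per line, with the fields in column order, something like this (sample values borrowed from the earlier examples):
100,Joe,34
101,Fran,23
Note that the LOCAL keyword tells the server to read the file from the client machine rather than from the server's own filesystem, and it only works if the local_infile option is enabled on both the client and the server.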
Here's a link to the MySQL page on LOAD DATA.