I'm writing a PHP script that imports data from tables in two different databases into a third one. I've got it working OK with sample data, but now I've moved to data that more closely resembles its final use: 25+ million records per table, and growing daily. Obviously, efficiency is a bit of a concern.
Here's how it currently works. I copy the table structure, adding a couple of extra fields to maintain key integrity:
other1.someTable (field1, field2, field3) Pk = [field1, field2]
other2.someTable (field1, field2, field3) Pk = [field1, field2]
mydb.someTable (id, source, field1, field2, field3)
Pk = id, Unique key = [source, field1, field2]
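To make that concrete, here's a rough sketch of the destination table as I've described it. The column types are just placeholders, since they depend on the source tables:
CREATE TABLE mydb.someTable (
    id     INT UNSIGNED NOT NULL AUTO_INCREMENT,  -- surrogate primary key
    source TINYINT UNSIGNED NOT NULL,             -- 1 = other1, 2 = other2
    field1 INT NOT NULL,                          -- placeholder type
    field2 INT NOT NULL,                          -- placeholder type
    field3 VARCHAR(255),                          -- placeholder type
    PRIMARY KEY (id),
    UNIQUE KEY uq_source_field1_field2 (source, field1, field2)
);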
And here's the SQL. It has an ON DUPLICATE KEY UPDATE clause because this import needs to be run regularly, updating the data in "mydb". Thankfully, records won't be deleted from the "other" databases (I think!).
-- Import from the first source; VALUES() refers to the incoming value, so existing rows get refreshed.
INSERT INTO mydb.someTable (source, field1, field2, field3)
SELECT 1, field1, field2, field3 FROM other1.someTable
ON DUPLICATE KEY UPDATE field1 = VALUES(field1), field2 = VALUES(field2), field3 = VALUES(field3);

-- Import from the second source.
INSERT INTO mydb.someTable (source, field1, field2, field3)
SELECT 2, field1, field2, field3 FROM other2.someTable
ON DUPLICATE KEY UPDATE field1 = VALUES(field1), field2 = VALUES(field2), field3 = VALUES(field3);
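For clarity, this is the upsert behaviour I'm relying on, shown with made-up sample values: re-running the import for an existing [source, field1, field2] combination updates the row instead of inserting a duplicate.
-- First run: no matching row exists, so it is inserted.
INSERT INTO mydb.someTable (source, field1, field2, field3)
VALUES (1, 10, 20, 'old')
ON DUPLICATE KEY UPDATE field3 = VALUES(field3);

-- Second run with the same key but a new field3: since field1 and field2 are part of
-- the unique key, only field3 can actually differ, and it becomes 'new'.
INSERT INTO mydb.someTable (source, field1, field2, field3)
VALUES (1, 10, 20, 'new')
ON DUPLICATE KEY UPDATE field3 = VALUES(field3);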
My question is this: Is this the best possible way to do this? Are there any other methods which might be faster, considering there are going to be millions and millions of records, totaling many gigabytes of data per table?