I've got a large-ish MySQL database which contains weather information. The DB is about 3.5 million rows, and 486 MB in size.
I get a refresh of this data every 6 hours in the form of a MySQL dump file that I can import. The import takes more than 2 minutes, and creating the index probably takes a similar amount of time.
Any thoughts on how I can import this data while keeping the DB available and responsive? My first thought was to have two databases within the same MySQL instance: I'd run off DB1, load the fresh data into DB2, and then switch. However, I'm concerned that the load process would make DB1 unresponsive (or slow it down significantly).
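To make the "switch" part of that concrete: one way I'm picturing doing it within a single instance is an atomic RENAME TABLE across schemas once the load into DB2 finishes. Here's a rough sketch in Python; the schema/table names, credentials, and the use of pymysql are just placeholders for illustration, not a working setup:

    import pymysql

    # Rough sketch of the "load into DB2, then switch" idea, assuming one
    # instance with schemas weather_live, weather_staging, and weather_old,
    # and a single table called readings -- all placeholder names.
    conn = pymysql.connect(host="localhost", user="loader",
                           password="secret", autocommit=True)
    with conn.cursor() as cur:
        # ... the dump has already been imported into weather_staging ...
        # Swap the freshly loaded table in for the live one in a single
        # atomic statement; weather_old.readings must not already exist.
        cur.execute(
            "RENAME TABLE weather_live.readings TO weather_old.readings, "
            "weather_staging.readings TO weather_live.readings"
        )
    conn.close()

The appeal is that the rename is atomic, so queries never see a half-loaded table; my worry is still the I/O hit on the instance while the import runs.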
So my next thought is to have two separate MySQL instances running on different ports. While instance 1 is serving queries, instance 2 can be loaded with the next dataset. Then, on the next query, the code switches over to instance 2.
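On the code side, the switch I have in mind is nothing fancier than checking which port is currently "live" before connecting. A sketch of what I mean, with the state file path, ports, and credentials all made up:

    import json
    import pymysql

    # Sketch of the two-instance idea: a small state file records which
    # instance currently holds the freshest data. The path, ports, and
    # credentials below are placeholders.
    STATE_FILE = "/etc/weather/active_instance.json"

    def get_connection():
        """Connect to whichever instance is currently marked live."""
        with open(STATE_FILE) as f:
            port = json.load(f)["port"]   # e.g. 3306 or 3307
        return pymysql.connect(host="127.0.0.1", port=port, user="app",
                               password="secret", database="weather")

    def mark_live(port):
        """Called by the loader once the import into the idle instance is done."""
        with open(STATE_FILE, "w") as f:
            json.dump({"port": port}, f)

That way, queries already running finish against the old instance, and new connections pick up the freshly loaded one.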
That seems like it would work to me, but I wanted to check with people who have tried similar things in the past to see if there are any "gotchas" I'm missing.
Thoughts?