I have run into a slight problem. The story goes as follows:
I have a document archive system (written in PHP) which runs at multiple clients (23 at present). Each client's system only holds its own documents. Every night, they all need to be 'synced' to a master database on-site (the central server). I have access to each client's MySQL database from the central server, so connecting to them is no problem.
I have a script that connects to the client database and selects all the entries from a table where the sync column = '0000-00-00 00:00:00' (the default, indicating the record hasn't been synced). It then iterates through each record, inserts it into the central server, and sets the sync time on the client database record to the time the script was executed. This works, but it obviously carries a large overhead from all the individual queries, and I have only just noticed the problem.
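Simplified, the loop looks something like this (the connection details, the `documents` table and its columns are just placeholders, not my real schema):

```php
<?php
// Simplified version of the nightly sync loop for one client.
// Connection details, "documents" and its columns are placeholders.
$client  = new PDO('mysql:host=client1.example.com;dbname=archive', 'user', 'pass');
$central = new PDO('mysql:host=localhost;dbname=archive', 'user', 'pass');
$client->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$central->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$runTime = date('Y-m-d H:i:s'); // time the script was executed

// All records that have never been synced.
$unsynced = $client->query(
    "SELECT doc_id, title, body FROM documents WHERE sync = '0000-00-00 00:00:00'"
);

$insert = $central->prepare(
    "INSERT INTO documents (doc_id, title, body) VALUES (:doc_id, :title, :body)"
);
$markSynced = $client->prepare(
    "UPDATE documents SET sync = :sync WHERE doc_id = :doc_id"
);

// One INSERT on the central server and one UPDATE on the client per
// document -- this per-row round-tripping is where the time goes.
foreach ($unsynced as $row) {
    $insert->execute([
        ':doc_id' => $row['doc_id'],
        ':title'  => $row['title'],
        ':body'   => $row['body'],
    ]);
    $markSynced->execute([':sync' => $runTime, ':doc_id' => $row['doc_id']]);
}
```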
Each client can generate 2000-3000 documents a day. At those volumes the sync takes far too long (roughly 1 second per 2 documents).
Is there a better solution to my problem? Preferably a PHP-scripted one, as I need to log whether everything was successful.
Thanks
EDIT: My current process is:
1. Select all the un-synced data
2. Begin transaction
3. Insert the record into the central database server
4. Select the document record from the client
5. Insert the document into the central database server
6. Update the sync column on the client
7. Update the sync column on the server
8. Commit transaction
This script runs on the central server. Now that I come to think of it, I could remove step 7 and make it part of step 5, but that won't reduce the processing time by much.
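For reference, one iteration of that transaction looks roughly like this (`$client`, `$central` and `$runTime` as in the earlier sketch, `$rec` coming from the step 1 result set; all table and column names are placeholders):

```php
// One iteration of the per-document transaction listed above.
$central->beginTransaction();   // step 2
$client->beginTransaction();
try {
    // Step 3: insert the record row on the central server.
    $central->prepare(
        "INSERT INTO records (record_id, client_id, title) VALUES (:id, :client, :title)"
    )->execute([':id' => $rec['record_id'], ':client' => $clientId, ':title' => $rec['title']]);

    // Step 4: fetch the matching document from the client.
    $stmt = $client->prepare("SELECT * FROM documents WHERE record_id = :id");
    $stmt->execute([':id' => $rec['record_id']]);
    $doc = $stmt->fetch(PDO::FETCH_ASSOC);

    // Step 5: insert the document on the central server.
    $central->prepare(
        "INSERT INTO documents (record_id, body) VALUES (:id, :body)"
    )->execute([':id' => $doc['record_id'], ':body' => $doc['body']]);

    // Step 6: mark the record as synced on the client.
    $client->prepare("UPDATE records SET sync = :sync WHERE record_id = :id")
           ->execute([':sync' => $runTime, ':id' => $rec['record_id']]);

    // Step 7: mark the record as synced on the server.
    // (This is the step that could be folded into step 5 by including
    // the sync value in that INSERT.)
    $central->prepare("UPDATE records SET sync = :sync WHERE record_id = :id")
            ->execute([':sync' => $runTime, ':id' => $rec['record_id']]);

    // Step 8: commit on both sides.
    $client->commit();
    $central->commit();
} catch (Exception $e) {
    $client->rollBack();
    $central->rollBack();
    // log the failure for the nightly report here
    throw $e;
}
```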