I have two databases hosted on different servers. What is the best way to copy all the contents of a table in the master database into a table in the slave database? I am not the owner of the master database, but the owners are willing to give me access.

Previously, the data from the master database was published via RSS and my PHP script parsed it and inserted it into the other server where the second database is located. Because of the huge amount of data, it takes 24 hours to update and insert everything into the remote database, probably because of the overhead of working across the two databases.

So what we plan now is to create a script that downloads the data from the master database, saves a local copy, FTPs it to the 2nd server, and loads the contents into that database. Is that advisable even though the file (either CSV or SQL) is around 30 MB and still growing? What is the best solution for this?

NOTE: all of the steps, from downloading, to FTP, to inserting into the 2nd database, are handled by cron for automatic updates.
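For illustration, a minimal sketch of the dump, transfer, and load pipeline described in the question could look like the cron job below. The host names, user names, passwords, database and table names, and paths are hypothetical placeholders.

    #!/bin/sh
    # Hypothetical nightly cron job: dump one table from the master database,
    # copy the compressed dump to the second server, and load it there.
    # All hosts, credentials, and paths below are placeholders.

    DUMP=/tmp/master_table.sql.gz

    # 1. Dump the table from the master database and compress it locally
    mysqldump -h master.example.com -u remote_user -pPASSWORD masterdb master_table \
      | gzip > "$DUMP"

    # 2. Transfer the dump to the second server (scp shown; plain FTP works the same way)
    scp "$DUMP" deploy@second.example.com:/tmp/

    # 3. Load the dump into the database on the second server
    ssh deploy@second.example.com \
      "gunzip -c /tmp/master_table.sql.gz | mysql -u local_user -pPASSWORD slavedb"

Even compressed, this still moves the whole table on every run, which is the main reason the answers below suggest replication instead.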

A: 

What type of database are we talking about? Have you looked into replication?

Chris Klepeis
Both are MySQL databases. Not yet; I am still looking at alternatives, but if replication is the best solution I'll go with it.
+2  A: 

You really should consider MySQL master-slave replication. This means every insert/update is also applied on the slave server. The master server needs to be configured to keep a (binary) transaction log, which the slave uses to keep track of updates.

Besides being easy to set up and use, replication also keeps the load low, since it is a continuous process rather than a periodic bulk transfer.
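As a rough sketch of a minimal master-slave setup (the host names, credentials, and binlog coordinates below are placeholders; the real log file and position come from SHOW MASTER STATUS on the master):

    # On the master, in my.cnf: enable the binary log and set a unique server ID
    [mysqld]
    server-id = 1
    log-bin   = mysql-bin

    # On the slave, in my.cnf: a different server ID
    [mysqld]
    server-id = 2

    -- On the master: create a user the slave can replicate as
    GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%' IDENTIFIED BY 'password';

    -- On the slave: point it at the master and start replicating
    -- (MASTER_LOG_FILE / MASTER_LOG_POS come from SHOW MASTER STATUS)
    CHANGE MASTER TO
      MASTER_HOST='master.example.com',
      MASTER_USER='repl',
      MASTER_PASSWORD='password',
      MASTER_LOG_FILE='mysql-bin.000001',
      MASTER_LOG_POS=4;
    START SLAVE;

After an initial copy of the existing data onto the slave, the slave applies every change from the master's binary log, so there is no recurring dump/FTP/import step.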

Wimmer
So does that mean there is nothing to worry about in terms of memory allocation, processing time, etc. that would affect the slave server?
The slave server only has the extra "load" of the updates (inserts, deletes, table drops, reindexing, etc.) initiated from the master. That is better than dumping and restoring everything every x hours.
Wimmer