Hi, I was wondering if anyone out here has experience with write-intensive workloads caused by file imports.
The main requirement is for business users to be able to import data that represents relationships between master tables. They should then be able to export the same data in real time (or as close to it as possible).
Setup:
- The front-end (PHP) application writes to a MASTER database.
- Replication setup: the master DB is replicated to two SLAVE DB servers.
- One of the SLAVE servers is used as the "read" database for front-end UI interactions (heavy queries).
- The same SLAVE server is also used for EXPORTING data based on a query that has been previewed on the front-end (lots of JOINed tables). A rough sketch of the read/write routing is below.
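To make that concrete, here is a minimal sketch of how reads and writes are routed. Hostnames, credentials, and the table/column names are made up for illustration, not our real schema:

```php
<?php
// Minimal sketch of the read/write split. Hostnames, credentials and
// table/column names below are placeholders.

function connect(string $host): PDO {
    $pdo = new PDO("mysql:host=$host;dbname=app;charset=utf8mb4", 'app_user', 'secret');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    return $pdo;
}

$master = connect('master.db.internal');  // all import writes go here
$slave  = connect('slave1.db.internal');  // heavy UI reads + exports go here

// Import path: write a relationship row to the master.
$master->prepare('INSERT INTO relationship (parent_id, child_id) VALUES (?, ?)')
       ->execute([42, 99]);

// Export preview path: JOIN-heavy read against the slave.
$rows = $slave->query(
    'SELECT a.name AS parent, b.name AS child
       FROM relationship r
       JOIN master_a a ON a.id = r.parent_id
       JOIN master_b b ON b.id = r.child_id'
)->fetchAll(PDO::FETCH_ASSOC);
```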
The main challenge has been replication lag. Users are unhappy with the performance, and with data not showing up on the front-end even though the files they imported have already been processed. Replication lag is the culprit.
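For what it's worth, the lag is easy to observe on the slave, and one workaround would be to temporarily read your own writes from the master while the slave is behind. A minimal sketch (the helper names are hypothetical, and the monitoring user needs the REPLICATION CLIENT privilege):

```php
<?php
// Hypothetical workaround sketch: measure slave lag and serve reads
// from the master while the slave is too far behind.

function slaveLagSeconds(PDO $slave): ?int {
    // Seconds_Behind_Master is NULL when the SQL thread isn't running.
    $status = $slave->query('SHOW SLAVE STATUS')->fetch(PDO::FETCH_ASSOC);
    if ($status === false || $status['Seconds_Behind_Master'] === null) {
        return null;
    }
    return (int) $status['Seconds_Behind_Master'];
}

function readConnection(PDO $master, PDO $slave, int $maxLagSeconds = 5): PDO {
    $lag = slaveLagSeconds($slave);
    // Unknown or excessive lag: fall back to reading from the master.
    return ($lag === null || $lag > $maxLagSeconds) ? $master : $slave;
}
```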
Moving to NoSQL is the long-term goal, but I still want to improve performance for now. By the way, the application is used internally but is hosted with a well-known hosting company. The user base is around 150 users.
Imported files are around 200k - 800k lines, and each line represents a single row. A sketch of the chunked import approach I'm considering is below.
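To give a sense of what a single import pushes through replication, here is a minimal sketch of committing the rows in chunks instead of one autocommit per row (the file path, table, and CSV layout are made up). My understanding is that fewer, larger transactions are gentler on the binlog and the slave's single SQL thread than 800k individual commits:

```php
<?php
// Sketch of a chunked import (hypothetical file/table). Committing every
// few thousand rows avoids 800k single-row autocommits, which should let
// the slave's SQL thread keep up more easily.

$master = new PDO(
    'mysql:host=master.db.internal;dbname=app;charset=utf8mb4',
    'app_user', 'secret',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
);

$stmt = $master->prepare(
    'INSERT INTO relationship (parent_id, child_id) VALUES (?, ?)'
);

$fh = fopen('/tmp/import.csv', 'r');
$batchSize = 5000;
$count = 0;

$master->beginTransaction();
while (($line = fgetcsv($fh)) !== false) {
    $stmt->execute([(int) $line[0], (int) $line[1]]);
    if (++$count % $batchSize === 0) {
        $master->commit();            // flush a chunk to the binlog
        $master->beginTransaction();  // start the next chunk
    }
}
$master->commit();                    // final (possibly empty) chunk
fclose($fh);
```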
Any inputs would be appreciated :)