I am beginning to design a new laboratory test data management system with about 30 test stations.
The system must be able to collect data offline in the event of a network outage.
Each station would keep an up-to-date, read-only copy of the test structures (specifications, entity types, business rules/workflows, etc.) but not of the test data. The actual test data would be stored locally whenever the server cannot be reached.
Some sort of synchronization would be needed to prepare for a network outage: one pass would pull updated test structures down to each station, and another would push test data that was collected while the server was unreachable.
How do you recommend I achieve this? Any caveats?
Ideas / Thoughts:
- Install SQL Server on each machine and write scripts to synchronize the server and the clients (seems expensive and overkill).
- Save the local copy of the data in application-defined raw data files, with synchronization scripts to push it to the server (see the sketch after this list).
- Is there anything built into SQL Server that would let the clients collect data offline?
- Save all data locally, then click a "Transfer" button to push it to the network.
Environment: MS SQL Server running on Windows Server 2008