I have a centrally hosted database (MS SQL Server), and distributed clients save data to it over the Internet. When the Internet connection goes down, a client starts storing new data locally in a SQLite instance. When the connection comes back online, the accumulated local data is moved (inserted) into the central db.
What's the best way to handle unique IDs in this scenario? Other important info:
- Local clients need to remember their local db IDs for linking purposes while offline; therefore local IDs must be globally unique from the moment they are generated.
- A unique ID assigned in a local db must not change when the record is merged back into the central db.
- Different kinds of databases might be chosen for local storage in the future depending on the device (e.g. a JavaDb, a TextFileDb, etc.), so the solution should not rely on any single proprietary db feature.
- When changes are merged back to the central db, all records must have unique IDs.
For this question, assume a simple table named MyTable with fields: ID (the primary/unique key, whatever type that should be) plus Field1, Field2, and Field3. The latter don't really matter.
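To make that concrete, here is roughly how I picture the local copy of MyTable. This is only a sketch: I'm assuming the Microsoft.Data.Sqlite ADO.NET provider and a local file called local.db, and the ID is shown as TEXT only as a placeholder, since the ID type is exactly what I'm unsure about.

```csharp
using Microsoft.Data.Sqlite;

// Create the local fallback copy of MyTable used while offline.
// The ID column type here is a placeholder; picking the right kind of
// globally unique ID is the whole question.
using (var conn = new SqliteConnection("Data Source=local.db"))
{
    conn.Open();
    using (var cmd = conn.CreateCommand())
    {
        cmd.CommandText = @"CREATE TABLE IF NOT EXISTS MyTable (
                                ID     TEXT PRIMARY KEY,
                                Field1 TEXT,
                                Field2 TEXT,
                                Field3 TEXT)";
        cmd.ExecuteNonQuery();
    }
}
```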
What first came to mind is using GUIDs. Is there a pattern or practice that handles this better, or a better way to carry it out?
Edit: Using Microsoft .NET and ADO.NET
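To show what I have in mind in ADO.NET terms, here is a minimal sketch of the offline insert and the later merge. The connection strings, sample values, and the Microsoft.Data.Sqlite / System.Data.SqlClient providers are just assumptions for illustration; the point is only that the ID is generated on the client and carried unchanged into the central db.

```csharp
using System;
using System.Data.SqlClient;
using Microsoft.Data.Sqlite;

// While offline: generate the ID on the client so it is globally unique from the start.
Guid id = Guid.NewGuid();

using (var local = new SqliteConnection("Data Source=local.db"))
{
    local.Open();
    using (var cmd = local.CreateCommand())
    {
        cmd.CommandText =
            "INSERT INTO MyTable (ID, Field1, Field2, Field3) VALUES ($id, $f1, $f2, $f3)";
        cmd.Parameters.AddWithValue("$id", id.ToString()); // GUID stored as text locally
        cmd.Parameters.AddWithValue("$f1", "value1");
        cmd.Parameters.AddWithValue("$f2", "value2");
        cmd.Parameters.AddWithValue("$f3", "value3");
        cmd.ExecuteNonQuery();
    }
}

// When the connection is back: insert the row into the central db with the SAME ID,
// so any links the client made while offline still resolve.
using (var central = new SqlConnection("Server=central;Database=AppDb;Integrated Security=true"))
{
    central.Open();
    using (var cmd = central.CreateCommand())
    {
        cmd.CommandText =
            "INSERT INTO MyTable (ID, Field1, Field2, Field3) VALUES (@id, @f1, @f2, @f3)";
        cmd.Parameters.AddWithValue("@id", id); // uniqueidentifier column on the server
        cmd.Parameters.AddWithValue("@f1", "value1");
        cmd.Parameters.AddWithValue("@f2", "value2");
        cmd.Parameters.AddWithValue("@f3", "value3");
        cmd.ExecuteNonQuery();
    }
}
```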