views: 45

answers: 2
Hi all,

I have developed a Mac application that continuously interacts with a database on a Linux server. As the data grows, fetching it from the server has become increasingly time-consuming, so I am planning to store the required data locally, say in SQLite, and find some mechanism to synchronize it with the database on the Linux server.

Can anyone suggest a way to accomplish this?

Thanks,

Miraaj

+1  A: 

Perhaps you can formulate a caching strategy instead. Look at, for example, memcached. It's difficult to say more without knowing your workload.
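
If caching fits your access pattern, the cache-aside idiom is a common starting point. Here is a minimal sketch in Python, assuming a memcached instance on localhost:11211, the pymemcache client, and a hypothetical query_remote_db() that talks to your Linux server:

    import json
    from pymemcache.client.base import Client

    cache = Client(("localhost", 11211))  # assumed memcached instance

    def fetch_customer(customer_id, ttl=300):
        # Cache-aside: look in memcached first, fall back to the remote DB.
        key = "customer:%s" % customer_id
        cached = cache.get(key)
        if cached is not None:
            return json.loads(cached)
        row = query_remote_db(customer_id)  # hypothetical remote query
        cache.set(key, json.dumps(row), expire=ttl)
        return row

The expire value keeps stale rows from living forever; picking a sensible TTL for your workload is the real tuning knob.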

RdM
+1  A: 

If you are asking about synchronizing data between one source and another: good luck. This has been a significant area of research for years, and to this day it's still mostly application-specific.

Software like SalesLogix would have a central database with a table like so:

site | key | name | address | phone | city | state | last_update

and each remote user would then have a different site code. When synchronization ran based on the last_update field, it would check the site and key together to guard against conflicts where two different remote users might accidentally choose the same unique identifier for a record.
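
To make that concrete, here is a rough sketch (not SalesLogix itself, just the idea) of the pull side in Python with the standard sqlite3 module; get_remote_rows_since() is a hypothetical helper that queries the central database, and the upsert syntax assumes SQLite 3.24 or later:

    import sqlite3

    def pull_changes(local_db_path, get_remote_rows_since):
        # Mirror the central table locally; (site, key) is the composite
        # identity, so records from different sites can never collide.
        conn = sqlite3.connect(local_db_path)
        conn.execute("""CREATE TABLE IF NOT EXISTS contacts (
                            site TEXT, "key" TEXT, name TEXT, address TEXT,
                            phone TEXT, city TEXT, state TEXT, last_update TEXT,
                            PRIMARY KEY (site, "key"))""")
        last_sync = conn.execute(
            "SELECT COALESCE(MAX(last_update), '1970-01-01') FROM contacts"
        ).fetchone()[0]

        # Fetch only rows the server has touched since our last sync.
        for row in get_remote_rows_since(last_sync):
            conn.execute(
                """INSERT INTO contacts VALUES (:site, :key, :name, :address,
                                                :phone, :city, :state, :last_update)
                   ON CONFLICT(site, "key") DO UPDATE SET
                       name=:name, address=:address, phone=:phone,
                       city=:city, state=:state, last_update=:last_update""",
                row,
            )
        conn.commit()
        conn.close()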

If you have something less complex, you could sculpt your data access layer to log transactions against the local database and then replay them the next time you are connected. The problem here, again, is consistency.
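
A bare-bones version of that replay log, in Python against SQLite; apply_remote() is a hypothetical callback that issues the matching UPDATE on the central database:

    import json
    import sqlite3
    import time

    def log_change(conn, table, pk, column, value):
        # Queue a local write so it can be replayed against the server later.
        conn.execute("""CREATE TABLE IF NOT EXISTS pending_ops (
                            id INTEGER PRIMARY KEY AUTOINCREMENT,
                            ts REAL, tbl TEXT, pk TEXT, col TEXT, val TEXT)""")
        conn.execute(
            "INSERT INTO pending_ops (ts, tbl, pk, col, val) VALUES (?, ?, ?, ?, ?)",
            (time.time(), table, pk, column, json.dumps(value)))
        conn.commit()

    def replay_pending(conn, apply_remote):
        # Replay queued writes in order once the server is reachable again;
        # apply_remote() should raise on failure so the op stays queued.
        rows = conn.execute(
            "SELECT id, tbl, pk, col, val FROM pending_ops ORDER BY id").fetchall()
        for op_id, tbl, pk, col, val in rows:
            apply_remote(tbl, pk, col, json.loads(val))
            conn.execute("DELETE FROM pending_ops WHERE id = ?", (op_id,))
        conn.commit()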

If I change objectA.PropertyB to "test1" and you change it to "test3", which one of us is more correct? How do you do collision detection?
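
About the only generic piece is detecting the collision; resolving it is where things get application-specific. A tiny sketch, assuming every row carries the last_update column from the table above:

    def is_conflict(local_row, server_row, last_synced_update):
        # Both sides touched the record since the last successful sync.
        local_changed = local_row["last_update"] > last_synced_update
        server_changed = server_row["last_update"] > last_synced_update
        return local_changed and server_changed

Whether you then pick last-writer-wins, merge field by field, or ask the user is the part no library decides for you.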

I'm afraid I've yet to see a single out-of-the-box solution to this problem that works across every data domain.

Chris Kaminski