views: 98 · answers: 4

I have a table in an Access database.

This Access database is used on a regular basis, basically from 9 to 5.

Someone else has a copy of this exact table. Sometimes records are added, sometimes records are deleted, and sometimes the data within the records is updated.

I need to update the Access database table with the offsite table every hour or so. What is the best algorithm for updating the data? There are about 5,000 records.

Would it severely lock up the table for a few seconds every hour?

I would also like to publicly apologize for my rude comment to David Fenton.

+1  A: 

If a few seconds of performance is critical, you'd be better off moving to a more capable database engine (such as SQLite, MySQL, or MS SQL Server). If you want a single file, then SQLite is the best fit for you. All of these use record-level locks, so you can read and write simultaneously.

If you stay with Access, you will probably have to implement a timer and update only a few records at a time.

Pavel Radzivilovsky
I am one of SQLite's biggest fans, but I am not sure I would recommend it for this problem. The network access layer (ODBC or whatever) is not an integral part of the DB but a third-party add-on; also, locking is at file level, not row level.
James Anderson
I agree, but he doesn't use a network, does he?
Pavel Radzivilovsky
Yes, I am actually using a network.
I__
+1  A: 

Before you do anything else, you need to establish the "rules" for how collisions are handled.

If a row in the local copy is updated and the same row in the remote copy is also updated, which one is the "correct" version? Ditto for deletions. Inserts are even more of a pain, as you can have the "same" set of values but perhaps a different key.

After you have worked out how to handle each of these cases you can then go on to thinking about the implementation.
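One common collision rule (just one possibility, not something the answer prescribes) is "last writer wins": whichever site modified the record most recently has its copy taken wholesale. A minimal sketch in Python, assuming each record carries an `updated_at` timestamp field (a hypothetical name for illustration):

```python
from datetime import datetime

def merge_row(local, remote):
    """Resolve an update/update collision with a last-writer-wins rule.

    Each row is a dict carrying an 'updated_at' timestamp; the more
    recently modified copy wins outright. The point of the answer above
    is that you must choose such a rule deliberately, up front.
    """
    return remote if remote["updated_at"] > local["updated_at"] else local

# Example collision: both sites edited record 7 since the last sync.
local  = {"id": 7, "qty": 10, "updated_at": datetime(2010, 5, 1, 9, 0)}
remote = {"id": 7, "qty": 12, "updated_at": datetime(2010, 5, 1, 9, 30)}
print(merge_row(local, remote)["qty"])  # remote edited later, so: 12
```

Note that last-writer-wins silently discards the losing edit; if both sites' changes must survive, you need field-level merging or manual review instead.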

As other posters have suggested, the way to avoid these issues completely is to switch to SQL Server or any other "proper" database that can be updated over the network by all users, and where concurrency issues are handled by the DBMS as the updates are applied.

James Anderson
+1  A: 

Other users have already suggested switching to a server-based database, i.e. SQL Server etc. I would echo this and say it is the best way to go. However, if you are stuck with Access and have no choice, then I would suggest you add a field (with an index) along the lines of "Last Updated". You could then export all records that have been modified within a particular time frame. Export this file as a CSV, ship it over to the remote site, and import it into the "master" Access database. With a bit of scripting you could automate this process.
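The incremental-export step above can be sketched as follows. This is a hedged illustration, not the thread's actual implementation: it uses an in-memory sqlite3 table as a stand-in for the Access table, and the table/column names (`items`, `last_updated`) are invented for the example.

```python
import csv
import io
import sqlite3

def export_changes(conn, since, out):
    """Write every row modified after `since` to a CSV stream.

    Relies on a "Last Updated" style column, as suggested above, so
    only the changed rows travel to the remote site each hour.
    """
    cur = conn.execute(
        "SELECT id, name, last_updated FROM items WHERE last_updated > ?",
        (since,),
    )
    writer = csv.writer(out)
    writer.writerow([col[0] for col in cur.description])  # header row
    writer.writerows(cur)

# Demo: one stale row, one row changed since the last sync cutoff.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT, last_updated TEXT)"
)
conn.executemany("INSERT INTO items VALUES (?, ?, ?)", [
    (1, "stale", "2010-05-01T08:00"),
    (2, "fresh", "2010-05-01T10:30"),
])
buf = io.StringIO()
export_changes(conn, "2010-05-01T09:00", buf)
print(buf.getvalue())  # header plus only the row changed after 09:00
```

In the real setup the CSV file would then be shipped to the remote site and pulled in with something like `DoCmd.TransferText`, as mentioned elsewhere in this thread.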

Kevin Ross
+1  A: 

My impression is that this question ties together pieces you've been exploring with your previous questions:

  1. a file "listener" to detect the presence of a new file and do something with it when found
  2. list files with some extension in a folder
  3. DoCmd.TransferText to pull file data into your database
  4. Insert, Update, Delete records in a table based on an imported set of records

Maybe it's time to give us a more detailed picture of what you're dealing with.

Tony asked if both sites are on the same WAN (Wide Area Network). You replied they are on Windows. Elsewhere you said you're using a network. Please tell us about the network.

I'm still unsure whether you need a one-way or two-way data exchange. You've talked about importing changes from the remote table into the local master table. Do you need to do the same type of operation at the remote site: import changes made to the table at the master site?

Tell us what needs to happen regarding the issue James raised. Can local and remote users ever edit the same record? If they can, how will you resolve the conflict? Similarly, what should happen if a remote user updates a record and a local user deletes their copy of that record?

Based on what you've told us so far, this sounds like a real challenge for Access, made more challenging by the rate of record changes (5,000 per hour). I like the outline Kevin suggested. However, your challenge will be more complicated, since you also need to account for record deletions at both sites.
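Deletions are the awkward case because a "Last Updated" field can't flag a row that no longer exists. One way to detect them (my own sketch, under the assumption that both sites can exchange their full key lists, which is feasible at 5,000 records) is to diff the two copies' primary-key sets:

```python
def diff_keys(master_ids, incoming_ids):
    """Classify primary keys by comparing the two copies' full id sets.

    Keys present only in the incoming copy are inserts; keys missing
    from it were deleted at the other site. Keys in both copies still
    need a timestamp comparison to find updates.
    """
    master, incoming = set(master_ids), set(incoming_ids)
    return {
        "insert": incoming - master,
        "delete": master - incoming,
        "check_for_update": master & incoming,
    }

result = diff_keys(master_ids=[1, 2, 3, 4], incoming_ids=[2, 3, 5])
print(sorted(result["insert"]))  # [5]
print(sorted(result["delete"]))  # [1, 4]
```

The alternative is to keep a "tombstone" table recording deleted keys at each site, which avoids shipping the full key list but adds bookkeeping on every delete.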

It seems like you may have to create something which duplicates Access' Replication feature. Maybe you should look at the Jet Replication Wiki to see if you can modify your design to take advantage of Replication. I can't help you there, and unfortunately you appear to have frustrated David Fenton who is a leading authority on Jet Replication.

HansUp
Jet Replication is likely inappropriate for that volume of updates. Using indirect replication, it could certainly be done (and it uses its own dropboxes and "listener" process to exchange information), but the maximum granularity for a scheduled synch is 15 minutes. If you need lower latency than that, it's not the right solution. In fact, I'd say anything short of a single centrally accessible database is not the right solution.
David-W-Fenton
Thanks, David. I sure do appreciate your contributions in this forum and the newsgroups.
HansUp
So do I! Thank you, David.
I__
I don't know if it's due to @HansUp's link to the Jet Replication Wiki, but I just checked and it now comes up before MS's Replication pages in Google when you search on "Jet Replication." Of course, MS is probably purging as much replication documentation from their websites as possible, so maybe not... "Access Replication" still pretty much ignores my Wiki, even though I include that term in the title of every page.
David-W-Fenton
@David I wonder if you might get more visibility for the wiki by substituting "MS Access" for "Microsoft Office Access" on those pages.
HansUp
Well, until Office 2010, that was their name for the product. At least, based on the shortcuts and the MS website. It's a moving target, and I'm not really interested in investing time in it!
David-W-Fenton