I have a distributed system which uses XML files in an SVN repository as a means of communication (as you may guess from that, communication is normally slow). Some processing nodes add data to or delete data from the repository, and another processing node takes this as an input event and reacts accordingly.

At the moment, everything's fine, but I'm wondering if I may run into problems with SVN auto-merge.

Let's say one node assigns a specific job to another node by adding a line to an XML file; the line points to another XML file containing the job details. This looks like the following:

----File: nodename.xml----
<assignments>
  <file>537f7acb382326.xml</file>
  <file>572919a90c0234.xml</file>
  <file>58a7c3e2015922.xml</file>
</assignments>

Every file listed there is a job assignment for the host nodename. Once a job has been executed, the assignee deletes the corresponding line.

From the perspective of the SVN repository, I have one "adder" and one "read-and-deleter": one node only adds lines to that file, and one node only deletes lines from it. I can guarantee that only one node may add and only one node may delete. However, although the file names carry unique IDs, a job may be re-added, so the same line may be added and deleted concurrently, which could theoretically cause problems.

My question is: Is there a guarantee that SVN automatically merges a concurrent "add line" and "delete line" without user interaction? If not, is there a guarantee if I add a timestamp to every added line, rendering each line unique?
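One way to see how a diff3-style three-way merge (the kind of textual merge the Subversion client performs on update) handles this scenario is to replay it with the `diff3` tool directly. This is only a sketch of the merge algorithm's behavior, not of SVN itself; the file contents reuse the example above:

```shell
# Replay the concurrent add + delete as a three-way merge with diff3,
# which approximates what the SVN client does for text files on update.
dir=$(mktemp -d)

# Common ancestor: the file as both nodes last saw it.
cat > "$dir/base.xml" <<'EOF'
<assignments>
  <file>537f7acb382326.xml</file>
  <file>572919a90c0234.xml</file>
</assignments>
EOF

# The "adder" appends a new assignment line.
cat > "$dir/adder.xml" <<'EOF'
<assignments>
  <file>537f7acb382326.xml</file>
  <file>572919a90c0234.xml</file>
  <file>58a7c3e2015922.xml</file>
</assignments>
EOF

# The "read-and-deleter" removes a finished assignment.
cat > "$dir/deleter.xml" <<'EOF'
<assignments>
  <file>572919a90c0234.xml</file>
</assignments>
EOF

# diff3 -m prints the merged result; a nonzero exit status would mean
# the two changes overlapped and produced a conflict.
merged=$(diff3 -m "$dir/adder.xml" "$dir/base.xml" "$dir/deleter.xml")
echo "$merged"
```

Here the added line and the deleted line sit in different hunks, so the merge goes through cleanly; the risk is when both changes touch the same or adjacent lines, which is exactly the re-add case the question worries about.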

A: 

That's a tricky question. I don't know the answer, but the best way to find out is to try it. Manually go through the steps that could cause a problem, and see how it behaves. It certainly wouldn't hurt to add a timestamp to guarantee uniqueness.
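If you do add timestamps, a minimal sketch of what the adder could emit is shown below. The `ts` attribute is an invented detail, not part of the original format; the consumer would have to ignore it when resolving the file name:

```shell
# Hypothetical sketch: stamp each added line with the epoch time so that a
# re-added job never recreates an earlier line byte-for-byte, keeping the
# merge hunks distinct. The ts attribute is an assumption for illustration.
job_id="537f7acb382326"
line="  <file ts=\"$(date +%s)\">${job_id}.xml</file>"
echo "$line"
```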

Adam Crume