Why not do this using Subversion? The linked article details how the author synchronises and stores history using source control (you don't have to use Subversion, obviously; there are alternatives).
Looking at what you've done, this should work ... you just need to ensure that each client gets synced to the server after you're finished working on it. I use the following, which I invoke manually on a per-directory basis:
function syncDown() {
    f=${1%/}/;   # normalise the argument so it ends in exactly one slash
    rsync -acuvz --exclude 'CVS' --exclude '*.class' --exclude '.classpath' server:projects/$f $f;
}
function syncUp() {
    f=${1%/}/;   # normalise the argument so it ends in exactly one slash
    rsync -acuvz --exclude 'CVS' --exclude '*.class' $f server:projects/$f;
}
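The path-normalisation step at the top of each function just makes sure the directory argument ends in a single slash, since rsync treats `dir` and `dir/` differently. A minimal standalone sketch of that normalisation (the `normalize` helper name is made up for illustration):

```shell
# Sketch: ensure a path ends in exactly one trailing slash, as the
# first line of syncUp/syncDown does with its parameter expansion.
normalize() {
  printf '%s\n' "${1%/}/"   # strip one trailing slash (if any), then append one
}

normalize projects/foo      # -> projects/foo/
normalize projects/foo/     # -> projects/foo/
```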
If you're looking for unattended, automated synchronization, then you're not going to get it: you'll always have race conditions where you work on one client but that work gets overwritten by a sync from another.
It looks like you probably already know this, but, just to emphasize the point for those who may see this question in the future:
rsync only does one-way synchronization. If you want bi-directional sync, you need to use something else. (cvs/svn/git/etc. would be appropriate "something else"s, but a revision control system may not be the optimal choice if you don't need an update history.)
In practical terms, this means that if you're rsyncing from A to B, then each sync will make the directory on B look exactly like the directory on A: any changes made on B since the last sync will be lost (barring excludes, and with the caveat that rsync will only delete files if --delete is specified). This sort of arrangement, with an authoritative master version which is then pushed out to other locations, is appropriate in many cases, but any sort of collaborative work is not among them.
rsync is good for keeping one location in sync with a master; in other terms, mirroring A to B. That's not what you're doing, though: you'd have to rsync A to B and B to A, which brings a whole new set of problems. If a file disappeared, do you need to delete it on the other side, or rsync it back? Maybe it was modified on the other side; you can't tell.
Anyway, the solution to this problem comes in the form of unison. It's a tool (works on Linux, OS X, Windows, BSD, ...; has CLI tools and GUI tools, and can be scheduled nicely from cron) which will keep your home directory, or any other directory, nicely in sync, and it is built to deal with almost any type of conflict or problem. Its authors have thought all this through far better than we could here.
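As a concrete illustration, a unison profile for keeping a home directory in sync between two machines might look like the following (the host name, paths, and ignore patterns here are assumptions for the sketch; see the unison manual for the full option list):

```
# ~/.unison/home.prf -- hypothetical example profile
root = /home/alice
root = ssh://server//home/alice

# Skip things that shouldn't be propagated
ignore = Name *.class
ignore = Path .cache

# Don't resolve conflicts automatically; prompt instead
auto = false
```

You'd run it with `unison home`, or from cron with the `-batch` flag so it skips the prompts and only propagates non-conflicting changes.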
Alternatively, there are SCMs. Many people use SCMs for managing their home directories. Subversion is popular for this, but I wouldn't recommend it at all: it consumes massive amounts of space, makes everything horribly slow, and makes staying in sync depend on an active connection to the master repository. There are alternatives, like Git and others, but they all have their downsides.
Either way, any SCM-based solution violates one very big rule of SCMs: you should never keep big binary data in them. SCMs are not made for this. You don't keep your photo collections, movies, documents, downloads, and the like in an SCM, even though you may want to keep them in sync or keep a history of them (especially so for pictures and documents).
It's important to understand that there is a difference between keeping backups and keeping in sync. Your backups should be kept in a remote/detached location and can contain a history of everything you own. I personally recommend rdiff-backup for this. It keeps a history of everything beautifully, uses the rsync algorithm under the hood to minimize traffic, and the backup location looks like the most current state of the backup: you can browse through it just like normal files.
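To make that concrete, a typical rdiff-backup session looks something like this (the host name and paths are examples, not your setup):

```
# Back up the home directory to a remote host:
$ rdiff-backup /home/alice backuphost::/backups/alice

# The backup location mirrors the current state; the history lives in
# the rdiff-backup-data/ directory inside it. Restore a file as it
# was ten days ago:
$ rdiff-backup --restore-as-of 10D backuphost::/backups/alice/notes.txt notes.txt
```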
To summarize, I recommend you combine unison and rdiff-backup for an all-round solution to keeping your data safe and reliably in sync.