views:

22

answers:

1

I have two ScrewTurn wiki documentation sites that are used for our system and user documentation. My idea is to create a Mercurial repository in each wiki site's root directory. Then, on a daily basis, a scheduled process would add new files, commit changes to the repository, and push the changeset to a backup repository.
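
Something like the following Python sketch is what I have in mind for the scheduled job. The wiki root paths and the commit message are placeholders, and it assumes Mercurial (hg) is on the PATH, that each wiki root has already been initialized with hg init, and that a default push path is configured:

    import subprocess
    from datetime import date

    # Placeholder wiki root directories, each already initialized with "hg init".
    WIKI_ROOTS = [
        r"C:\inetpub\wwwroot\SystemWiki",
        r"C:\inetpub\wwwroot\UserWiki",
    ]

    def backup(repo_path):
        # Stage any files the wiki has added or deleted since the last run.
        subprocess.check_call(["hg", "addremove"], cwd=repo_path)

        # Commit the day's changes; hg exits with status 1 when there is
        # nothing to commit, so only push when a changeset was created.
        message = "Nightly wiki backup %s" % date.today()
        if subprocess.call(["hg", "commit", "-m", message], cwd=repo_path) == 0:
            # Push to the backup repository configured as the default path.
            subprocess.check_call(["hg", "push"], cwd=repo_path)

    if __name__ == "__main__":
        for root in WIKI_ROOTS:
            backup(root)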

I realize that, by default, ScrewTurn creates copies of all changed files and therefore has its own change tracking, but I am considering turning that behavior off.

I believe this would give me better version control than the default behavior, plus an automated backup.

Are there some considerations that I am missing? Is this a good idea? A bad idea?

+1  A: 

I don't know anything about ScrewTurn, but so long as its files are stored as text and you can disable its revision tracking, Mercurial backups are a fine option. You'll of course only have access to revisions that existed at the time of your cron job, but that also means you won't ever lose more than 24 hours of editing work.

Incidentally, mpm, Mercurial's primary author, has talked about using DVCS systems as the backends for wiki systems in the past and was generally not in favor of the idea. If I recall correctly, his reasoning was that using a datastore that acquires a global lock for something that changes only a page at a time doesn't make much sense. However, that would only apply if you were committing after each change; your plan to commit nightly doesn't have that problem.

Alternatively, I'm a big fan of rdiff-backup, which does space-efficient nightly snapshots in a disk-browsable manner.
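
As a rough sketch of how that might be scheduled (the paths are placeholders, and it assumes rdiff-backup is installed):

    import subprocess

    # Placeholder paths: rdiff-backup mirrors SOURCE into DESTINATION so the
    # latest snapshot is browsable on disk, and keeps reverse increments for
    # older versions under DESTINATION/rdiff-backup-data.
    SOURCE = "/var/www/wiki"
    DESTINATION = "/backups/wiki"

    # Basic invocation: copy the current state and record an increment.
    subprocess.check_call(["rdiff-backup", SOURCE, DESTINATION])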

Ry4an
Thank you for your comments. You are right about only getting daily change history with this approach. For this purpose, I think it is adequate.
Jim Reineri
One thing I did not mention as part of my motivation for using this mechanism is that I already have an off-site Mercurial service in place for source code versioning, and it is very easy to leverage that existing service rather than implement an additional one.
Jim Reineri