Hi Guys,

I'm currently signed up with a third-party service that hosts my Mercurial repositories, acting as a central hub that I push my changes to as a sort of backup.

Now I'm looking at a system to back up my laptop, and am considering Mozy. I'm a lone developer working on a laptop, usually connected to the internet via wifi, and the laptop is only really on when I'm working, so I feel something like Mozy is my best option.

My question is: if I'm the only developer, could I get away with just using local Mercurial repos and using Mozy to back everything up, rather than pushing to an external repo?

Many thanks

Matt

A: 

Disclaimer: My experience is with git rather than hg, but as I understand it the concepts apply equally to both systems.

An advantage of backing up to a remote repo is that if your local repo becomes corrupted (perhaps due to a problem with the underlying filesystem), that corruption does not get transferred over to the backup, unless the files in your working tree themselves are corrupted.

For example, it's possible for some of the objects in the repository, perhaps those which are rarely accessed because you don't change them, to become corrupted. It could be months before you use one of those files again, and so months before you notice (though I think doing a garbage-collection run, e.g. `git gc`, will detect corruption).

So if you are backing up by pushing commits, you're creating an independent copy of those objects, and using checksums (i.e. the commit hash) to verify the transfer of any new files. Whereas if you are backing up to a backup provider, you're duplicating the actual objects in the repo, in whatever state they are in, and duplicating any changes to those files, including corruption of them.
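To illustrate the checksum point: in Git, an object's ID is just a SHA-1 over a small header plus its content, so the receiving side can recompute the hash and detect any corruption of a transferred object. A minimal sketch (the function name is my own, not a git API):

```python
import hashlib

def git_blob_hash(content: bytes) -> str:
    # Git names a blob by the SHA-1 of "blob <size>\0" followed by the
    # raw content, so any corruption of the bytes changes the object's ID.
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

# Matches `echo "hello" | git hash-object --stdin`
print(git_blob_hash(b"hello\n"))  # ce013625030ba8dba906f756967f9e9ca394464a
```

Mercurial works the same way in spirit: changesets are identified by hashes of their content, so a push either transfers intact objects or fails, while a file-level backup tool copies whatever bytes are on disk.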

Usually backup providers will give you rollback (SpiderOak seems to be particularly good for this), but you'll still have to sift through a lot of versions to figure out when the corruption happened; also, with some providers the rollback period is limited (especially for free accounts).

intuited
Ahhh, good point. My main reason is to keep costs down. Maybe I can look at pushing files to my webserver instead of a 3rd party.
Matt Brailsford
sshfs (if you have shell access but they don't have `hg` installed) or curlftpfs (if you just have ftp access) can be useful for this. I seem to recall some complications with curlftpfs though, something about it not being able to replace files normally/atomically.
intuited
You could still use Mozy or any other backup provider by pushing to another local repository and then backing up that directory. This could be set up using a commit hook that automatically pushes to the other directory.
mfperzel
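As a sketch of mfperzel's suggestion, the working repo's `.hg/hgrc` could look something like this (the paths are hypothetical; Mozy would then be pointed at the backup directory):

```ini
[paths]
# a second local clone on the same machine, included in the Mozy backup set
backup = /home/matt/hg-backup/myproject

[hooks]
# after every commit, push the new changeset to the backup clone
commit = hg push backup
```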
@mfperzel: good point, the chances that both local repos would get corrupted in a non-noticeable way are pretty slim, even if they're both on the same filesystem.
intuited
Cheers guys, that sounds like a good option
Matt Brailsford