I make extensive use of source control for anything that relates to a project I'm working on (source, docs etc.) and I've never lost anything that way.

However, I have had two or three crashes (spread over the last 4 years) on my development machine that forced me to reinstall my system and reconfigure my apps (eclipse, vim, Firefox, etc.). For weeks after reinstalling, I was missing one little app or another, some PHP or Python module wasn't there, stuff like that.

While this is not fatal, it's very annoying and sucks up time. Because it seemed so rare, I didn't bother with an actual solution, but meanwhile I've developed a mindset where I just don't want stuff like that happening anymore.

So, what are good backup solutions for a development machine? I've read this very similar question, but that guy really wants something different from what I'm after.

What I want is to have spare hard drives on the shelf and reduce my recovery time after a crash to something like an hour or less.

Thinking about this, I figured there might also be a way to use the backup mechanism for keeping two or more dev workstations in sync, so I can continue work at a different PC anytime.


EDIT: I should've mentioned that

  • I'm running Linux
  • I want incremental backup, so that it's cheap to do it frequently (once or twice a day)

RAID is good, but I'm on a laptop most of the time; there's no second hard drive in there, no E-SATA, and I'm not sure about RAIDing to a USB drive: would that actually work?

I've seen sysadmins use rsync, has anybody had any experiences with that?

A: 

You can use RAID-1 for that. It covers the synchronization part, though, not the backup part.
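
On Linux, such a mirror can be built in software with mdadm; a minimal sketch, assuming two spare partitions (the device names below are placeholders):

# create a two-disk software RAID-1 mirror (this overwrites whatever is on both partitions)
sudo mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda1 /dev/sdb1
# watch the initial synchronization
cat /proc/mdstat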

kmkaplan
probably best option as it needs to be real time
willcodejavaforfood
+1  A: 

Cobian Backup is a reliable backup system for Windows that will perform scheduled backups to an external drive.

Simon C.
+1  A: 

You could create a hard drive image. Restoring from a backup image restores everything to the exact state it was in at the time you took the image. Or you could create an installer that installs just about everything needed.

baretta
This is ideal. For me, I create bootable backup images, so my downtime is nil.
Jweede
+3  A: 

You could create an image of your workstation after you've installed & configured everything. Then when your computer crashes, you can just restore the image.

A (big) downside to this is that you won't have any updates or changes you've made since you created the image.

Bart S.
Actually, I like that, but the downside does seem rather big to me. Is there any way to do differential images?
Hanno Fietz
Periodically installing the image, applying any updates, and "saving" the new image alleviates the missing-updates problem. Also, running in a virtual machine that has pretty much all available resources will make this easier.
cdeszaq
A: 

I use RAID mirroring in conjunction with an external hard drive and Vista's system backup utility to back up the entire machine. That way I can easily fix a hard drive failure, but in the event my system becomes corrupted, I can restore from the E-SATA drive (which I only connect for backups).

Full disclosure: I've never had to restore the backup, so it's kind of like the airbag in your car; hopefully it works when you need it, but there's no way to be sure. Also, the backup process is manual (it can be automated) so I'm only as safe as the last backup.

This is a recipe for an epic FAIL. You must test your restore procedure, especially while you're not under stress. Break your RAID mirror first, then test that you can restore the mirror correctly. Next, keep one drive safely to the side, and use the other drive to see that the system backup works.
Brandon Corfman
I have successfully rebuilt the RAID. As for the system backup, you've described an infinite loop: I back up my system, then restore it, and then I'm in the same boat until I back up again. Each change between backups takes me further from a complete restore, and each backup adds the possibility of corruption. Restoring every backup is like avoiding a car crash by not driving anywhere. I'm happy enough being better protected than 99% of computer users.
A: 

You can use the Linux "dd" command line utility to clone a hard drive. Just boot from a Linux CD, clone or restore your drive, and reboot. It works great for Windows/Mac drives too.

This will clone partition 1 of the first hard drive (/dev/sda) to partition 1 of the second drive (/dev/sdb)

dd if=/dev/sda1 of=/dev/sdb1

This will clone partition 1 of the first hard drive to a FILE on the second drive.

dd if=/dev/sda1 of=/media/drive2/backup/2009-02-25.iso

Simply swap the values for if= and of= to restore the drive.

If you boot from the Ubuntu live CD, it will automount your USB drives, making it easy to perform the backup/restore with external drive(s).

CAUTION: Verify the identity of your drives BEFORE running the above commands. It's easy to overwrite the wrong drive if you're not careful.
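
For example, you can list the attached drives and their partitions first; both commands are standard on Linux live CDs:

sudo fdisk -l    # list all detected drives and their partition tables
sudo blkid       # show filesystem labels and UUIDs, which helps tell drives apart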

Chris Nava
To clone partitions, Partimage can also be used: http://www.partimage.org/Main_Page. It is better than dd because it does not back up disk space not occupied by files and can, optionally, zero out this space on restore.
dmityugov
+1  A: 

Since you expressed interest in rsync, here's an article that covers how to make a bootable backup image via rsync for Debian Linux:

http://www.debian-administration.org/articles/575

Rsync is fast and easy for local and network syncing and is by nature incremental.
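
For illustration only (the mount point and exclude list are placeholders, not taken from the article), a full-system rsync to an external drive mounted at /mnt/backup might look like this:

# copy the whole system, skipping virtual filesystems; --delete keeps the copy in sync
sudo rsync -aAXv --delete \
    --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/tmp \
    --exclude=/mnt --exclude=/media \
    / /mnt/backup/

For dated, space-efficient snapshots, rsync's --link-dest option can hard-link unchanged files against the previous backup instead of copying them again.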

Jweede
+2  A: 

I would set up the machine how you like it and then image it. Then you can set up rsync (or even SVN) to back up your home dir nightly.

Then when your computer dies, you can reimage and then redeploy your home dir. The only problem would be upgraded or newly installed software; the only way to deal with that completely would be full nightly backups of your drive(s).
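
A minimal sketch of the nightly home-directory part, assuming a cron entry and a reachable backup host (the file name, user, and host below are placeholders):

# /etc/cron.d/home-backup: sync the home dir to a backup host every night at 02:00
0 2 * * * devuser rsync -az --delete /home/devuser/ backuphost:/backups/devuser/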


Thanks, this sounds like a good suggestion. I think it should be possible to also update the image regularly (to get software updates and installs), but maybe not that often. E.g., I could boot the image in a VM and perform a global package update or something.

Hanno

prestomation
A: 

I guess this is not exactly what you are looking for, but I just document everything I install and configure on a machine. Google Docs lets me do this from anywhere and keeps the document intact when the machine crashes.

A good step-by-step document usually reduces the recovery time to a day or so.

dmityugov
Well, maybe I could "document" these steps by shell-scripting them and putting that into source control. On Linux, it's command line calls anyway, most of the time. So instead of just typing "apt-get install foo", I also put it in a script. One day would still be bad, though, at least that's what I think nowadays.
Hanno Fietz
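
A rough sketch of that scripted approach (the file name and package names are just examples, not a list from this thread):

#!/bin/sh
# setup.sh - reinstall the tools this machine needs; kept in source control
set -e
sudo apt-get update
sudo apt-get install -y vim eclipse firefox php5-cli python-setuptools
# restore dotfiles that live in the same repository
cp -r dotfiles/. "$HOME"/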
A: 

If you use a Mac, just plug in an external hard drive and Time Machine will do the rest, creating a complete image of your machine on the schedule you set. I restored from a Time Machine image when I swapped out the hard drive in my MacBook Pro and it worked like a charm.

One other option that a couple of guys use at my company is to have their development environment on a large Linux server. They just use their local machines to run an NX client to access the remote desktop (NX is much faster than VNC) - this has the advantages of fast performance, automatic backup of their files on the server, and the fact that they're developing on the same hardware that our customers use.

gareth_bowles
For the Mac, a program like CarbonCopyCloner or SuperDuper is also useful for incremental, bootable backups.
Jweede
A: 

No matter what solution you use, it is always a good idea to have a secondary backup, too. The secondary backup should be off-site and include your essential work (source code, important docs). In case something happens to your main site (fire at the office, somebody breaks in and steals all your hardware, etc.), you would still be able to recover, eventually.

There are many online backup solutions. You could just get remote storage from a reliable provider (e.g. Amazon S3) and sync your work on a daily basis. The solution depends on the type of access you can get, but rsync is probably the tool you would use for that.
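
For illustration (host, bucket, and paths are placeholders), a daily off-site sync could be as simple as:

# push source and docs to an off-site server over SSH
rsync -az --delete ~/work/ user@offsite.example.com:backups/work/
# or, with a tool such as s3cmd, to an Amazon S3 bucket
s3cmd sync ~/work/ s3://my-backup-bucket/work/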

zvikico