views:

46

answers:

3

Hi,

The biggest hurdle I have in developing an effective backup strategy is doing some sort of offsite backup. Unfortunately, that means uploading the data to the offsite destination, and my cable connection's upload speed makes this prohibitive.

Has anyone here managed to do offsite backups of large libraries of source code?

This question is aimed at home users, not the workplace, where budgets may open up more options.

EDIT: I am using Windows Vista (So 'nix solutions aren't relevant).

Thanks

+3  A: 

I don't think your connection's upload speed will be as prohibitive as you think. Just make sure you look for a solution where your changes can be sent as diffs. Even if the initial sync takes days, the daily changes would likely be manageable.

Knowing a few more specifics about how much data you are talking about, and exactly how slow your connection is, would let the community make more specific suggestions.

jphofmann
I would say 200 MB max of source code, and because it's just text that's not really a problem. However, I also back up music to Jungle, and that's where the practical issues lie.
dotnetdev
If I may ask, what source repository, if any, are you using for your code? Depending on what you've chosen, it may make sense to pay a few dollars to have your repository hosted by a reputable firm that specializes in exactly that. Losing code makes me shiver; it's a small price to pay for enterprise-level backup. Even then, I keep a local copy of my code to act as yet another offsite backup relative to the host. As far as the music goes, how fast do you consider a practical solution to be? To me, as long as I could back it up faster than I created it, it would be fast enough.
jphofmann
Also, you may have luck with this question on superuser.com, Stack Overflow's sister site. Backup also falls squarely under general computer usage.
jphofmann
A: 

Services like Mozy allow you to back up large amounts of data offsite.

They upload slowly in the background, and the initial sync to the servers can take a while depending on your speed and amount of data, but after that they use efficient diffs to keep the stored data in sync.
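Mozy's client is proprietary, but the change-detection idea behind incremental sync can be illustrated with a simple hash manifest: only files whose content hash differs from the last run get re-uploaded. This is a rough sketch of the concept, not Mozy's actual mechanism; all names here are illustrative.

```python
import hashlib
import os

def file_hash(path):
    """SHA-256 of a file's contents, read in chunks to handle large files."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def changed_files(root, state):
    """Return files under root whose hash differs from the stored manifest.

    `state` is a dict mapping path -> last known hash; it is updated in
    place, so a second run with no changes returns an empty list.
    """
    changed = []
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            digest = file_hash(path)
            if state.get(path) != digest:
                changed.append(path)
                state[path] = digest
    return changed
```

Only the paths returned by `changed_files` would need to go over the slow uplink on each run, which is why a daily incremental backup of a small source tree stays manageable even on a poor connection.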

The pricing is very home-friendly too.

orip
A: 

I think you have to define what backup means and what's acceptable to you.

At my house, I have a hot backup of our repositories: I poll svn once an hour over the VPN, and it pulls down any check-ins. This just catches check-ins that would otherwise wait for the normal 24-hour backup. I also send a full backup through the pipe every 2 days, to sit outside the normal 3-tier backup we do at the office. Our current repository is 2 GB zipped at max compression. That takes 34 hrs at 12 KB/s and 17 hrs at 24 KB/s; you didn't say the speed of your connection, so it's hard to judge whether that's workable.
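The hourly svn poll described above amounts to running `svn update` against a local mirror on a schedule; since `svn update` only transfers changed files, each poll moves very little data over the VPN. A minimal sketch (the working-copy path is hypothetical, and on Vista the call could be scheduled with Task Scheduler rather than a long-running process):

```python
import subprocess

WORKING_COPY = r"C:\backup\project-mirror"  # hypothetical local mirror path

def build_update_cmd(working_copy):
    # 'svn update' pulls down only the files changed since the last poll,
    # so an hourly run over a slow link stays cheap.
    return ["svn", "update", working_copy]

def poll_once(working_copy=WORKING_COPY, runner=subprocess.run):
    # runner is injectable so the polling step can be exercised without
    # an svn client installed
    runner(build_update_cmd(working_copy), check=True)
```

Scheduling `poll_once` once an hour reproduces the hot-backup behavior; the every-2-days full backup would still be a separate zipped dump sent through the pipe.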

If this isn't viable, you might want to invest in a couple of 2.5" USB drives and swap them offsite to a safety deposit box at the bank. This used to be my responsibility, but I lacked the discipline to do it consistently each week to maintain a real safety net. In the end it was just easier to live with uploading the data to an FTP site at my house.

MikeJ