For example: I have a website with various types of information. If it goes down, each client has a copy of the same website on a local web server (Apache or IIS, for instance) and uses that local version until the Internet version comes back up. In other words, they have no downtime.

The problem is that the Internet version will change over time, while the client copies stay the same unless I touch each client's machine to apply the updates. I don't want to do that.

Is there a good way to keep my clients up to date, so that when I make a change on the server each client gets a copy it can run locally if need be?

Thank you.

EDIT: Do you think using SVN, with the clients running an update on a schedule, would work?
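
Something like this on each client would be the idea, as a rough sketch (the working-copy path is just a placeholder):

    # Hypothetical client-side updater: run from a scheduled task
    # (cron on Linux, Task Scheduler on Windows).
    import subprocess

    WORKING_COPY = "/var/www/mirror"  # assumed path of the local checkout

    # 'svn update' brings the local copy up to date with the repository.
    subprocess.run(["svn", "update", WORKING_COPY], check=True)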

EDIT: The clients will never submit anything back. This is just so I don't have to update each client by hand by going to the machine. These are web pages that run only in case the main server is down.

A: 

There are tools like rsync that you can run periodically to sync the changes.
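
For instance, a minimal sketch of a scheduled pull on a client, assuming the server exposes the site over SSH (host and paths are placeholders):

    # Hypothetical rsync pull: mirror the server's document root into
    # the client's local web root.
    import subprocess

    SOURCE = "user@example.com:/var/www/site/"  # placeholder remote path
    DEST = "/var/www/local-mirror/"             # placeholder local path

    # -a preserves permissions and timestamps, -z compresses in transit,
    # --delete removes local files that no longer exist on the server.
    subprocess.run(["rsync", "-az", "--delete", SOURCE, DEST], check=True)

Run it from cron (or Task Scheduler) as often as you want the mirrors refreshed.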

Teja Kantamneni
+1  A: 

Why not use something like HTTrack to make local copies of your actual Internet site on each machine, rather than trying to do a separate deployment? That way you'll automatically stay in sync.

This has the advantage that if, at some point, part of your website is generated dynamically from a database, the user will still get an up-to-date static copy of the resulting pages.
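
A rough sketch of how each client could refresh its mirror from HTTrack's command line (URL and output directory are placeholders):

    # Hypothetical mirror refresh: crawl the live site into a local
    # folder, updating the existing copy instead of re-downloading it.
    import subprocess

    SITE = "http://www.example.com/"      # placeholder live-site URL
    MIRROR_DIR = "/var/www/local-mirror"  # placeholder output directory

    # '-O' sets the output path; '--update' refreshes an existing mirror
    # without prompting.
    subprocess.run(["httrack", SITE, "-O", MIRROR_DIR, "--update"], check=True)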

JacobM
I haven't tried it, but it is a good answer (thanks to all).
johnny
+2  A: 

I would go for Git over SVN because of its distributed nature: it gives you multiple copies of the code. Use it along with the auto-commit solution from this answer: http://stackoverflow.com/questions/420143/making-git-auto-commit/420172#420172
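
The client side of that setup could be as simple as the sketch below, assuming each client has already cloned the server's repository (the clone path is a placeholder; the auto-commit half is covered by the linked answer):

    # Hypothetical scheduled pull: fetch the latest site content from
    # the server's repository into the local clone.
    import subprocess

    CLONE_DIR = "/var/www/local-mirror"  # placeholder local clone path

    # 'git pull' fetches new commits and updates the working tree so
    # the local web server serves the latest pages.
    subprocess.run(["git", "pull"], cwd=CLONE_DIR, check=True)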

AJ