I have a directory of Python programs, classes, and packages that I currently distribute to 5 servers. It seems I'll continually be adding more servers, and right now I'm just doing a basic rsync from my local box to each server.

What would a better approach be for distributing code across n servers?

Thanks

+1  A: 

First, make sure to keep all code under revision control (if you're not already doing that), so that you can check out new versions of the code from a repository instead of having to copy it to the servers from your workstation.

With revision control in place you can use a tool such as Capistrano to automatically check out the code on each server without having to log in to each machine and do a manual checkout.

With such a setup, deploying a new version to all servers can be as simple as running

$ cap deploy

from your local machine.

Pär Wieslander
+3  A: 

I use Mercurial with Fabric to deploy all the source code. Fabric is written in Python, so it'll be easy for you to get started. Updating the production service is as simple as `fab production deploy`, which ends up doing something like this (a minimal sketch follows the list):

  1. Shut down all the services and put up an "Upgrade in Progress" page.
  2. Update the source code directory.
  3. Run all migrations.
  4. Start up all services.
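
For reference, here is a minimal sketch of what such a fabfile might look like. This is not the actual deployment script; the host names, paths, and service commands are placeholders, and it assumes Fabric's classic 1.x API, supervisord-managed services, and a Django-style migrate command:

    # fabfile.py -- a hedged sketch, not the actual deployment script.
    # Host names, paths, and commands below are placeholders.
    from fabric.api import env, run, cd

    def production():
        """Select the production servers as the deploy target."""
        env.hosts = ['web1.example.com', 'web2.example.com']
        env.code_dir = '/srv/myapp'

    def deploy():
        """Upgrade each host in env.hosts, one step at a time."""
        # (putting up the "Upgrade in Progress" page would go here)
        with cd(env.code_dir):
            run('supervisorctl stop all')     # 1. shut down the services
            run('hg pull && hg update')       # 2. update the source directory
            run('python manage.py migrate')   # 3. run migrations (assumes Django)
            run('supervisorctl start all')    # 4. start the services back up

Running `fab production deploy` then executes `deploy` against every host listed in `env.hosts`, so adding a new server is just one more entry in that list.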

It's pretty awesome seeing this all happen automatically.

sdolan
@Sdolan: `fabric` - I've been looking for this my entire adult life.
Brian M. Hunt