views: 716
answers: 10

We're looking to automate our deployment of Web Applications, particularly when going from local development to a remote server.

Our current stack is LAMP remotely and MAMP locally, but I'm interested in what people are using for this task in general, regardless of their environment.

A: 

rsync is a great tool for this.

But the answer depends on your environment. What do you use for source control? What do you use for a build system? Etc.

Deployment to a web server is ultimately nothing more than a "cp" command for whichever files changed. You need to build a process that tracks the files that change, pulls those files from source control, and then pushes those changes. When you are dealing with PHP files, how do you know which files to push? That's the real problem. Solve that, and you'll be fine. The tool that copies the files and "deploys" them is the easy part.
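
For the rsync route, pushing only the files that changed is largely what the tool already does. A minimal sketch, driving rsync from Python's subprocess; the host and paths are hypothetical:

    import subprocess

    # Hypothetical paths and host, for illustration only.
    SRC = "/Users/me/Sites/myapp/"               # trailing slash: sync the contents
    DEST = "deploy@example.com:/var/www/myapp/"

    # rsync's default quick-check compares modification time and size, so
    # only changed files go over the wire; --delete removes remote files
    # that no longer exist locally.
    subprocess.run(
        ["rsync", "-avz", "--delete", "--exclude", ".svn", SRC, DEST],
        check=True,
    )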

Keith Elder
A: 

I still like FTP.

Optimal Solutions
+1  A: 

We use "svn export" when it needs to go live. Keeps our code under revision control, and lets us actively develop it on test boxes or our local computer.

J.J.
A: 

Mercurial

Aaron Maenpaa
A: 

I guess I should clarify: I'm not just talking about moving files around; I also meant other tasks such as:

  • Setting up the database schema

  • Managing configurations

  • Miscellaneous tasks required for deployment (creating log files, etc.)

Mathew Byrne
+2  A: 

When and where possible, I prefer an automated deployment, such as with Ant; even FTP deployment can be handled fairly easily. Automating the deployment, much like an automated build, takes the guesswork and error out of the process, and by definition provides at least the bare minimum documentation (i.e. the build script) necessary for a new programmer to understand the process.
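
Ant would normally drive this from a build file; as an illustration of the same idea in script form, here is a small FTP upload using Python's standard-library ftplib. The host, credentials, and paths are hypothetical, and a real script would want better error handling:

    import os
    import posixpath
    from ftplib import FTP, error_perm

    # Hypothetical credentials and paths, for illustration only.
    HOST, USER, PASSWORD = "ftp.example.com", "deploy", "secret"
    LOCAL_ROOT, REMOTE_ROOT = "build", "/htdocs"

    ftp = FTP(HOST)
    ftp.login(USER, PASSWORD)
    for dirpath, _dirnames, filenames in os.walk(LOCAL_ROOT):
        rel = os.path.relpath(dirpath, LOCAL_ROOT).replace(os.sep, "/")
        remote_dir = posixpath.normpath(posixpath.join(REMOTE_ROOT, rel))
        try:
            ftp.mkd(remote_dir)       # create the remote directory if missing
        except error_perm:
            pass                      # most likely it already exists
        for name in filenames:
            with open(os.path.join(dirpath, name), "rb") as f:
                ftp.storbinary("STOR " + posixpath.join(remote_dir, name), f)
    ftp.quit()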

Joe Skora
A: 

I'm a .NET guy, so for us it's CruiseControl + NAnt.

I've heard great things about JetBrains TeamCity, though.

JoshReedSchramm
+1  A: 

I haven't tried it yet but I'm looking at using Fabric in future:

Fabric is a simple pythonic remote deployment tool.

It is designed to upload files to, and run shell commands on, a number of servers in parallel or serially. These commands are grouped in tasks (regular python functions) and specified in a 'fabfile'.

It is a bit like a dumbed down Capistrano, except it's in Python, doesn't expect you to be deploying Rails applications, and the 'put' command works.

Unlike Capistrano, Fabric wants to stay small, light, easy to change and not bound to any specific framework.
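
To give a flavour of it, a minimal fabfile sketch, assuming the Fabric 1.x API; the host names and paths are hypothetical:

    # fabfile.py -- run with "fab deploy"
    from fabric.api import env, put, run

    # Hypothetical hosts; the task runs against each one in turn.
    env.hosts = ["web1.example.com", "web2.example.com"]

    def deploy():
        """Upload the release tarball and unpack it on every host."""
        put("dist/myapp.tar.gz", "/tmp/myapp.tar.gz")
        run("tar -xzf /tmp/myapp.tar.gz -C /var/www/myapp")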

Swaroop C H
More detail on Fabric: http://stackoverflow.com/questions/1233655/what-is-the-simplest-way-to-ssh-using-python
hughdbrown
+2  A: 

One of the things used at a previous company was, believe it or not, RPM files. When we built our software, all the various parts of it would be packaged into RPM files, which were then deployed to the server.

  1. Master servers in a cluster had a list of all servers and their roles, which would be used to determine what packages each server needed.
  2. The deploy phase would check versions on each server and determine which servers needed upgrades (a version-check sketch follows this list). Each server would get a copy of any new packages it needed.
  3. Each server would have its packages installed by the deploy script, which would manage pre-installation and post-installation checks and tasks.
  4. The deploy script would trigger a separate process, the configuration management system, to read the configuration templates, generate configuration files for any services a server needed (based on its list of roles), and farm those out to the servers.
  5. The deploy system would generate a list of actions that needed to be taken (services to be restarted) for each system, and present those to the operator managing the update. The operator would then either perform the restarts (if the update was occurring during the client's scheduled maintenance window, or we had a work-order for mid-day service restarts), or create a ticket for the night staff with a list of tasks to be done.
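
A rough sketch of the version check in step 2, shelling out to the rpm CLI from Python; the package name, version, and path are hypothetical, and the plain string comparison is a simplification of real RPM version ordering:

    import subprocess

    def installed_version(package):
        """Return the installed VERSION-RELEASE of a package, or None."""
        result = subprocess.run(
            ["rpm", "-q", "--qf", "%{VERSION}-%{RELEASE}", package],
            capture_output=True, text=True,
        )
        return result.stdout if result.returncode == 0 else None

    def needs_upgrade(package, candidate):
        """True if the candidate build differs from what is installed."""
        current = installed_version(package)
        return current is None or current != candidate

    # Hypothetical usage: upgrade this server only if it is behind.
    # (rpm -U typically requires root.)
    if needs_upgrade("myapp-frontend", "1.4-2"):
        subprocess.run(
            ["rpm", "-U", "/packages/myapp-frontend-1.4-2.noarch.rpm"],
            check=True,
        )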

RPM is a horrific hack, but as our clients were all running Red Hat Linux (by our requirement), it made perfect sense. If I had a choice, I'd go with a system like Debian or Ubuntu, and set up a repository that the systems could all pull from. Still, it worked well for hundreds of clients, with thousands of servers total. Pretty neat.

Dan Udey
+1  A: 

Capistrano works very well for this kind of thing. It came out of the Ruby on Rails ecosystem, and was initially very strongly tied to deploying Rails apps. Since a lot of people noticed that it was handy for remote server control, it's become a bit more general-purpose.

With no extra setup, Capistrano:

  • Uses SSH to connect to the application servers
  • Checks out the latest source code from Subversion into a new, dated folder
  • Activates the new release by updating a symbolic link or two
  • Reloads the application server

And all this with rollback functionality.
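
Capistrano itself is Ruby; purely to illustrate the dated-release-plus-symlink mechanism described above, here is a sketch in Python. The repository URL, paths, and reload command are all hypothetical:

    import os
    import subprocess
    import time

    # Hypothetical layout mirroring the releases/current convention.
    DEPLOY_ROOT = "/var/www/myapp"
    REPO = "http://svn.example.com/myapp/trunk"

    # 1. Export the latest source into a new, dated folder.
    os.makedirs(os.path.join(DEPLOY_ROOT, "releases"), exist_ok=True)
    release_dir = os.path.join(DEPLOY_ROOT, "releases",
                               time.strftime("%Y%m%d%H%M%S"))
    subprocess.run(["svn", "export", REPO, release_dir], check=True)

    # 2. Point a temporary symlink at the new release, then rename it over
    #    "current" -- rename(2) is atomic, so visitors never see a half-switch.
    tmp_link = os.path.join(DEPLOY_ROOT, "current.tmp")
    if os.path.lexists(tmp_link):
        os.remove(tmp_link)
    os.symlink(release_dir, tmp_link)
    os.rename(tmp_link, os.path.join(DEPLOY_ROOT, "current"))

    # 3. Reload the application server (command is environment-specific).
    subprocess.run(["sudo", "apachectl", "graceful"], check=True)

Rollback is then just repeating step 2 with the symlink pointed at an older release directory.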

Another good option would be to use your operating system's packaging system (RPM, deb/apt, etc). This tends to require a good level of familiarity with your operating system and its policies, but fits in great with other tools if you know what you're doing.

RJHunter