Having a "one-step build" that takes your changes from the development environment to the live server is very nice to have and often advocated.

I came on board with a small team running a LAMP stack and using SVN for version control, currently deployed on a single production server (there's another server for development, and soon a separate MySQL server). I'm just now putting in place a lot of the organizational practices that were missing before I arrived.

I am curious to see

  1. how people are currently doing this (one-step build)
  2. how I can best implement it for my situation (small team, LAMP environment with SVN)

Some particular challenges I'm interested in are handling database (schema) changes, and what kind of "packages", if any, people use to keep things organized (e.g. RPMs, PEAR, etc.).

+5  A: 

We used ant with Hudson. Worked like a charm.

Hudson works with other build systems too, not just Java projects. It lets you set up multiple build targets and will run them automatically or manually. It also forces you to implement a way to run your build from a single command.

It doesn't, however, solve the problem of communicating that the server will be unavailable for the time it takes to run the build against the deployed server.

For our schema updates and changes, we set up our ant script to do three things:

  1. Run the schema update only if there's a difference in SVN.
  2. Check in a schema dump after the schema changes were applied.
  3. If there was no update to the schema, simply use the dump to load the database.

It took a few tries to get right, but suddenly we had solved the problem of multiple developers being on different schemas. Importing the dump to update your development schema was so easy that you could do it daily.
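The dump-based sync above can be sketched in shell. The assumptions here are mine, not the answer's: the dump is checked in at db/schema.sql, and the last-loaded state is cached in a .schema_rev file. A file checksum stands in for the real `svnversion` call and the actual `mysql` reload is left commented out, so the sketch runs anywhere:

```shell
#!/bin/sh
set -e

# Demo fixture standing in for the checked-in dump from the working copy.
mkdir -p db
echo "CREATE TABLE users (id INT);" > db/schema.sql

# In the real script this would be the SVN revision of the dump;
# a checksum is used here so the sketch has no SVN dependency.
current=$(cksum db/schema.sql | awk '{print $1}')
last=$(cat .schema_rev 2>/dev/null || echo none)

if [ "$current" != "$last" ]; then
    echo "schema changed: reloading from dump"
    # mysql -u dev devdb < db/schema.sql   # real reload step
    echo "$current" > .schema_rev
else
    echo "schema unchanged: nothing to do"
fi
```

Because the cached value is only rewritten after a reload, rerunning the script when nothing changed is a no-op.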

Kieveli
Hudson is awesome.
stimms
+2  A: 

We do. We use a product called Anthill Pro to do all of our builds and deployments. It has a workflow process, which we set up to check out the files, do the builds, run the unit tests, and then deploy the code to the servers. You can use it to deploy just about anything, since the process can run command-line programs, etc.

Kevin
+1  A: 

"make" on UNIX (and Windows) is your friend. It has a learning curve, but it's worth it. You can have make update the source, compile, run tests, etc.

Kurt
+1  A: 

I don't think there's a simple cookbook answer for this, because it depends very much on your environment. Whatever you come up with, I highly recommend a script-based approach, with the deployment scripts themselves kept in source control. Those scripts also allow better integration with build solutions (see below).

The simplest such script to run in your production environment would simply be the command to get latest (or get a specific version) from source control.

The next challenge is database deployment. The solution I have come to like most for small to medium-sized projects is to maintain a schema version table in each database and keep all DDL and data update scripts in source control (including, in compressed archives, the data sources they use). The scripts are numbered consecutively (000001 ..., 000002 ..., etc.). The deployment script first backs up the existing database, then reads the last-run script number from the schema version table, and finally runs, in order, any newer database scripts found in source control, updating the schema version table accordingly.

This approach lets me rebuild a database from scratch pretty quickly.
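A minimal sketch of that numbered-script scheme, with assumptions of mine layered on: migration files live in migrations/ named `000001_*.sql` and so on, and a .version file stands in for the schema version table so the sketch runs without a database. The real deployment script would back up the database first, read the version from the table, and pipe each new file through `mysql`:

```shell
#!/bin/sh
set -e

# Demo fixtures standing in for scripts kept in source control.
mkdir -p migrations
echo "CREATE TABLE users (id INT);"     > migrations/000001_users.sql
echo "ALTER TABLE users ADD name TEXT;" > migrations/000002_user_name.sql

# Last script applied (the schema version table, in the real setup).
last=$(cat .version 2>/dev/null || echo 000000)

# Apply, in order, any scripts newer than the recorded version.
for f in migrations/*.sql; do
    num=$(basename "$f" | cut -c1-6)
    if [ "$num" -gt "$last" ]; then
        echo "applying $f"
        # mysql -u deploy appdb < "$f"   # real apply step
        echo "$num" > .version
    fi
done
echo "schema at version $(cat .version)"
```

Since the loop only considers scripts above the recorded version, running it again is a no-op, which is what makes frequent runs on every developer machine safe.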

The two approaches taken together make it possible to quickly deploy your code base to several different staging machines, your QA environment, beta, etc.


For slightly more complex scenarios, you should run a continuous integration build server, as Kieveli et al. suggested, which essentially "rebuilds" your entire deployment regularly and therefore contains scripts to do exactly what you would run "manually" above.

Database deployment can also be made more sophisticated by creating a rollback script for each database script. You would then write a small controller app to handle those. There are several OSS solutions for this kind of thing, and one of them may fit your needs.

BUT, make sure you never auto-deploy your database to a production environment ;-)

Thomas Jung
+2  A: 

The best build tool for a PHP project is probably Phing, which is quite similar to Ant but written in PHP. It has all the pieces you'd need for something like this, such as grabbing things from your SVN repo.

Jani Hartikainen
A: 

Once you've got a one step build going, you can easily turn it into continuous builds.

We keep all of our completed builds, marked with the change number they were built from, on a central server. When something gets committed (we use Perforce, but this would work for SVN too), a cron job on one of our build boxes notices that there's a change more recent than the last build, fires off an HTTP request to download the source tree, and starts building (with GMake, mostly). Continuous builds in only a few easy steps :)
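That poll-and-build cron job could be sketched for SVN roughly as follows. The repository URL is hypothetical, the `svn` calls are commented out so the sketch runs without a server, and a fixed number stands in for HEAD:

```shell
#!/bin/sh
set -e

# head=$(svn info http://svn.example.com/project/trunk | awk '/^Revision:/ {print $2}')
head=42                                  # stand-in for HEAD in this demo
last=$(cat .last_built 2>/dev/null || echo 0)

# Build only when something newer than the last build has been committed.
if [ "$head" -gt "$last" ]; then
    echo "building r$head"
    # svn export -r "$head" http://svn.example.com/project/trunk build/src
    # make -C build/src
    echo "$head" > .last_built
else
    echo "r$last already built"
fi
```

Run something like this from cron every few minutes and you get the same effect without a dedicated CI server.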

After that, it's a short step to having all of your test automation run automatically. Fully built and tested (possibly deployable!) code after each commit.

Chris Simmons
A: 

For a scripting language, the usual advice (use an Ant variant or a CruiseControl variant) doesn't carry as much weight, because you don't need to compile anything.

Let's stick to the database. The three important things when it comes to continuous integration are automate, automate, and automate. That means everything, from building an empty database, to importing external data, to upgrading to a new version, has to be scripted and ready to run. A good example is something like MediaWiki, which lets you configure and install using PHP itself. I would recommend having a build server deploy a fresh database during the day, run the unit tests, and send out emails if any fail.

eed3si9n
A: 

The way I think of it is that you want one script to pull everything together: get all the files/resources from source control, then perform all the steps to create a final 'product'.

Off the top of my head, those steps could include: get latest, compile, gather any other files needed to complete the product, create an installer (if needed), run unit tests, publish the output on a server (whatever that might mean for a particular project), inform users that a new version has been created (or tell them why one hasn't), and whatever else you might need to do.

In the past I usually started with some sort of batch file, then graduated to some sort of customized builder exe. But maintaining that always became a pain. Eventually I moved on to 3rd-party applications... now I just use one of the two products below.

http://www.kinook.com/VBP/

http://www.finalbuilder.com/

the empirical programmer