Every time a new developer joins the team, or a developer's computer changes, they need to do a lot of work to set up the local development environment and get the current project working. As a Scrum team we are trying to automate everything, including deployment and tests, so what I am asking is: is there a tool or a practice to automate local development environment setup?

For example, to set up my environment I first had to install Eclipse, then SVN, Apache, Tomcat, MySQL, and PHP. After these I populated the DB, made minor changes in various configuration files, etc... Is there a way to reduce this labor to one click?

A: 

If you use machines in a standard configuration, you can image the disk with a fresh, perfectly configured install -- that's a very popular approach in many corporations (and not just for developers, either). If you need separately configured OSes, you can tar-bz2 all the added and changed files once a configured OS has been turned into your desired setup, and just untar it as root to recreate that environment from scratch.
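A sketch of the tar-bz2 idea, using throwaway paths so it can be run harmlessly; on a real box you would archive the actual configuration directories (e.g. under /etc and /opt) instead:

```shell
# Stand-in for a configured machine: one changed config file.
mkdir -p /tmp/devbox/etc/apache2
echo "Listen 8080" > /tmp/devbox/etc/apache2/ports.conf

# Pack up everything that was added/changed after configuration:
tar -cjf /tmp/devbox-config.tar.bz2 -C /tmp/devbox etc

# On the fresh machine, unpack relative to / (here: a scratch dir):
mkdir -p /tmp/freshbox
tar -xjf /tmp/devbox-config.tar.bz2 -C /tmp/freshbox
cat /tmp/freshbox/etc/apache2/ports.conf   # restored intact
```

The `-C` flag keeps the archive paths relative, so the same tarball restores cleanly whether you extract to `/` or to a staging directory first.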

Alex Martelli
+1  A: 

There's always the option of using virtual machines (see e.g. VMWare Player). Create one environment and copy it over for each new employee with minimal configuration needed.

tehvan
+6  A: 

I like to use Virtual PC or VMware to virtualize the development environment. This provides a standard "dev environment" that can be shared among developers. You don't have to worry about software a user adds to their system conflicting with your development environment. It also gives me a way to work on two projects whose development environments can't both be on one system (because they use two different versions of a core technology).

jeremyasnyder
Does running in a VirtualPC/VMware environment affect performance much? Do you use Visual Studio 2008?
Joel Gauvreau
I know of at least one big software dev shop that does ALL of its development exclusively in VMs (VMware). Things are much better now that CPUs have hardware virtualisation support...
tomfanning
You'd have to be careful of computer name conflicts. Windows isn't too happy when two machines with the same name appear on a single network. I also prefer not to have my dev environment in a VM. The overhead can really drag down my performance.
Kieveli
We use a completely virtualised dev environment. We use VMServer or VirtualBox to host the Vista VMs. The VM image contains our whole development stack but is not attached to the corp domain. When a user needs a new machine, they copy it locally, run NewSID, and then join the domain with a unique machine name. On modern hardware, there is no noticeable performance degradation. We use VS2008 with ~20 C++/C# projects in a solution and it all works fine.
Colin Desmond
One great advantage that I see with virtualization is the ability to save the current state as is, so that when you come back to it later you get back exactly where you were. Not exactly related to setting up a new dev machine but a nice side effect of using a VM.
Joel Gauvreau
+8  A: 

One important point is to set up your projects in source control such that you can immediately build, deploy and run after checkout.

That means you should also check in helper infrastructure, such as Makefiles, Ant build files, etc., as well as settings for the tools, such as IDE project files.

That should take care of the setup hassle for individual projects.

For the basic machine setup, you could use a standard image. Another option is to use your platform's tools to automate installation. Under Linux, you could create a meta-package that depends on all the packages you need. Under Windows, a similar thing should be possible using MSI or the like.
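On a Debian-style system the meta-package idea can be approximated even more simply with a checked-in package list. The list below is purely illustrative, and the install command is echoed rather than actually run:

```shell
# Hypothetical package list, kept in source control with the project:
printf '%s\n' subversion apache2 tomcat9 mysql-server php > /tmp/packages.txt

# One command then sets up a new box (echoed here instead of executed):
xargs -a /tmp/packages.txt echo sudo apt-get install -y
# -> sudo apt-get install -y subversion apache2 tomcat9 mysql-server php
```

A proper meta-package has the advantage that the dependency list is versioned and upgraded by the package manager itself, but a plain list in source control already gets you most of the one-command setup.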

sleske
+3  A: 

Use Puppet to configure both your development and production environments. Using a top-notch automation system is the only way to scale your ops.
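A minimal sketch of what such a Puppet manifest might look like; the node name and package names here are invented for illustration, not taken from the answer:

```puppet
# Illustrative node definition: install the stack and keep Apache running.
node 'devbox' {
  package { ['subversion', 'apache2', 'mysql-server', 'php5']:
    ensure => installed,
  }

  service { 'apache2':
    ensure  => running,
    require => Package['apache2'],
  }
}
```

Puppet then converges every box matching the node definition to this state and keeps it there, which is what makes the same manifests usable for both dev and production machines.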

David Schmitt
+12  A: 

There are several options, and sometimes a combination of these is useful:

  • automated installation
  • disk imaging
  • virtualization
  • source code control

Details on the various options:

  1. Automated Installation. Tools for automating installation and configuration of a workstation's various services, tools, and config files:

    • Puppet has a learning curve but is powerful. You define classes of machines (development box, web server, etc.) and it then does what is necessary to install, configure, and keep the box in the proper state. You asked for one-click, but Puppet by default is zero-click, as it checks your machine periodically to make sure it is still configured as desired. It will detect when a file or mode has been changed, and fix the problem. I currently use this to maintain a handful of RedHat Linux boxes, though it's capable of handling thousands. (Does not support Windows as of 2009-05-08).
    • Cfengine is another one. I've seen this used successfully at a shop with 70 engineers using RedHat Linux. Its limitations were part of the reason for Puppet.
    • SmartFrog is another tool for configuring hosts. It does support Windows.
    • Shell scripts. RightScale has examples of how to configure an Amazon EC2 image using shell scripts.
    • Install packages. On a Unix box it's possible to do this entirely with packages, and on Windows msi may be an option. For example, RubyWorks provides you with a full Ruby on Rails stack, all by installing one package that in turn installs other packages via dependencies.
  2. Disk Images. There are also disk imaging tools for storing an image of a configured host such that it can be restored to another host. As with virtualization, this is especially nice for test boxes, since it's easy to restore things to a clean slate. Keeping things continuously up-to-date is still an issue--is it worth making new images just to propagate a configuration file change?

  3. Virtualization is another option, for example making copies of a Xen, VirtualPC, or VMWare image to create new hosts. This is especially useful with test boxes, as no matter what mess a test creates, you can easily restore to a clean, known state. As with disk imaging tools, keeping hosts up-to-date requires more manual steps and vigilance than if an automated install/config tool is used.

  4. Source Code Control. Once you've got the necessary tools installed and configured, doing builds should be a matter of checking out what's needed from a source code repository and building it.

Currently I use a combination of the above to automate the process as follows:

  • Start with a barebones OS install on a VMWare guest
  • Run a shell script to install Puppet and retrieve its configs from source code control
  • Puppet to install tools/components/configs
  • Check out files from source code control to build and deploy our web application
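The four steps above could be captured in a single bootstrap script along these lines. The server names, paths, and package names are placeholders (not from the original answer), so the script is only written out and syntax-checked here rather than executed:

```shell
# Write the hypothetical bootstrap; `sh -n` at the end stands in for
# actually running it, since the URLs below are placeholders.
cat > /tmp/bootstrap.sh <<'EOF'
#!/bin/sh
set -e
# step 2: install Puppet and fetch its configs from source control
sudo apt-get install -y puppet subversion
svn checkout http://svn.example.com/puppet /etc/puppet
# step 3: let Puppet install tools/components/configs
sudo puppet apply /etc/puppet/manifests/site.pp
# step 4: check out the web application for building and deploying
svn checkout http://svn.example.com/app "$HOME/app"
EOF
chmod +x /tmp/bootstrap.sh
sh -n /tmp/bootstrap.sh && echo "bootstrap parses"
```

With something like this checked in, step 1 (the barebones VMware guest) is the only manual part left: boot the guest, fetch the one script, and run it.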
Pete TerMaat
In our office there are Windows, Linux, and Mac OS machines, so I will pick the virtualization option.
nimcap
We fly with virtualization and it is a good solution. When I started, it took me 3-4 days to get set up; now new devs are set up in the time it takes to copy a VPC image.
Burt
+1  A: 

At a prior place we had everything (and I mean EVERYTHING) in SCM (ClearCase, then SVN). When a new developer came in, they installed ClearCase or SVN and pulled down the repository. This also handles the case where you need to update a particular lib/tool, as you can just have the dev teams update their environments.

We used two repos for this, so code and tools/config lived in separate places.

Mike Reedell