views:

142

answers:

6

I have NUnit installed on my development machine in "C:\Program Files\NUnit 2.4.8\", but on my integration server (running CruiseControl.NET) it is installed in "D:\Program Files\NUnit 2.4.8\". My NAnt build file works correctly on my development machine because the task references the 'NUnit.Framework.dll' assembly via the path "C:\Program Files\NUnit 2.4.8\bin\NUnit.Framework.dll", but that same build file fails on the integration server because the path is different there. Do I have to install NUnit in the same location on my development machine as on my integration server? That solution seems too restrictive to me. Are there better ones? What is the general solution to this kind of problem?

+3  A: 

Typically I distribute NUnit and any other dependencies with my project, in some common location (for me that's a libs directory in the top level).

/MyApp
  /libs
    /NUnit
    /NAnt
    /etc...
  /src
    /etc...

I then just reference those libs from my application, and they're always in the same location relative to the project solution.
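In a NAnt build file that might look something like the following sketch (the task attributes are real NAnt `<csc>` syntax, but the file names and layout are hypothetical, matching the tree above):

```xml
<!-- Hypothetical NAnt fragment: reference the checked-in NUnit copy via a
     path relative to the build file, not an absolute install location. -->
<csc target="library" output="build\MyApp.Tests.dll">
  <sources>
    <include name="src\**\*.cs" />
  </sources>
  <references>
    <!-- libs\NUnit ships with the project, so this path is the same on
         every machine that checks out the source tree. -->
    <include name="libs\NUnit\bin\nunit.framework.dll" />
  </references>
</csc>
```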

James Gregory
This is precisely how I do it, except that my "libs" directory is called "lib" (in keeping with the old Unix convention). This has the additional benefit of easing developer transition: any developer can pick up your project and start working on it without having to install a ton of dependencies.
Chris
Actually, I just looked and mine's called "lib" too; Unix must be where I picked it up from. Portability is an excellent reason to do this. I can check out a project and build it right off the bat without any additional installs.
James Gregory
A: 

I'd use one of two approaches:

1) Use two different staging scripts (dev build / integration build) with different paths.

2) Put all the needed executables in a folder on your PATH and call them directly.
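For approach 2, a NAnt call can then omit the absolute path entirely (a hypothetical sketch; the test assembly name is made up, and it assumes nunit-console.exe's folder is on the PATH of every build machine):

```xml
<!-- Hypothetical: because the runner is resolved via the PATH, the same
     build file works no matter where NUnit is installed on each machine. -->
<exec program="nunit-console.exe">
  <arg value="build\MyApp.Tests.dll" />
</exec>
```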

+1  A: 

In general, dependencies on absolute paths should be avoided. As far as CI goes, you should be able to build and run your solution on a clean machine completely from scratch, using only resources found in your source code control, via automated scripts.

Jim Anderson
+1  A: 

The "ultimate" solution is to keep the entire tool chain in your source control, and to store any libraries/binaries you build in source control as well. Set up correctly, this ensures you can rebuild any release, from any point in time, exactly as it was shipped; furthermore, you rarely need to, because every binary you've ever generated is already source-controlled.

However, getting to that point is some serious work.

RB
A: 

I'd agree that absolute paths are evil. If you can't get around them, you can at least define an NUNIT_HOME property within your script that defaults to C:... and have your CI server pass in the NUNIT_HOME property at the command line when it calls your script.

Or you can have your script require an NUNIT_HOME environment variable to be set for NUnit to work. Then, instead of requiring that the machine it runs on has NUnit in some exact location, your script only requires that NUnit be present and its location available in the environment variable.

Either approach lets you change the version of NUnit you are using without modifying the build script. Is that what you want?
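The environment-variable version of this could look roughly like the sketch below. The `<fail>` task and the `environment::*` functions are standard NAnt; the property name and error message are just examples:

```xml
<!-- Hypothetical: take the NUnit location from an NUNIT_HOME environment
     variable and fail early with a clear message if it is not set. -->
<fail unless="${environment::variable-exists('NUNIT_HOME')}"
      message="Set the NUNIT_HOME environment variable to your NUnit install directory." />
<property name="nunit.home"
          value="${environment::get-variable('NUNIT_HOME')}" />
```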

EricMinick
A: 

The idea of having all the tools in the tool chain under version control is a good one. But while you're on your way there, you can use a couple of different techniques to specify different paths per machine.

NAnt lets you define a <property> that you can override with -D:name=value on the command line. You could use this to set a default location for your development machines that you override in your CI system.
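For example (the property name is hypothetical; `overwrite="false"` means a value supplied on the command line wins over the default in the build file):

```xml
<!-- Hypothetical: default to the dev-machine path; the CI server overrides it,
     e.g.  nant -D:nunit.home="D:\Program Files\NUnit 2.4.8" -->
<property name="nunit.home"
          value="C:\Program Files\NUnit 2.4.8"
          overwrite="false" />
```

Anywhere the build file needs NUnit, it then refers to `${nunit.home}` instead of a hard-coded path.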

You can also read environment variables with the environment::get-variable function to change the location per machine.
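A minimal sketch of that technique, assuming each machine exports its own NUNIT_HOME (the `if` guard keeps the property unset when the variable is missing, so a build-file default elsewhere can still apply):

```xml
<!-- Hypothetical: each machine points nunit.home at its own install
     via an NUNIT_HOME environment variable. -->
<property name="nunit.home"
          value="${environment::get-variable('NUNIT_HOME')}"
          if="${environment::variable-exists('NUNIT_HOME')}" />
```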

Jeffrey Fredrick