Our MSBuild process creates a variety of zip packages for deployment (mostly web sites, but other things as well). We have a variety of recurring problems that keep sneaking back in: files included that shouldn't be, missing resources. This screams for automated validation. The criteria to test for are simple:

Validation of foosite package:

  • Resource files are present.
  • No test result files, obj files, or other build artifacts
  • And so forth.

Ideally, I could use NUnit or MSTest, which everyone is familiar with. MSBuild knows where the packages are. We have a lot of packages, and possibly concurrent builds on different branches. Ergo, the locations and names of the packages are not deterministic - so the tests don't know where the packages are.

What is the simplest way to feed MSBuild information to MSTest or NUnit? The answer to this question would be one possible answer; however, that question got architectural advice instead of an answer. I know this isn't a unit test, but the test framework is handy anyway. I could create an exe to validate the build - but why add a couple of hours to the project?

Or, do you have a better suggestion for automatically validating build packages? (MSIs, zips, whatever)?
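For illustration, the sort of mechanism I'm imagining is MSBuild dropping the package paths into a well-known file next to the test assembly before invoking the test runner, and a fixture picking that up. A rough sketch (the file name and layout are invented, not something we have today):

    // Rough sketch: the build would write one package path per line to
    // PackagePaths.txt (hypothetical name) in the test output directory
    // before invoking the NUnit/MSTest runner.
    using System;
    using System.IO;
    using NUnit.Framework;

    [TestFixture]
    public class PackageValidationTests
    {
        private string[] _packagePaths;

        [TestFixtureSetUp]
        public void LoadPackageList()
        {
            // The build drops this file next to the test assembly, so the
            // tests never need to guess branch- or build-specific folders.
            var listFile = Path.Combine(
                AppDomain.CurrentDomain.BaseDirectory, "PackagePaths.txt");
            Assert.IsTrue(File.Exists(listFile),
                "Build did not produce a package list at " + listFile);
            _packagePaths = File.ReadAllLines(listFile);
        }

        [Test]
        public void AllPackagesExist()
        {
            foreach (var path in _packagePaths)
            {
                Assert.IsTrue(File.Exists(path), "Missing package: " + path);
            }
        }
    }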

+1  A: 

What I've ended up doing is having a bunch of custom MSBuild tasks which spin up a virtual machine on Virtual Server, copy the MSI onto the machine, silently deploy it and then validate against it. I used PsExec to start the MSI install. It could then use the MSTest command-line runner to run your test bits.
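Roughly, the deploy-and-test step looks something like the sketch below (the machine name, credentials and paths are placeholders, not what I actually use):

    using System;
    using System.Diagnostics;

    // Rough sketch of the deploy-and-test step from a custom MSBuild task.
    // Machine name, credentials and paths are placeholders.
    static class DeployAndTestStep
    {
        public static void Run()
        {
            // The MSI has already been copied onto the VM at this point.
            // Silently install it via PsExec + msiexec.
            Exec("psexec.exe",
                @"\\test-vm -u testlab\builduser -p secret " +
                @"msiexec /i C:\drop\foosite.msi /qn");

            // Then run the validation tests with the MSTest command-line runner.
            Exec("mstest.exe",
                @"/testcontainer:C:\drop\FooSite.ValidationTests.dll");
        }

        static void Exec(string fileName, string arguments)
        {
            var info = new ProcessStartInfo(fileName, arguments)
            {
                UseShellExecute = false
            };
            using (var process = Process.Start(info))
            {
                process.WaitForExit();
                if (process.ExitCode != 0)
                    throw new InvalidOperationException(
                        fileName + " failed with exit code " + process.ExitCode);
            }
        }
    }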

This is probably overkill for you, but using a VM allows you to start clean and not be affected by any previous installs on your dev box.

blowdart
Good idea, will do that as well. However, I'm looking for earlier warning that the package is bad: I want the build to fail, and quickly. We deploy each build, but it's currently human QA that finds out it went wrong. In our case, what you describe would be done by our build verification tests (an excellent case to add). BVTs requiring a deploy of our complex environment are slow - they run in the daily build and require handholding. The main build needs to be fast and reliable (build, package, unit and other fast tests <= 1 hour). PS: I just discovered PsExec 2 months ago. It's really fun.
Precipitous
A: 

If you want a fast fail, like a unit test, then I suggest you create unit tests against your packages. Such a test would unzip the .ZIP packages, and run some asserts against the contents.
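As a rough sketch, such a test might look like the following. The package path and file patterns are just examples, and I'm assuming System.IO.Compression.ZipFile (.NET 4.5+, in the System.IO.Compression.FileSystem assembly); on older frameworks a library such as DotNetZip or SharpZipLib would play the same role:

    using System.IO.Compression;
    using System.Linq;
    using NUnit.Framework;

    [TestFixture]
    public class FooSitePackageTests
    {
        // Hypothetical path; in practice the build would supply it.
        private const string PackagePath = @"C:\drop\foosite.zip";

        [Test]
        public void ResourceFilesArePresent()
        {
            using (var zip = ZipFile.OpenRead(PackagePath))
            {
                Assert.IsTrue(
                    zip.Entries.Any(e => e.FullName.EndsWith(".resx")),
                    "No resource files found in " + PackagePath);
            }
        }

        [Test]
        public void NoBuildArtifactsAreIncluded()
        {
            using (var zip = ZipFile.OpenRead(PackagePath))
            {
                // Normalize separators so the check works regardless of how
                // the packaging tool wrote the entry names.
                var offenders = zip.Entries
                    .Where(e =>
                        e.FullName.Replace('\\', '/').Contains("/obj/")
                        || e.FullName.EndsWith(".trx")
                        || e.FullName.EndsWith(".pdb"))
                    .Select(e => e.FullName)
                    .ToList();
                Assert.IsTrue(offenders.Count == 0,
                    "Build artifacts found in package: " + string.Join(", ", offenders));
            }
        }
    }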

You could even use some TDD techniques against the packages. For instance, if you have a deployment fail because a particular file is missing, then write a unit test that fails because the file is missing; change the build so that the file is present; then make sure the unit test succeeds.

But in general, deployment issues are broader than that, and I echo the suggestion from blowdart. Deploy into one or more virtual machines, then run automated tests over the deployed environments. These tests would not only check simple things like whether an error was returned during the installation itself; they would also check whether the IIS virtual directories were set up correctly, with the correct properties and contents, and whether the web site basically runs.
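A minimal post-deployment smoke test might look something like this (the URL is a placeholder for wherever the site ends up on the test VM):

    using System.Net;
    using NUnit.Framework;

    [TestFixture]
    public class DeployedSiteSmokeTests
    {
        // Placeholder URL for the deployed site on the test VM.
        private const string SiteUrl = "http://test-vm/foosite/";

        [Test]
        public void HomePageReturnsOk()
        {
            var request = (HttpWebRequest)WebRequest.Create(SiteUrl);

            // GetResponse throws a WebException on 4xx/5xx responses,
            // which fails the test; otherwise assert the status explicitly.
            using (var response = (HttpWebResponse)request.GetResponse())
            {
                Assert.AreEqual(HttpStatusCode.OK, response.StatusCode);
            }
        }
    }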

I'd use several different virtual machines to test different deployment scenarios: one for a clean deploy, one for an upgrade from the previous version, etc. It's possible that the same or similar IVT tests could be run for each environment.

Even if you can't do this all at once, the thought process involved in this exercise should lead to a more formal definition of what your deployment environment really is. This will be helpful when you get a chance to embody that formal definition in actual tests.

John Saunders