We're using Hudson on Windows to build a .NET solution and run the unit tests (NUnit). Hudson's role is just to start batch files that do the actual work.

I am now trying to set up a new test job that will run on a build slave and take very long. The test should use the binaries produced by the upstream build.

I have searched the Hudson documentation but I cannot find how to pass upstream build artifacts to downstream slaves. How do I do this?

+1  A: 

Depending on the source control management system you are using, you could cheat and use that. I am not a fan of checking in binaries, especially large ones, but I have in the past taken binaries or installers generated by a CI build, automated checking them into a separate SVN repository, and had the slave machine pull from that repository when directed by the master and perform whatever tests needed to be executed.

Rob Goodwin
That would work, but an SVN repository that keeps binaries nobody needs would truly be a waste of resources. I thought Hudson came with this ability out of the box, but I'm willing to fiddle with an extension if that's needed.
sbi
I would agree - if they are not needed, then it would be a waste of resources. We kept those binaries so that if for some reason a test failed, we had the installation files to send to test to recreate the problem. Granted, you should be able to rebuild from the repository revision that the build came from, but we found this easier. The automated build could record in the check-in comment the repository revision the binaries were generated from.
Rob Goodwin
+5  A: 

Use the Copy Artifact plugin in your downstream build.

Just specify the name of the upstream job and the paths to copy into your downstream workspace.
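In essence, the plugin copies files matching a pattern out of the upstream job's artifact archive into the downstream workspace, preserving relative paths. As a rough sketch of that behavior (this is an illustration, not the plugin's actual code; `copy_artifacts` and the directory names are made up):

```python
import shutil
from pathlib import Path

def copy_artifacts(upstream_archive, workspace, pattern="**/*"):
    """Copy files matching a glob from the upstream job's artifact
    archive into the downstream workspace, keeping relative paths
    (roughly what the Copy Artifact plugin does for you)."""
    src = Path(upstream_archive)
    dst = Path(workspace)
    copied = []
    for f in src.glob(pattern):
        if f.is_file():
            rel = f.relative_to(src)
            target = dst / rel
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)
            copied.append(str(rel))
    return copied
```

In the actual plugin you enter the upstream job name and the pattern in the build step's configuration form; the copy itself happens on the slave at build time.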

Christopher
This indeed seems to be what I need. I have just installed it, and in the downstream project I can now add a build step to copy artifacts. However, I'm not sure how to use it. Let's say I need a folder from the upstream project's workspace. What do I need to enter? "path/to/folder" didn't seem to work.
sbi
I don't believe you can just specify a directory; the plugin works on files. But the pattern `path/to/folder/**` should do the trick (i.e. copy all files found in this directory and its subdirectories).
Christopher
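To illustrate the `**` semantics, here is a small demo using Python's `pathlib` globbing as a stand-in for the plugin's Ant-style matcher (the tree and file names are invented; `folder/**/*` is Python's spelling of Ant's `folder/**`):

```python
import tempfile
from pathlib import Path

# Build a tiny tree: folder/a.dll and folder/sub/b.dll
root = Path(tempfile.mkdtemp())
(root / "folder" / "sub").mkdir(parents=True)
(root / "folder" / "a.dll").write_text("")
(root / "folder" / "sub" / "b.dll").write_text("")

# "folder" alone names a directory, not its files;
# "folder/**/*" matches every file beneath it, recursively.
matches = sorted(p.relative_to(root).as_posix()
                 for p in root.glob("folder/**/*") if p.is_file())
print(matches)  # → ['folder/a.dll', 'folder/sub/b.dll']
```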
@Christopher: Ok, so I now have the upstream build script zip up that folder and Hudson collect the zip as an artifact. The downstream test is configured to copy that artifact. However, I always get a "Copied 0 artifacts from &lt;project&gt;" message. Any idea what could be going wrong?
sbi
Hmm. Are you sure you're capturing the zip file as an artifact in the upstream project? Otherwise, I'd just ensure that the path or pattern you're specifying in the plugin exactly matches the path to the zip file. You can use `http://hudson/job/project-name/lastSuccessfulBuild/artifact/` to check the exact path where the zip is being saved.
Christopher
@Christopher: Ok, with a bit more fiddling I finally got it working. Thank you very much for your help!
sbi
A: 

It might be a little bit of overkill if you only depend on the binaries, but there is also the Clone Workspace SCM Plugin, which archives your whole workspace so that the next job can check it out as if it came from an SCM. It is pretty new.

We currently use a different setup. We have an artifact repository where we push our binaries, and the second job pulls the binaries from that repository. Physically it is just a standard Windows share, where we create a subfolder named after the job's build number. If you also use the Parameterized Trigger Plugin, you can pass the build number from job 1 to job 2 and run your test on the right binaries. The side effect is that you can reuse the binaries later without keeping a long history within Hudson.
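The push/pull scheme above can be sketched in a few lines. Everything here is an illustration of the described setup, not a Hudson API: the function names and share layout are made up, and in practice the build number would arrive in the downstream job as a parameter from the Parameterized Trigger Plugin (typically exposed as an environment variable).

```python
import shutil
from pathlib import Path

def push_binaries(share, job, build_number, files):
    """Upstream job: copy binaries to <share>/<job>/<build_number>/."""
    dest = Path(share) / job / str(build_number)
    dest.mkdir(parents=True, exist_ok=True)
    for f in files:
        shutil.copy2(f, dest / Path(f).name)
    return dest

def pull_binaries(share, job, build_number, workspace):
    """Downstream job: fetch exactly the binaries that upstream run built."""
    src = Path(share) / job / str(build_number)
    for f in src.iterdir():
        shutil.copy2(f, Path(workspace) / f.name)
```

Keying the subfolder on the build number is what makes the hand-off reliable: even if several upstream builds finish while the test job is queued, the test still picks up the binaries of the build that triggered it.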


Peter Schuetze