I have a Hudson job that just does a check-out/update of a third-party library. Call this Job A.

Several other jobs depend on this library. Call them Jobs B and C. They use the stuff checked out by Job A, and need it to be up-to-date.

My question is, how can I require Jobs B and C to always run Job A (to update the library) before they run through their build routine?

If this is not possible, can someone recommend another way to achieve the same effect?

+1  A: 

You can do it the other way around with "child" jobs. For example, you can configure A to trigger B and C after it has succeeded (you will find the option on Job A's configuration page).

If you need more advanced conditions for triggering the child jobs, you can take a look at the Parameterized Trigger plugin.
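
For reference, a minimal sketch of what this "build other projects" trigger looks like in Job A's config.xml (the job names and the SUCCESS threshold here are illustrative, not taken from the question):

    <publishers>
      <!-- "Build other projects" post-build action: start B and C when A succeeds -->
      <hudson.tasks.BuildTrigger>
        <childProjects>JobB, JobC</childProjects>
        <threshold>
          <name>SUCCESS</name>
          <ordinal>0</ordinal>
          <color>BLUE</color>
        </threshold>
      </hudson.tasks.BuildTrigger>
    </publishers>

The same thing can be set up from the job's Post-build Actions section in the UI; the Parameterized Trigger plugin adds its own post-build section there if you need to pass parameters or trigger on results other than success.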

Julien Nicoulaud
The problem is that this makes the dependency sort of backwards. I don't want building A to cause B and C to build. I want building B or C to cause A to build first.
sjohnston
I think your approach will cause issues too, because A would be built twice instead of once.
Julien Nicoulaud
What you're talking about probably makes sense in a lot of situations. For us, Job A is not actually a "build", just an SVN update, and it will complete very quickly. B and C are real builds that will take much longer. Building both of them every time one needs to be built would be a much bigger slowdown than building A repeatedly.
sjohnston
Why have a job that just does an SVN update? B and C will do their own SVN updates before building; Hudson handles that very well (this is the default behaviour).
Julien Nicoulaud
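
To illustrate the default behaviour described in the comment above: each job lists its own SVN modules, and Hudson updates all of them at the start of every build, so B and C could simply check the library out themselves. A hypothetical excerpt from Job B's config.xml (the repository URLs and local paths are made up):

    <scm class="hudson.scm.SubversionSCM">
      <locations>
        <!-- Job B's own sources -->
        <hudson.scm.SubversionSCM_-ModuleLocation>
          <remote>http://svn.example.com/projectB/trunk</remote>
          <local>.</local>
        </hudson.scm.SubversionSCM_-ModuleLocation>
        <!-- the shared third-party library, updated before every build of B -->
        <hudson.scm.SubversionSCM_-ModuleLocation>
          <remote>http://svn.example.com/thirdparty/lib/trunk</remote>
          <local>lib</local>
        </hudson.scm.SubversionSCM_-ModuleLocation>
      </locations>
    </scm>
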
A: 

After thinking about the problem some more, I think I may have been over-complicating things.

Since the library in Job A is rarely updated, we decided it's probably acceptable to just poll SVN on an interval and update when there are changes. There's a small chance that builds of B and C will miss library changes if they start right after those changes were checked in, but that should rarely be an issue.
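
For completeness, a sketch of what such a polling trigger looks like in Job A's config.xml (the 15-minute cron spec is an arbitrary example):

    <triggers class="vector">
      <!-- "Poll SCM": check Subversion for changes on a schedule and rebuild only when something changed -->
      <hudson.triggers.SCMTrigger>
        <spec>*/15 * * * *</spec>
      </hudson.triggers.SCMTrigger>
    </triggers>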

sjohnston
A: 

If I follow you, it sounds like you might need the Join plugin:

This plugin allows a job to be run after all the immediate downstream jobs have completed. In this way, the execution can branch out and perform many steps in parallel, and then run a final aggregation step just once after all the parallel work is finished. The plugin is useful for creating a 'diamond' shape project dependency. This means there is a single parent job that starts several downstream jobs. Once those jobs are finished, a single aggregation job runs.
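
In other words, the dependency shape the plugin description refers to looks roughly like this (job names are placeholders):

         parent
         /    \
      JobB    JobC     (run in parallel)
         \    /
       join job        (runs once, after both JobB and JobC finish)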

William Leara