Ever since I learned about -j I've used -j8 blithely. The other day I was compiling an ATLAS installation and the make failed. Eventually I tracked it down to things being made out of order - and it worked fine once I went back to single-threaded make. This makes me nervous. What sort of conditions do I need to watch for when writing my own makefiles to avoid doing something unexpected with make -j?

+6  A: 

I think make -j will respect the dependencies you specify in your Makefile; i.e. if you specify that objA depends on objB and objC, then make won't start working on objA until objB and objC are complete.
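For illustration, a minimal sketch of that situation, with hypothetical file names and a hypothetical link target (recipe lines must really start with a tab, shown here as plain indentation):

app: objB.o objC.o          # the link step waits for both objects
    $(CC) -o app objB.o objC.o

objB.o: objB.c
    $(CC) -c objB.c

objC.o: objC.c
    $(CC) -c objC.c

Under make -j the two compiles may run in parallel, but the link rule for app does not start until both objB.o and objC.o are up to date.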

Most likely your Makefile isn't specifying the necessary order of operations strictly enough, and it's just luck that it happens to work for you in the single-threaded case.

Jeremy Friesner
That's correct. I work on a code base of about 20 million lines, mostly in C with a little C++. It's split into hundreds of components, about half of which use make, and half of which use jam. I always do parallel compiles with the -j option; otherwise, builds would take hours. Jam generates its own dependencies, so the components that use it always succeed. But components that use hand-built makefiles choke on occasion, invariably due to inadequate dependencies.
Bob Murphy
+5  A: 

In short - make sure that your dependencies are correct and complete.

With a single-threaded make you can get away with ignoring implicit dependencies between targets, because the targets happen to be built in a workable order. With a parallel make you can't rely on that; every dependency should be made explicit. This is probably the most common trap, particularly when .PHONY targets are used as dependencies.
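A common shape of this trap, sketched with hypothetical file names: a compile rule that silently depends on a generated header.

# parser.o really needs the generated parser.h, but that edge is never stated.
all: parser.h parser.o

parser.h: parser.y
    yacc -d parser.y && mv y.tab.h parser.h

parser.o: parser.c          # missing prerequisite: parser.h
    $(CC) -c parser.c

With -j1 this happens to work because parser.h is listed first under all; with -j2 both rules may start at once and the compile can fail to find the header. The fix is to state the dependency explicitly: parser.o: parser.c parser.h.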

This link is a good primer on some of the issues with parallel make.

Andrew Edgecombe
+1 for the good link. I now feel like I can trust make -j, and I know how to fix problems when they arise. Well worth reading.
Robert Massaioli
+4  A: 

If you have a recursive make, things can break pretty easily. If you're not doing a recursive make, then as long as your dependencies are correct and complete, you shouldn't run into any problems (save for a bug in make). See Recursive Make Considered Harmful for a much more thorough description of the problems with recursive make.
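As a hypothetical illustration (the directory names are invented): the top-level makefile of the recursive build below knows nothing about the relationship between its subdirectories.

SUBDIRS = lib app           # app links against the library built in lib/

all: $(SUBDIRS)

.PHONY: all $(SUBDIRS)
$(SUBDIRS):
    $(MAKE) -C $@

With -j1 the subdirectories are processed in list order, so lib is ready when app needs it; with -j2 make may descend into both at once, because nothing tells it that app depends on lib. Adding a line reading "app: lib" at the top level restores the ordering while keeping the build recursive.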

Adam Rosenfield
+2  A: 

Here's an example of a problem that I ran into when I started using parallel builds. I have a target called "fresh" that I use to rebuild the project from scratch (a "fresh" build). In the past, I coded the "fresh" target by simply listing "clean" and then "build" as dependencies.

build: ## builds the default target
clean: ## removes generated files
fresh: clean build ## works for -j1 but fails for -j2

That worked fine until I started using parallel builds; with -j, make attempts to run "clean" and "build" simultaneously. So I changed the definition of "fresh" as follows in order to guarantee the correct order of operations.

fresh:
    $(MAKE) clean
    $(MAKE) build

This is fundamentally just a matter of specifying dependencies correctly. The trick is that parallel builds are stricter about this than single-threaded builds. My example demonstrates that the list of dependencies for a given target does not necessarily indicate the order of execution.
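One detail worth noting, assuming GNU make: the two command lines in the recipe run one after the other, so "clean" always finishes before "build" starts. And because the sub-makes are invoked via $(MAKE), they share the parent's job slots through the jobserver, so an invocation such as

    make -j8 fresh

remains sequential at the top level while each sub-make can still use up to 8 jobs internally.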

nobar