What do modern companies use to compile and link their projects?

With large projects especially, plain makefiles don't seem to scale well. As far as I can tell, many companies use either in-house build systems or in-house scripts layered on top of existing build systems.

Do large projects use make? Are Ant, Maven, etc. used on their own, or are they wrapped in scripts?

+2  A: 

At my last company we used Intel compilers with custom SCons scripts written in Python. Development was in C++, and the products we released were huge visual software packages.

SCons: "SCons is an Open Source software construction tool—that is, a next-generation build tool. Think of SCons as an improved, cross-platform substitute for the classic Make utility with integrated functionality similar to autoconf/automake and compiler caches such as ccache. In short, SCons is an easier, more reliable and faster way to build software."

Martin
What do you mean by "custom SCons scripts?"
Alex Reece
Just as you would have a custom build file for your code, you would have a custom SCons script serving the same purpose.
Martin
A: 

make and makefiles are as natural to C as Ant is to Java

Peter Miehle
How do you write a makefile for a large project? A scalable, usable makefile? The closest I've seen is http://www.xs4all.nl/~evbergen/nonrecursive-make.html, and even that still has problems.
Alex Reece
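For reference, a minimal sketch of the non-recursive layout that link describes, with hypothetical module names and flags: each subdirectory contributes a `module.mk` fragment to a single top-level make instance, so the whole dependency graph lives in one place. (Recipe lines must be indented with tabs.)

```make
# Top-level Makefile (sketch; module names and flags are placeholders)
CC     := gcc
CFLAGS := -Wall -O2

# one entry per subdirectory; each one provides a module.mk fragment
MODULES := foo bar
SRC     :=
include $(patsubst %,%/module.mk,$(MODULES))

OBJ := $(SRC:.c=.o)

app: $(OBJ)
	$(CC) $(CFLAGS) -o $@ $^

%.o: %.c
	$(CC) $(CFLAGS) -c -o $@ $<

.PHONY: clean
clean:
	rm -f app $(OBJ)

# foo/module.mk would contain only:
#   SRC += foo/main.c foo/util.c
```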
@Alex: I've worked on a large project (an OS, with accompanying JVM) which used make. It was recursive at the very top level, then each component decided ad hoc whether it was big enough on its own that recursive make would be problematic, or not. So some components built recursively, others built in their own root. The JVM was the slowest part, not least because it built multiple versions side by side. It used some fairly hairy makefiles with a script wrapping `make` itself. I don't remember the details, but it was easy enough to add a new directory, and to build any sub-tree.
Steve Jessop
Oh, and all the common Java library classes were optimised using that OS's equivalent of JNI, which is why the JVM was so big and also why they were built using `make` and the assembler rather than `javac`.
Steve Jessop
@Steve: Wow, that sounds really hairy. Did your makefiles handle cross-subtree dependencies? Not to mention it still involves scripts wrapping make. It seems like the industry-standard way to build a large project is to write your own script to wrap whatever build system you use :-/
Alex Reece
@Alex: didn't really need to - the OS didn't use much static linking: dynamically-loaded "tools" (fast), and optionally a full statically-linked OS image (maybe a minute). So there were lots of cross-tree header dependencies, easily handled by make, but very rarely cross-tree dependencies on things that needed building. The major components were well isolated behind stable interfaces. I guess that makes the method inapplicable to most C projects. The wrapper wasn't fundamentally changing make, but it did run it several times with different environments, to build for different versions of Java.
Steve Jessop
That's a minute to build the image once everything was compiled, I should say. A full "build the whole OS from source control" took an hour or more, plus however long your job spent waiting in the queue to be assigned to a build machine... So the standard development/debug cycle was: modify some code; run make either in the relevant directory or for the relevant component; rebuild the image you're using if required; restart the image. If possible we'd remove the code we were working on from the image used for debug, so that it would be dynamically loaded, and hence nothing depended on it.
Steve Jessop
@Steve: One thing that concerns me about make (and that I haven't found an effective solution to yet) is that, while it's good at rebuilding only what has changed, it doesn't appear to have a way of figuring out the appropriate dependencies for linking: http://stackoverflow.com/questions/3802619/
Alex Reece
@Alex Reece: agreed, although I think that's a general issue with C. Unlike `javac`, C compilers don't resolve external links automatically, and couldn't completely do so even if they tried, since a function with any name could appear in any object file(s). Someone has to *decide* which object file to look in for function `foo`; it can't be worked out automatically. `make` doesn't make decisions ;-) What you usually do is manually list the `.o` files as dependencies of the executable, then use the automatic variables `$^` or `$+` in the link recipe.
Steve Jessop
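A tiny sketch of the pattern Steve describes (file names are placeholders): the objects are listed once, as prerequisites of the executable, and `$^` repeats that list in the link recipe.

```make
# the one place the object-file decision is written down
OBJS := main.o parser.o util.o

# $@ is the target, $^ is the full prerequisite list
prog: $(OBJS)
	$(CC) $(CFLAGS) -o $@ $^ $(LDLIBS)
```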
@Steve Jessop: But it could. Given the appropriate conventions (namely, a 1-1 correspondence between `.cpp` files and `.h` files), `gcc -MM` could probably be used to generate the dependencies for C and C++ files automatically.
Alex Reece
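A sketch of that idea (file names are placeholders): `-MMD` is the compile-time variant of `gcc -MM`, writing a `.d` dependency fragment next to each object as a side effect, and `-include` pulls those fragments in so header edits trigger recompiles. Note it only covers compile-time dependencies; the link-time object list still has to be stated somewhere, as in the sketch above.

```make
SRCS := main.c parser.c util.c
OBJS := $(SRCS:.c=.o)
DEPS := $(SRCS:.c=.d)

prog: $(OBJS)
	$(CC) $(CFLAGS) -o $@ $^

# -MMD writes foo.d next to foo.o; -MP adds dummy targets for headers
# so make doesn't fail when a header is deleted or renamed
%.o: %.c
	$(CC) $(CFLAGS) -MMD -MP -c -o $@ $<

-include $(DEPS)
```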
@Alex: sure, you could do that yourself, but make isn't going to assume any such convention, and so *it* has no solution. It's pretty common to have .h files for which no .o file exists (perhaps containing macro definitions and static inline functions), and/or to have more .c files (one per function) than .h files (one per area of functionality).
Steve Jessop
@Steve: Important details, true. But - especially if we enforced the constraint that there is at most one .c file per .h file - it seems it would still be possible to write a simpler build system that generates all the compile and link dependencies for the programmer. If that's the case, though, why isn't there one out there?
Alex Reece
A: 

Some big projects (e.g., Boost) eschew make in favor of one of the Jams.

Matt Kane
+3  A: 

A lot of projects use CMake these days, which auto-generates makefiles for various platforms. Unfortunately CMake has its own (weird) internal language, which is why I personally prefer SCons - anything it can't do natively can easily be added, since it's all just Python code. Take a look at the list of open-source projects using SCons. Many of them are quite large, non-trivial multi-platform builds.

Eli Bendersky
+1  A: 

Every product I've worked on in 10+ years at telecom OEMs has used make. Some were relatively small, others were well over 1M SLOC. Most of the source has been C, with a significant amount of C++. Most used third-party sources, and all the vendors ship makefiles with their products.

Remember that the system you use to build your product is software. Whether that build software is written in make, SCons, or some other language/system, you must understand the language that you're using to write the software that builds your product. Fail to do this, and you risk introducing bugs in your product caused by an incorrect build system.

bstpierre
Was it raw make or was there some wrapper script used to build a correct makefile?
Alex Reece
It was all raw make.
bstpierre
I should say *nearly* every product I've worked on... there were a couple that used DOS batch files, but I keep trying to purge that unholy mess from my memory.
bstpierre