views: 1995
answers: 4

I'm looking into scons and I just want to make sure I know what the alternatives are, before I invest a chunk of brain cells into something completely different. I've been using GNU make in the past but have never been particularly happy with it.

Particularly: why isn't Ant used more often with C / C++ projects? (given that there's ant cpptasks) I read a few posts that say that Ant is more oriented around Java (obviously), but what's the drawback to doing so? And why is scons so much better than make?

I am working with a cross-compiler for TI DSPs, typically there are 20-50 cpp files in a project. It would seem like the hard part in build management is automatic dependency checking. Everything else is just mapping lists of files together with sets of compiler options.

edit: and why does cross-compilation change anything? it's a compiler that runs the same way gcc runs, just that it produces object files / executables that won't run on my PC.

+7  A: 

For cross compiling I think your best choices are either CMake or Autotools. Especially if you can compile your code for multiple architectures/platforms. I typically compile a subset of my code on the native machine for unit testing purposes and all of it for the target platform. CMake handles this especially well, as it lets you specify where the cross compiled libraries live. So rather than searching for the cross compiled libpng in /usr/lib, it can be told to look in /opt/arm-eabi-gcc/ or wherever you have the tool chain libraries installed on your build machine. You can create multiple build directories for the different variants and manually compile each variant with make, or trigger the lot with a braindead hand-rolled recursive make.
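A minimal sketch of what that looks like in practice (compiler names and the /opt path are hypothetical, matching the example above): a CMake toolchain file tells CMake both which compiler to use and where to search for cross-compiled libraries instead of /usr/lib:

```cmake
# arm-toolchain.cmake -- hypothetical toolchain file for a cross build
set(CMAKE_SYSTEM_NAME Generic)            # bare-metal target, no OS
set(CMAKE_C_COMPILER   arm-eabi-gcc)
set(CMAKE_CXX_COMPILER arm-eabi-g++)

# Search for libraries/headers under the toolchain prefix, not the host's /usr
set(CMAKE_FIND_ROOT_PATH /opt/arm-eabi-gcc)
set(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)   # build-time programs come from the host
set(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)    # libraries only from the toolchain root
set(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)    # headers only from the toolchain root
```

You would then create one build directory per variant: a plain `cmake ..` build for native unit tests, and a `cmake -DCMAKE_TOOLCHAIN_FILE=../arm-toolchain.cmake ..` build for the target.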

Ant has the drawback that it is basically as good or as bad as vanilla Make, with the added disadvantage that you are using something that is not particularly mainstream for C or C++. You have to deal with all your own dependencies - both the internal ones, such as C file to header file to library or executable, and also external dependencies such as having to link with 3rd party libraries. Plus I don't think the Ant C tasks are really maintained that much. Everyone I've seen that uses Ant for C advocates calling out to GCC with exec tasks.
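To make that last point concrete, a typical hand-rolled Ant target for C (all names hypothetical) ends up shelling out to gcc like this, with no dependency knowledge anywhere:

```xml
<!-- Hypothetical Ant target: call gcc via <apply>; no header dependency tracking -->
<target name="compile-c">
  <mkdir dir="build/obj"/>
  <apply executable="gcc" dest="build/obj" parallel="false">
    <arg value="-c"/>
    <arg value="-O2"/>
    <srcfile/>
    <arg value="-o"/>
    <targetfile/>
    <fileset dir="src" includes="**/*.c"/>
    <mapper type="glob" from="*.c" to="*.o"/>
  </apply>
</target>
```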

SCons is better, but cross compiling is not its strong point. It is not a "build system" like CMake or Autotools either; it is only a build tool. As it says on their wiki, it is pretty much "Make in Python". It does have built-in handling for dependencies though, meaning you don't have to roll your own there with "gcc -MM -MD" or whatever, so that is an advantage over Make. SCons also has support for detecting 3rd party libraries that are installed, but the way it is usually done can add a lot to your build time. Unlike other systems, SCons runs the checking stage every time you run it, though most results are cached. SCons is also infamous for its long build times, though for 50 files that would not be an issue. Cross compilation support in SCons is non-existent - you have to roll your own, as discussed in this thread on the mailing list. Typically you force the build to be like a Unix platform, then override the name of the C compiler. Building multiple variants or separating the build directory from the source directory is full of gotchas, which makes it less suitable if you both cross-compile and natively compile your code.
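The "force a Unix platform, then override the compiler name" approach boils down to an SConstruct along these lines (toolchain prefix and file names hypothetical):

```python
# SConstruct -- hypothetical cross build: pretend we're on a Unix platform,
# then swap in the cross toolchain's tools by name.
env = Environment(platform='posix')
env.Replace(CC='arm-eabi-gcc',
            CXX='arm-eabi-g++',
            AR='arm-eabi-ar',
            RANLIB='arm-eabi-ranlib')

# SCons scans the sources for header dependencies itself; no gcc -MM needed.
env.Program('firmware.elf', Glob('src/*.cpp'))
```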

CMake and Autotools have the dependency problems figured out quite well, and Autotools' cross compilation support is mature. CMake has had cross compilation support since version 2.6.0, which was released in April 2008. You get those features for free, plus others like packaging and running unit tests ("make check" or similar targets). The downside to both of these tools is that they require bootstrapping. In the case of CMake, you need to have the CMake binary installed to create the Makefiles or Visual Studio solution files. In the case of Autotools it is slightly more complicated, although only those who need to change the build system require automake and autoconf installed, not everybody who compiles the software (adding new files counts as changing the build system). The 2-stage bootstrapping (configure.ac -> configure, configure + Makefile.in -> Makefile) is also conceptually a bit trickier to understand.
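For reference, that 2-stage pipeline in the smallest possible Autotools project looks roughly like this (file contents are a hypothetical minimum, not a template):

```
# configure.ac (maintainer-edited input)
AC_INIT([myproj], [1.0])
AM_INIT_AUTOMAKE([foreign])
AC_PROG_CXX
AC_CONFIG_FILES([Makefile])
AC_OUTPUT

# Makefile.am (maintainer-edited input)
bin_PROGRAMS = myproj
myproj_SOURCES = main.cpp

# Stage 1 (maintainer only, needs autoconf + automake):  autoreconf --install
# Stage 2 (anyone building the package):                 ./configure && make
```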

For the edit: Cross compiling is an extra headache in build systems because it adds complexity to the auto-detection of programs and libraries. SCons doesn't deal with this problem; it leaves it up to you to sort out. Ant similarly does nothing. Autoconf handles this in the Autotools case, but you may have to provide "--with-libfoobar=/some/path" on the command line when you configure, or face broken linking when it tries to use /usr/lib in the link phase. CMake's approach is a little more heavyweight with the toolchain file, but it means you don't have to specify all of your tools and libraries (CC, CXX, RANLIB, --with-libfoo=, etc.) as they are figured out from a standard convention. In theory you can reuse a suitably crafted CMake toolchain file in multiple projects to cross compile them. In practice CMake is not widespread enough to make this convenient for your average hacker, though it may be useful if you are creating multiple proprietary projects.
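Concretely, the difference in what you type (all paths hypothetical):

```shell
# Autotools: name the tools and library locations yourself, per project
./configure --host=arm-eabi CC=arm-eabi-gcc --with-libfoobar=/some/path

# CMake: point every project at one reusable toolchain file
cmake -DCMAKE_TOOLCHAIN_FILE=/opt/toolchains/arm-eabi.cmake /path/to/source
```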

rq
+1  A: 

Ant has a very Java-heavy user base (for natural reasons). The fact that Ant can be very useful in a much broader setting is something most non-Java developers are unaware of. As a result, if you use Ant to build your C/C++ code, you're much more on your own than if you use a system with a larger user base (CMake or SCons).

Personally I've been very happy with CMake, primarily because it is the only build system that can generate real Visual Studio projects. SCons can generate them too, but those projects just make an external call back to SCons, whereas CMake generates projects that use Visual Studio's own build engine. This is very useful if you're working together with people who are used to Visual Studio.

I'm not sure I'd call CMake's support for cross-compilation "mature", since it is still pretty young. But the CMake people are serious about it, so I wouldn't hesitate to try it out.

JesperE
+2  A: 

I used Ant+CppTasks heavily a few years back for building very large C++ code bases being integrated into new Java code via JNI, and was very satisfied with the outcome: very reliable incremental builds of C++ code and good flexibility (in the build files, or via tweaks to the code). BUT CppTasks is a project with no community, and the maintainer (Curt Arnold) hasn't done anything with it for a long time. It remains very good and useful, but be aware that very few people know or use CppTasks (the way compilers "bid" for files bit me, for example, when I needed a specific compiler for C files and a different C++ compiler was picking them up instead).

The way CppTasks keeps track of compile options and dynamically scans sources for header dependencies on every build, yet stays fast enough (there is some caching of dependencies), made it the only system I've worked with that had accurate incremental builds, something very important when building very large code bases.

So as I've said a few times on the Ant user/dev lists: if you have a lot of Java stuff and an existing investment in Ant, are not afraid of diving into the doc and the code of CppTasks, and now need to build JNI "glue" code and/or the "legacy" native libraries that the glue code exposes, it's a worthwhile contender, and the rewards can be great.

I easily integrated <rc> for the Windows DLL to add complete version info, for example, writing a little Ant task to format the .res file correctly. It worked for me, but Ant+CppTasks is definitely more for the "advanced" Ant user/developer, IMHO.

I hope this helps. --DD

ddevienne
A: 

I would like to recommend terp for building C++ projects from Ant. We developed terp because Ant is our chosen build tool, because the CppTasks were getting a bit long in the tooth, and because we wanted to work at a different level of abstraction. terp supports a lot of different compilers and is supported by our company. For most projects it's not free, though.

We designed terp mostly for full rebuilds rather than incremental builds with dependency analysis. We're getting there but it's not there yet. The sweet spot for terp is in multi-platform, multi-procarch, multi-compiler releases where you don't want to maintain n different configurations for all the combinations. At this point (July '09) it's in public beta, but it will be released soon.