Hi,

I'm in the middle of setting up a build environment for a C++ game project. Our main requirement is the ability to build not just our game code, but also its dependencies (Ogre3D, CEGUI, Boost, etc.). Furthermore, we would like to be able to build on Linux as well as on Windows, as our development team consists of members using different operating systems.

Ogre3D uses CMake as its build tool, which is why we have based our project on CMake so far. We can compile perfectly fine once all dependencies are set up manually on each team member's system, as CMake is able to find the libraries.

The question is whether there is a feasible way to get the dependencies set up automatically. As a Java developer I know of Maven, but what tools exist in the world of C++?


Update: Thanks for the nice answers and links. Over the next few days I will be trying out some of the tools to see what meets our requirements, starting with CMake. I've indeed had my share of experience with the Autotools so far, and as much as I like the documentation (the Autobook is a very good read), I fear the Autotools are not meant to be used natively on Windows.

Some of you suggested letting an IDE handle the dependency management. Our team consists of individuals using every possible technology to code, from pure Vim to a fully blown Eclipse CDT or Visual Studio. This is where CMake allows us some flexibility, with its ability to generate native project files.

+1  A: 

On many *nix systems, some kind of package manager or build system is used for this. The most common one for source distribution is GNU Autotools, which I've heard can be a source of extreme grief. However, with a few scripts and an online repository for your dependencies, you can set up something similar like so:

  • In your project Makefile, create a target (optionally with subtargets) that covers your dependencies.
  • Within the target for each dependency, first check whether the dependency's source is already in the project (on *nix you can use touch for this, but you could be more thorough).
  • If the dependency is not there, use curl, wget, etc. to download it.
  • In all cases, have the dependency targets issue recursive make calls (make; make install; make clean; etc.) to the Makefile (or other configure script/build file) of the dependency. If the dependency is already built and installed, make will return fairly promptly.

There are going to be lots of corner cases that can cause this to break, depending on the installer for each dependency (perhaps the installer is interactive?), but this approach should cover the general idea.
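The steps above can be sketched as a Makefile fragment; the dependency name ("libfoo"), the URL, and the paths are all hypothetical placeholders:

```make
DEPS_DIR := deps

.PHONY: deps
deps: $(DEPS_DIR)/libfoo/Makefile
	# Recursive make: cheap no-op if libfoo is already built and installed.
	$(MAKE) -C $(DEPS_DIR)/libfoo
	$(MAKE) -C $(DEPS_DIR)/libfoo install

# Fetch and unpack the source only if it is not already present.
$(DEPS_DIR)/libfoo/Makefile:
	mkdir -p $(DEPS_DIR)
	curl -L http://example.com/libfoo-1.0.tar.gz | tar xz -C $(DEPS_DIR)
```

Making the unpacked Makefile itself the prerequisite is what gives you the "download only if missing" behaviour.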

Dana the Sane
+1  A: 

There are several interesting make replacements that automatically track implicit dependencies (from header files), are cross-platform and can cope with generated files (e.g. shader definitions). Two examples I used to work with are SCons and Jam/BJam.

I don't know of a cross-platform way of getting *make to track dependencies automatically. The best you can do is use a script that scans source files (or has the C++ compiler do it) and finds #includes (conditional compilation makes this tricky), then generates part of the makefile. But you'd need to call this script whenever something might have changed.
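The scanning script mentioned above can be sketched in a few lines of Python. This is a regex-based scanner only, so (as noted) it ignores conditional compilation and may report headers the compiler would actually skip:

```python
import re

# Matches locally-included headers of the form: #include "header.h"
INCLUDE_RE = re.compile(r'^\s*#\s*include\s*"([^"]+)"', re.MULTILINE)

def local_includes(source_text):
    """Return the '...'-style headers a C++ source includes."""
    return INCLUDE_RE.findall(source_text)

src = '''#include <vector>
#include "game/entity.h"
#include "game/physics.h"
'''
print(local_includes(src))
```

A real setup would walk the source tree, resolve each header path, and emit `file.o: file.cpp header.h ...` lines into a generated makefile fragment. (With GCC specifically, `g++ -MM file.cpp` produces such a fragment directly.)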

Tomek Szpakowicz
+3  A: 

I have been using GNU Autotools (Autoconf, Automake, Libtool) for the past couple of months in several projects that I have been involved in, and I think it works beautifully. Truth be told, it does take a little while to get used to the syntax, but I have used it successfully on a project that requires the distribution of Python scripts, C libraries, and a C++ application. I'll give you some links that helped me out when I first asked a similar question here.

Some more links:

  1. http://www.lrde.epita.fr/~adl/autotools.html
  2. http://www.developingprogrammers.com/index.php/2006/01/05/autotools-tutorial/
  3. http://sources.redhat.com/autobook/

One thing that I am not certain about is whether any kind of Windows wrapper for GNU Autotools exists. I know you are able to use it inside Cygwin, but for actually distributing files and dependencies on Windows platforms you are probably better off using a Windows MSI installer (or something that can package your project inside Visual Studio).

If you want to distribute dependencies, you can set them up under a separate subdirectory, for example libzip, with a specific Makefile.am entry that builds that library. When you perform a make install, the library will be installed to the lib folder that the configure script determined it should use.
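A minimal sketch of that layout, assuming the bundled library lives in a libzip/ subdirectory that ships its own configure script:

```
# configure.ac (fragment): descend into the bundled library's own configure
AC_CONFIG_SUBDIRS([libzip])

# top-level Makefile.am: build the dependency before our own sources
SUBDIRS = libzip src
```

Automake recurses into the SUBDIRS in order, so the dependency is built and installed before src is compiled.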

Good luck!

John Bellone
I wonder if the Eclipse CDT (or some other IDE) handles the cross-platform aspect.
Dana the Sane
I think your best bet for Windows deployment is using the native Visual Studio tools: build a project solution, create an MSI installer package, and place these files inside a separate directory in the project.
John Bellone
I really feel like you are all missing the question (or I am). The title is asking about setting up dependencies for building. This is not about deploying binaries, using autotools to build, or anything like that. It specifically says "We can compile perfectly fine once all dependencies are set up manually on each team members system as CMake is able to find the libraries".
Adam W
You're right, Adam. My question was mainly targeted at the automation of build environment setup. Nonetheless, the other answers dealing with the deployment of the binaries are valid too, because the two concerns seem to be closely related.
itti
I briefly covered dependencies in my answer above. With Autoconf you are able to _check_ for existing dependencies during the configure stage of the script, so if the wrong version of Python, Ruby, etc. is installed, you can warn the user. With that said, you can have subdirectories with subprojects that handle the actual distribution when you perform a "make install". Distributing shared libraries is also possible.
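For instance, a configure.ac fragment along these lines performs such checks at configure time (the project name, version numbers, and the pkg-config module name are assumptions):

```
AC_INIT([game], [0.1])
AM_INIT_AUTOMAKE
AM_PATH_PYTHON([2.6])                      # abort if Python is older than 2.6
PKG_CHECK_MODULES([OGRE], [OGRE >= 1.7])   # abort if Ogre3D is missing or too old
AC_PROG_CXX
AC_CONFIG_FILES([Makefile])
AC_OUTPUT
```

This stops with a clear error message before any compilation starts, rather than failing halfway through the build.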
John Bellone
+2  A: 

The question is whether there is a feasible way to get the dependencies set up automatically.

What do you mean set up?

As you said, CMake will compile everything once the dependencies are on the machines. Are you just looking for a way to package up the dependency sources? Once all the source is there, CMake and a build tool (gcc, nmake, MSVS, etc.) are all you need.

Edit: Side note, CMake has the file command which can be used to download files if they are needed: file(DOWNLOAD url file [TIMEOUT timeout] [STATUS status] [LOG log])
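For example, a fragment like this (the URL and paths are placeholders) fetches an archive only when it is missing:

```cmake
set(_dep "${CMAKE_BINARY_DIR}/deps/boost.tar.gz")
if(NOT EXISTS "${_dep}")
  file(DOWNLOAD "http://example.com/boost.tar.gz" "${_dep}"
       TIMEOUT 60 STATUS _status LOG _log)
endif()
```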

Edit 2: CPack is another tool by the CMake developers that can be used to package files for distribution on various platforms. It can create NSIS installers for Windows and .deb or .tgz packages for *nix.
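A minimal CPack sketch, choosing a generator per platform (the version number is assumed):

```cmake
# After all install() rules in the top-level CMakeLists.txt:
set(CPACK_PACKAGE_VERSION "0.1.0")
if(WIN32)
  set(CPACK_GENERATOR "NSIS")
else()
  set(CPACK_GENERATOR "DEB;TGZ")
endif()
include(CPack)
```

With this in place, `make package` (or the `PACKAGE` target in Visual Studio) produces the installer or archive.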

Adam W
NSIS is an installation system which can build Windows installers on non-Windows platforms. It can't build installers for non-Windows systems, though.
jamessan
@jamessan: Ah, right, my mistake. Added another edit for CPack which is what I meant to talk about.
Adam W
Thanks for pointing out CMake's file command. CMake's online documentation seems to leave a lot to be desired.
itti
@itti: That actually came right from the online documentation: http://www.cmake.org/cmake/help/cmake-2-8-docs.html#command:file Take a look around the KDE source, they use CMake and can be a good source of examples.
Adam W
+2  A: 

At my place of work (we build embedded systems for power protection), we used CMake to solve the problem. Our setup allows CMake to be run from various locations.

/
CMakeLists.txt "install precompiled dependencies and build project"
   project/
      CMakeLists.txt "build the project managing dependencies of subsystems"
      subsystem1/
         CMakeLists.txt "build subsystem 1, assuming dependencies are already met"
      subsystem2/
         CMakeLists.txt "build subsystem 2, assuming dependencies are already met"

The trick is to make sure that each CMakeLists.txt file can be invoked in isolation, but that the top-level file can still build everything correctly. Technically we don't need the sub-CMakeLists.txt files, but they make the developers happy. It would be an absolute pain if we all had to edit one monolithic build file at the root of the project.
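One common way to let a sub-CMakeLists.txt work both stand-alone and from the top level (file and target names here are assumed):

```cmake
# subsystem1/CMakeLists.txt
cmake_minimum_required(VERSION 2.8)

# Declare a project only when invoked directly, not via add_subdirectory().
if(CMAKE_SOURCE_DIR STREQUAL CMAKE_CURRENT_SOURCE_DIR)
  project(subsystem1)
endif()

add_library(subsystem1 src/subsystem1.cpp)
```

The top-level CMakeLists.txt then simply calls `add_subdirectory(project/subsystem1)` and the same file serves both entry points.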

I did not set up the system (I helped, but it is not my baby). The author said that the Boost CMake build system had some really good stuff in it that helped him get the whole thing building smoothly.

caspin
I always thought Boost was still using its own build tool, BJam, but if they did switch to CMake, as your answer suggests, then I will take a look at their solution too, thanks! Recursive build files in individual subfolders seem to be the way to go, be it with CMake, Autotools, or something else. Thanks for the suggestion.
itti
@itti, boost has been considering a CMake build system. One of the boost developers is maintaining an up to date CMake build, you may have to download that to get it to work fully.
caspin
@itti, I would only suggest recursive CMake files of a few levels. It definitely makes things harder to understand if you follow the old autotools approach with one build file in every folder.
caspin
Folks, there are plenty of rumours about that, so please let me clarify. Boost has not switched to CMake, and such a move has not been scheduled, nor is it planned to be scheduled for now. The CMake-based build is being contributed mainly by Troy D. Straszheim and is currently considered an experiment, a trial of the alternative, but it is unofficial work. Perhaps, in the future, CMake will be officially adopted by Boost, but for the time being it is an unofficial contribution.
mloskot
+2  A: 

In the latest CMake 2.8 release there is the new ExternalProject module. It allows you to download or check out code, then configure and build it as part of your main build tree. It should also allow you to set dependencies.
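A sketch of what using the module looks like; the URL, paths, and the `game` target name are placeholders:

```cmake
include(ExternalProject)

ExternalProject_Add(ogre
  URL http://example.com/ogre-1.7.tar.gz
  PREFIX ${CMAKE_BINARY_DIR}/third_party/ogre
  CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${CMAKE_BINARY_DIR}/third_party/install)

# Ensure our own target (assumed to be called "game") builds only after
# the external project has been downloaded, built, and installed.
add_dependencies(game ogre)
```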

At my work (a medical image processing group) we use CMake to build all our own libraries and applications. We have an in-house tool to track all the dependencies between projects (defined in an XML database). Most of the third-party libraries (like Boost, Qt, VTK, ITK, etc.) are built once for each system we support (MSWin32, MSWin64, Linux32, etc.) and are committed as zip files to the version control system. CMake will then extract and configure the correct zip file depending on which system the developer is working on.

pkit
@pkit: this is indeed great news. I ground away at building my own build system using recursive CMakeLists.txt files with some Python scripts added here and there. It looks like ExternalProject_Add() provides enough functionality out of the box. I will definitely try it out!
itti