views: 183
answers: 3
I'm not talking about making portable code. This is more a question of distribution. I have a medium-sized project. It has several dependencies on common libraries (e.g. openssl, zlib, etc.). It compiles fine on my machine and now it's time to give it to the world.

Essentially build engineering at its finest. I want to make installers for Windows, Linux, MacOSX, etc. I want to make a downloadable tar ball that will make the code work with a ./configure and a make (probably via autoconf). It would be icing on the cake to have a make option that would build the installers... maybe even cross-compile so a Windows installer could be built on Linux.

What is the best strategy? Where can I expect to spend the most time? Should the prime focus be autoconf or are there other tools that can help?

+2  A: 

The product that I work on is not too different from this. We use an autoconf-based build system, and it works pretty well.

The place that you'll spend the most time, by far, is supporting users. User systems will have all sorts of wrinkles that you don't expect until they run into them, and you'll need to add more configure options to support them. Over time, we've added options to set the include and lib paths for every library we depend on; we've added options to change compile flags to work around various weird glitches in various versions of those libraries (or API changes from one version to another that need changes in our code); and we've added workarounds for the fact that some BLAS libraries use a C interface and some use a Fortran interface, so even though they're theoretically implementations of the same library they do a few things slightly differently, and so on. You can't anticipate all of this in advance, and it also needs documenting so that users can figure out which options to set.
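
As a minimal sketch, one such option might look like this in configure.ac (the --with-zlib-include flag and the zlib example are only illustrative, not options from the project described above):

    dnl Hypothetical option letting users point configure at a non-standard zlib.
    AC_ARG_WITH([zlib-include],
      [AS_HELP_STRING([--with-zlib-include=DIR], [directory containing zlib.h])],
      [CPPFLAGS="$CPPFLAGS -I$withval"])

    dnl Fail early with a clear message rather than a confusing compile error later.
    AC_CHECK_HEADER([zlib.h], [],
      [AC_MSG_ERROR([zlib.h not found; try --with-zlib-include=DIR])])

Each option like this then needs a line in the README or INSTALL notes so users can discover it.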

Oh, and installers are really a pain, because they generally are OS-dependent (unless it's just a shell script and you require Cygwin), and the locations to install to tend to be OS-dependent, and so forth. That's another area that will take up time -- either in building a good installer, or in supporting users in manually setting things up.

Setting up cross-compile is, in my experience, well worth the trouble (at least for the Linux-to-Windows case; not sure about MacOS/X) -- much easier than trying to keep multiple different build systems in sync.
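
For the Linux-to-Windows case, the setup is roughly this (a sketch assuming the mingw-w64 cross-toolchain is installed; the exact target triplet depends on your distribution and whether you want 32-bit or 64-bit binaries):

    # Cross-compile for Windows from Linux with an autoconf-based build.
    # --host selects the toolchain prefix, e.g. i686-w64-mingw32-gcc.
    ./configure --host=i686-w64-mingw32 --prefix=$HOME/myproject-win32
    make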

As an alternate perspective, there's the option that the OpenFOAM project uses for their rather large C++ library, which is to distribute it along with an "approved" G++ compiler and packages for all the other components, so that they don't have to worry about different compilers and so forth. But that really only works on one OS. I guess the Windows/MacOSX version of that is to provide pre-set-up VMWare images. In some cases, there's something to be said for that....

Brooks Moses
A: 

Use autotools; most users are familiar with them (i.e. they know to run ./configure && make && make install).

  • Create the .tar.gz with make dist, then test it to make sure it compiles and works on more than one system (see the sketch after this list).
  • For Linux/Unix/Cygwin, do not provide installers... a source tar.gz is more than fine. In any case, each Linux distribution has its own packaging rules and most of them know how to handle autoconf-based builds; users may have 32-bit or 64-bit systems, or even run on PPC or Sparc, so don't bother.

    It may be worth creating one deb or rpm for the most popular systems, but not more than that.

  • For Windows (native, not Cygwin), provide binaries. Installing MinGW plus the autotools is quite painful, and Windows users are generally more "next -> next -> next" users than "wget/tar/configure/make/make install" users. Provide a zip or some installer; there are FOSS installers out there.

    Remember that Windows users do not have zlib or openssl by default... so you'll need to ship those with your package.
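
As a rough sketch of the release flow from the first point above (make distcheck is the standard automake target that builds the tarball and then verifies that it configures, builds, and installs from a clean directory):

    # Maintainer side: build and sanity-check the release tarball.
    make distcheck        # produces e.g. myproject-1.0.tar.gz (name comes from AC_INIT)

    # User side, after unpacking the tarball:
    ./configure && make && sudo make install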

About CMake...

If you are targeting mostly the Windows platform, or you are willing to support MSVC, then you should probably consider it. Otherwise, autotools provide a good distribution and build alternative.

Artyom
+3  A: 

I would recommend CMake. Advantages:

  • It is very easy to use for building simple and complex projects with static libraries, dynamic libraries, executables and their dependencies.
  • It is platform-independent and generates Makefiles and/or IDE project files for most compilers and IDEs.
  • It abstracts the differences between Windows and Unix; e.g. "libShared.so" and "Shared.dll" are both referred to as "Shared" (CMake handles the naming differences for each platform). If Shared is part of your project it sorts out the dependency; if not, it assumes the library is in the linker path.
  • It investigates the user's system for the compiler and third-party libraries that are required; you can then optionally remove components when third-party libraries are not available, or display an error message (it ships with macros to find the most common third-party libraries).
  • It can be run from the command line or with a simple GUI that lets the user change any of the parameters discovered above (e.g. the compiler or the version of a third-party library).
  • It supports macros for automating common steps.
  • There is a component called CPack that enables you to create an installer; I think this is just a make install command-line thing (I have not used it).
  • The CTest component integrates with other unit-testing libraries like Boost.Test or Google Test.

I use CMake for everything now, even simple test projects with Visual Studio.
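
For reference, a minimal CMakeLists.txt along these lines (the project and target names are made up; the ZLIB and OpenSSL find modules and their imported targets ship with CMake):

    cmake_minimum_required(VERSION 3.10)
    project(myproject CXX)

    # Locate third-party dependencies on the user's system.
    find_package(ZLIB REQUIRED)
    find_package(OpenSSL REQUIRED)

    # One executable; CMake generates Makefiles or IDE projects as requested.
    add_executable(myapp src/main.cpp)
    target_link_libraries(myapp PRIVATE ZLIB::ZLIB OpenSSL::SSL)

    # CPack turns 'make package' into a platform-appropriate archive/installer.
    install(TARGETS myapp RUNTIME DESTINATION bin)
    include(CPack)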

I have never used autotools, but a lot of other users have commented that CMake is easier to use. The KDE project moved from autotools to CMake for this reason.

iain
+1. Many large open-source libs/apps use CMake, so it seems to be pretty useful. A heck of a lot better than autoconf/automake if you want to release on Windows too.
Marcus Lindblom