We will shortly be spinning up a new Linux-based development effort, and we are new-ish to large-scale Linux development. We have a well-defined process for handling third-party and internal libraries on Windows, but I am interested in Linux-specific variations and optimizations.
It seems logical (as on Windows) to:
- Not have everyone download and compile their own versions of libraries like Boost, but instead provide official builds checked into a repository somewhere.
- Further, check those library binaries into source control so that specific versions can be tied to our code; for example, version 1.2 of our code requires Boost 1.41, and a fresh checkout just works.
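To make the second point concrete, here is a sketch of the kind of versioned layout I have in mind for prebuilt binaries living alongside the code; the directory names and library versions are purely illustrative:

```shell
# Hypothetical repository layout: each third-party library gets a
# versioned directory with its headers and prebuilt binaries, so the
# source tree pins exactly which builds it depends on.
mkdir -p third_party/boost-1.41/include
mkdir -p third_party/boost-1.41/lib
mkdir -p third_party/zlib-1.2.3/include
mkdir -p third_party/zlib-1.2.3/lib
ls third_party
```

Upgrading a dependency would then be an ordinary commit (add `boost-1.42`, repoint the build, delete the old directory once nothing references it).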
Should we therefore serve apt-get packages from a local server hosting official versions of everything?
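For reference, the minimal version of this that I understand is possible is a flat internal apt repository: drop approved `.deb` files in a directory, index it with `dpkg-scanpackages` (from the `dpkg-dev` package), and point clients at it. The server name here is hypothetical:

```shell
# Sketch of a flat internal apt repository. Assumes dpkg-dev is
# installed on the build server; skips indexing gracefully if not.
mkdir -p /tmp/internal-repo && cd /tmp/internal-repo
# ...copy the approved .deb files here, then build the index:
if command -v dpkg-scanpackages >/dev/null 2>&1; then
    dpkg-scanpackages . /dev/null | gzip -9c > Packages.gz
else
    echo "dpkg-dev not installed; skipping index"
fi
# Clients would then add a line like this to /etc/apt/sources.list:
#   deb [trusted=yes] http://buildserver.example.com/internal-repo ./
```

Whether this beats checking binaries into source control presumably depends on how tightly we need library versions coupled to code versions.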
If so, do we let the libraries install to their "natural" location, such as /usr/local/include?
If not, do we just check the libraries into our normal repository locations, and embed paths (relative or otherwise) in our Makefile(s) so as not to depend on /usr/local/?
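The Makefile variant I am imagining looks roughly like the following; all names (`third_party`, `boost-1.41`, the `app` target) are hypothetical, and the `$ORIGIN` rpath is only needed if the checked-in libraries are shared rather than static:

```make
# Hypothetical Makefile fragment: point the toolchain at libraries
# checked into the repo under third_party/, instead of /usr/local.
THIRD_PARTY := $(CURDIR)/third_party
BOOST_ROOT  := $(THIRD_PARTY)/boost-1.41

CXXFLAGS += -I$(BOOST_ROOT)/include
# Link against the checked-in .so/.a files; the rpath makes the binary
# find shared libraries relative to its own location at run time.
LDFLAGS  += -L$(BOOST_ROOT)/lib -Wl,-rpath,'$$ORIGIN/../third_party/boost-1.41/lib'

app: main.o
	$(CXX) $^ $(LDFLAGS) -lboost_system -o $@
```

This keeps builds hermetic with respect to what happens to be installed on a given developer's machine, at the cost of maintaining the paths ourselves.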
What have other people done in this area? Pointers to books/sites would also be greatly appreciated, particularly for Linux-based team development (of applications, not the kernel).