views: 67
answers: 2

We will shortly be spinning up a new Linux-based development effort, and we are new-ish to Linux development on a large scale. We have a well-defined process for handling third-party and internal libraries on Windows, but I am interested in Linux-specific variations and optimizations.

It seems logical to (as on Windows)

  1. Not have everyone download and compile their own versions of libraries like Boost, but instead have official builds checked in to a repository somewhere.
  2. Further, have the binaries of those libraries checked in to source control so that specific versions can be tied to our code; for example, version 1.2 of our code needs Boost 1.41, and the whole thing just works.

Should we therefore publish apt-get packages of the official versions of things to a local server?
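(For illustration, I'm imagining something roughly along these lines; the server URL and package name are made up:)

    echo 'deb http://apt.internal.example/debian stable main' \
        | sudo tee /etc/apt/sources.list.d/internal.list
    sudo apt-get update
    sudo apt-get install libboost1.41-dev    # the "blessed" Boost build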

If so, do we allow the libraries to install to their "natural" place of /usr/local/include or wherever?

If not, do we just check libraries in to our normal repository locations, and then embed paths (relative or otherwise) in our Makefile(s) so as not to depend upon /usr/local/?
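(Again, a rough sketch of what I mean; the directory layout is invented, and the real flags would live in the Makefile rules:)

    # Third-party libraries checked in next to the code, referenced relative
    # to the repository root instead of /usr/local:
    REPO_ROOT=$(pwd)
    g++ -I"$REPO_ROOT/third_party/boost-1.41/include" \
        -L"$REPO_ROOT/third_party/boost-1.41/lib" \
        -o myapp src/main.cpp -lboost_system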

What have other people done in this area, please? Pointers to books/sites on Linux-based team development (of applications, not the kernel) would also be greatly appreciated.

+1  A: 

As far as I can see, use the normal locations for the libraries you use; those are what your users will have too!

But check in all the libraries, config scripts, IDEs, compilers, editors and code generators you use (.deb packages where available), so that when a new computer arrives, installing the whole dev environment just means getting things from the repo in one command!! It also means that going back to some old code does not require installing old libraries by hand :D (extremely useful for binary searching for the revision that introduced a bug).
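A rough sketch of that one-command setup (the repository URL, directory layout and script name are made up; adapt to whatever VCS you use):

    svn checkout http://svn.example.internal/dev-env dev-env   # everything committed beforehand
    sudo dpkg -i dev-env/debs/*.deb       # compilers, IDEs, libraries, code generators
    dev-env/scripts/setup-env.sh          # editor configs, environment variables, etc.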

Consider two ways of releasing your app: with and without the libraries attached. That way, even if someone cannot get the proper versions, everything still works fine (e.g. Skype, Tlen).

And add scripts to your build system that produce the installers for you automatically, just so you have the ability to make them with one command.

EDIT (reply to the first comment below):

I don't know what "in-house" means ?!? Do you work at home ??

If possible, use the locations that will be used on the clients' machines :)

Putting the whole environment in the repo also means that any change to it will be recorded, and if the need to go back ever arises, it will only mean typing one command.

Any change in the dev environment can introduce new bugs, so an easy way to track them down (binary search over revisions) without reinstalling the environment is a big plus. And if you know you will be using different versions of libraries over time, binary searching would otherwise require installing each of them manually :(

Oh, and one tip: check whether your Linux distro's packaging system provides an easy way of extracting the installed files from a package, just to automate the process of adding newly installed parts of the dev environment to the repo (e.g. one script that takes a package and puts the required files into the VC repo :) and that script should also be put into the VC repo :))
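On Debian-based systems such a script can be tiny, because dpkg-deb -x unpacks a package's file tree into a directory; the paths below are only an example:

    #!/bin/sh
    # Usage: add-package-to-repo.sh <package.deb> <destination-inside-checkout>
    set -e
    PKG="$1"      # e.g. downloads/g++-4.4_4.4.3-1_amd64.deb
    DEST="$2"     # e.g. tools/g++-4.4 inside the working copy
    mkdir -p "$DEST"
    dpkg-deb -x "$PKG" "$DEST"    # extract the package contents into DEST
    echo "Unpacked $PKG into $DEST, review and commit."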

If the libraries' locations are spread out and hard to maintain, put them all in one folder and simply update that folder every time it changes.

przemo_li
"normal placeholders" ? Do you mean their natural location like /usr/local/include or whatever ? How are you handling parallel versions then: for the build machines anyways ?So you recommend to check-in eclipse/gcc/etc.? Or their packages ?We are developing for in-house use only BTW.
sdg
Hi, thanks for the help so far. "In-house" means that we are the only people using the software; it is not for "release" to a wider audience.
sdg
"You" as in you personally, or others from your company? If it's you, that is good. If others, maybe try to get one or two of them assigned to your team :) (the shorter the time between asking the client a question and getting a response, the faster the development goes). Read a bit about XP if interested :), or about Agile development if XP sounds too weird. And don't thank, just click vote up :D !!
przemo_li
A: 

For libraries generally available from Linux distributions, I'd recommend getting them from there. For a start, I'd just build a meta-package depending on all the libraries needed for development. For Debian-based distros this can easily be done with equivs, and you can also specify the needed versions there. If you need versions your distribution of choice does not provide, you can still host them in your own apt repository. It should be obvious that it's a good idea to keep the sources and build scripts for all custom-built packages in a VCS.
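A minimal sketch of such a meta-package built with equivs (the package names and versions are only examples; equivs-control can generate a template control file for you):

    cat > dev-deps.ctl <<'EOF'
    Section: misc
    Priority: optional
    Standards-Version: 3.9.2
    Package: ourteam-dev-deps
    Version: 1.0
    Depends: g++, make, libboost-dev (>= 1.41)
    Description: Meta-package pulling in our build dependencies
    EOF
    equivs-build dev-deps.ctl    # produces ourteam-dev-deps_1.0_all.deb

Installing the resulting package on every developer machine then pulls in (and pins) the whole toolchain through the normal package manager.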

As for internally developed libraries, and libraries you felt the need to modify in-house, it kind of depends on the situation. I would not just stick everything into your application's repository. Instead, use tools like jhbuild to automate checking out and building the needed libraries from their own repositories.
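The idea, roughly, is to describe each internal library in a jhbuild moduleset (where its repository lives, how it is built, what it depends on) and let jhbuild check out and build everything in dependency order; the file and module names here are made up:

    # build an internal library plus all the modules it depends on
    jhbuild -f build-config/jhbuildrc build our-internal-lib

    # rebuild just that one module after a change
    jhbuild -f build-config/jhbuildrc buildone our-internal-lib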

Building deb and/or rpm packages for those libraries might still be a good idea. "Continuous Integration" is the buzzword that springs to mind.
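For a deb, a continuous-integration job could stage the library's install tree and wrap it up roughly like this (package name, version and layout are illustrative, and it assumes the library's Makefile honours DESTDIR):

    make DESTDIR="$PWD/pkgroot" install      # stage the installed files
    mkdir -p pkgroot/DEBIAN
    cat > pkgroot/DEBIAN/control <<'EOF'
    Package: ourlib
    Version: 1.2.0
    Architecture: amd64
    Maintainer: Build Bot <buildbot@example.invalid>
    Description: Internally built library, matched to version 1.2 of the app
    EOF
    dpkg-deb --build pkgroot ourlib_1.2.0_amd64.deb
    # upload the resulting .deb to the internal apt repository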

If you need to modify open source libraries you use, I'd advise sending patches upstream where it makes sense. This might be additional work, but it is not only about playing nice with the OSS community; it might easily save you work and grief in the long run.

Hope that helps.

davrieb