views: 173

answers: 4

I write tools that are used in a shared workspace. Since there are multiple OSes in this space, we generally use Python and standardize the version installed across machines. However, if I wanted to write some things in C, I was wondering whether I could wrap the application in a Python script that detects the operating system and launches the correct build of the C application. Each platform has GCC available and uses the same shell.
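
Roughly, I picture the wrapper working something like this (the tool name and the shared layout are only placeholders):

    #!/usr/bin/env python
    # Hypothetical wrapper: pick the prebuilt native binary that matches this platform.
    # The tool name and the shared-bin layout are assumptions, not an existing convention.
    import os
    import platform
    import sys

    TOOL = "mytool"             # placeholder tool name
    SHARED_BIN = "/shared/bin"  # placeholder shared location

    def main():
        # Build a platform key such as "linux-x86_64" or "darwin-arm64".
        key = "%s-%s" % (platform.system().lower(), platform.machine().lower())
        binary = os.path.join(SHARED_BIN, "%s-%s" % (TOOL, key))
        if not os.path.exists(binary):
            sys.stderr.write("no %s binary built for %s\n" % (TOOL, key))
            sys.exit(1)
        # Replace this interpreter with the native tool, passing arguments through.
        os.execv(binary, [binary] + sys.argv[1:])

    if __name__ == "__main__":
        main()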

One idea was to have the C compiled into the user's local ~/bin, with a timestamp comparison against the C source so it is not recompiled on every run, only when the code is updated. Another was to compile it once for each platform and have the wrapper script select the proper executable.
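
For the first idea, a rough sketch of what I have in mind (again, the paths, tool name, and compiler flags are just placeholders):

    #!/usr/bin/env python
    # Hypothetical wrapper for the compile-on-demand idea: keep a per-user binary in
    # ~/bin and rebuild it only when the shared C source is newer than the binary.
    import os
    import subprocess
    import sys

    SOURCE = "/shared/src/mytool.c"              # placeholder shared source
    BINARY = os.path.expanduser("~/bin/mytool")  # placeholder per-user binary

    def ensure_built():
        # Recompile if the binary is missing or older than the C source.
        if not os.path.exists(BINARY) or os.path.getmtime(SOURCE) > os.path.getmtime(BINARY):
            bin_dir = os.path.dirname(BINARY)
            if not os.path.isdir(bin_dir):
                os.makedirs(bin_dir)
            subprocess.check_call(["gcc", "-O2", "-o", BINARY, SOURCE])

    def main():
        ensure_built()
        os.execv(BINARY, [BINARY] + sys.argv[1:])

    if __name__ == "__main__":
        main()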

Is there an accepted/stable process for this? Are there any catches? Are there alternatives (assuming the absolute need to use native C code)?

Clarification: Multiple OSes are involved that do not share an ABI, e.g. OS X, various Linuxes, BSD, etc. I need to be able to update the code in place in shared folders and have the new code working more or less instantaneously. Distributing binary or source packages is less than ideal.

A: 

You know, you should look at static linking.

These days, we all have HUGE hard drives, and a few extra megabytes (for carrying around libc and what not) is really not that big a deal anymore.

You could also try running your applications in chroot() jails and distributing those.

dicroce
A: 

Depending on your mix of OSes, you might be better off creating packages for each class of system.

Alternatively, if they all share the same ABI and hardware architecture, you could also compile static binaries.

Jordi Bunster
+1  A: 

Also, you could use autoconf and distribute your application in source form only. :)

dicroce
+1  A: 

Launching a Python interpreter instance just to select the right binary to run would be much heavier than you need. I'd distribute a shell .rc file which provides aliases.

In /shared/bin, you put the various binaries: /shared/bin/toolname-mac, /shared/bin/toolname-debian-x86, /shared/bin/toolname-netbsd-dreamcast, etc. Then, in the common shared shell .rc file, you put the logic to set the aliases according to platform, so that on OSX, it gets alias toolname=/shared/bin/toolname-mac, and so forth.

This won't work as well if you're adding new tools all the time, because the users will need to reload the aliases.

I wouldn't recommend distributing tools this way, though. Testing and qualifying new builds of the tools should take enough time and effort that the extra time required to distribute them to users is trivial by comparison. You seem to be optimizing to reduce distribution time. Replacing tools that quickly in a live environment is all too likely to result in lengthy and confusing downtime if anything goes wrong in writing and building the tools, especially when subtle cross-platform issues creep in.

tuxedo
The Python interpreter is not heavy, at least not in a *nix environment that is already using it extensively. Startup time is negligible, maybe 50 ms.
postfuturist