I am working on a closed-source application that should run on Windows, Mac OS X and the major Linux distributions (it may become open source at a later stage).

The program links against a few libraries: Boost, Lua, StormLib and zlib. From what I understand, there shouldn't be any licensing issues.
It's a C++ command-line application, if that matters.

I know there are different ways to do so:

1) One approach is to distribute the software as native packages. From what I understand, this requires compiling a specific variant for each distribution,
so supporting several distributions takes quite a lot of effort. This is similar to what the Wine developers do.

2) Another method I've seen is the one the new TeamSpeak 3 beta client uses.
It's a self-extracting .run package that contains the binary files directly and no source code. There are only two variants of the binaries: 32-bit and 64-bit.

So I'd like to know the simplest way to support as many Linux platforms as possible without having to compile ten different binaries for every release.
Targeted systems are at least:

  • Windows
  • Mac OS X
  • Debian / Ubuntu
  • Red Hat / OpenSUSE
  • CentOS

Thanks for help!

+2  A: 

CMake should make the build process trivial. You can't get away from supporting multiple package formats yourself; the only way around that is to have a popular application, and then others will do it for you.

Most of the Linux distributions you mentioned shouldn't have many quirks between them. The simplest lowest-common-denominator approach is to build a 32-bit and a 64-bit variant and place the application, with all of its libraries (even ones that could be shared), in /opt. That way your dependencies are self-contained. Unix people do not like this way of doing things, but it works.
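A minimal, runnable sketch of that self-contained /opt layout (all paths and names here are hypothetical, and a shell script stands in for the compiled binary): the bundled libraries live next to the binary, and a small launcher points the dynamic linker at them before starting the real program.

```shell
# Stand-in prefix; a real install would use something like /opt/myapp.
PREFIX=/tmp/opt-demo/myapp
mkdir -p "$PREFIX/bin" "$PREFIX/lib"

# The launcher users actually invoke.
cat > "$PREFIX/bin/myapp" <<'EOF'
#!/bin/sh
HERE="$(cd "$(dirname "$0")/.." && pwd)"
# Prefer the bundled libraries over whatever the distribution ships.
export LD_LIBRARY_PATH="$HERE/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
exec "$HERE/bin/myapp-bin" "$@"
EOF

# A stand-in for the real (compiled) binary, so the sketch runs as-is.
cat > "$PREFIX/bin/myapp-bin" <<'EOF'
#!/bin/sh
echo "libraries resolved from: $LD_LIBRARY_PATH"
EOF

chmod +x "$PREFIX/bin/myapp" "$PREFIX/bin/myapp-bin"
"$PREFIX/bin/myapp"
```

An alternative to the wrapper script is linking the real binary with an rpath of `$ORIGIN/../lib`, which removes the need to touch `LD_LIBRARY_PATH` at all.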

For Windows, CMake can generate a Visual Studio solution/project build hierarchy. You can write an installer using NSIS, or use Visual Studio's installer wizard, which covers most simple installs quite well.

Hassan Syed
Thanks! So a binary compiled on Debian will also work fine on Red Hat or CentOS (as long as all dependencies are shipped)?
Spoofy
Hassan Syed
"they can compile it themselves" -- the aim is not to provide source code ;-)
liori
Well, one could test for MMX and SSE at runtime and only use those codepaths when they are available. It definitely requires more work, but there is no reason to rule out SSE and MMX.
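On Linux the kernel exposes the CPU's capability flags in /proc/cpuinfo, so a launcher (or installer) can pick a codepath or binary variant at runtime; the codepath names below are illustrative only. Inside the program itself you would use the cpuid instruction or a compiler builtin instead.

```shell
# Select an SSE2 build if the CPU advertises the flag, otherwise fall
# back to a generic build. -w matches "sse2" as a whole word only.
if grep -qw sse2 /proc/cpuinfo; then
    echo "using sse2 codepath"
else
    echo "using generic codepath"
fi
```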
supercheetah
@liori Well I wanted to make the answer more general, it might be useful to others ;-)
Hassan Syed
@supercheetah True, and Intel's compilers already produce binaries that do this out of the box.
Hassan Syed
Well, I think I'm going to release a few different versions then: one universal binary with statically linked libraries (in 32-bit and 64-bit variants), plus a few native packages for chosen major distributions. Thanks for all your help!
Spoofy
+1  A: 

About compiling: Boost, Lua and zlib are small enough to be easily statically linked into your program, so you don't have to depend on whatever the user has installed on their system. I don't know StormLib, so I won't speak about it, but I presume you can handle it similarly. If not, you might run into a lot of trouble: distributions tend to modify base packages to fit their needs, and your code might happen to be incompatible with some specific modification.

This means your code could be compiled only once for Linux, not once per distribution. Be careful, though: you will probably want to use other libraries later, and you will have more problems with UI toolkits. Compiling GTK or Qt statically makes your program integrate less cleanly into the user's desktop, and at that point you will actually need to compile your program for different distributions.

On the topic of packaging: as a Linux user, I think the best way of distributing closed-source software is to let the distributions do it, especially if they put your program (a demo or some free version) in their repositories; this makes access to your program as easy as to any free app. For this to work you just need to legally allow people to redistribute modified versions of your program, which is what some companies already do. You might also consider packaging your code for some chosen distributions yourself and declaring those officially supported; this is better for marketing.

If you have both free and paid versions, make them work similarly enough to reuse the packaging scripts. That way, if some person X packages your free version for some distribution Y, you will be able to reuse his work (you'll need to resolve copyright issues first) or simply hire him to package the paid version too. Hobbyists are sometimes very skilled at what they do, especially when they are trying to get some piece of software working ;-) They often do a better job of packaging software than the company that makes it.

However, if the program you are creating is not free, you should package it yourself for every supported distribution, and also provide a distro-agnostic statically compiled version for those who are brave enough to use less popular distributions.

So:

  • Smaller dependencies can usually be compiled statically, and this is usually better for closed-source apps.
  • Choose some distributions and do the packaging by hand.
  • Allow hobbyists to do packages for less popular platforms.
liori
True, these libraries are not really big, but AFAIK StormLib will not allow static linking without a bunch of modifications (which I have absolutely no clue about!). A Qt GUI is planned for a later stage, but it will be a completely independent program that communicates over a UDP interface. So basically, I think there's no way around testing the different distributions in a virtual machine.
Spoofy
Static linking is the safest route, because then the Linux dynamic linker has no say in the matter. Dynamic linking is slightly less safe, but with a correct installation/packaging routine it is also possible.
liori