Tips on how to deploy C++ code to work everywhere

The product that I work on is not too different from this. We use an autoconf-based build system, and it works pretty well.

The place where you'll spend the most time, by far, is supporting users. User systems will have all sorts of wrinkles that you don't expect until they run into them, and you'll need to add more configure options to support them. Over time, we've added options to set the include and lib paths for every library we depend on; we've added options to change compile flags to work around various weird glitches in various versions of those libraries (or API changes from one version to another that need changes in our code); and we've added workarounds for the fact that some BLAS libraries use a C interface and some use a Fortran interface, so even though they're theoretically implementations of the same library, they do a few things slightly differently. You can't anticipate all of this in advance, and it all needs documenting so that users can figure out which options to set.
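To make that concrete, here is a hypothetical sketch of the kind of options involved, written as CMake cache options for brevity (with autoconf the equivalents would be AC_ARG_WITH switches in configure.ac); the names FOO_*, BLAS_HAS_C_INTERFACE, and the MYAPP_* defines are all made up for illustration:

    # User-overridable paths for one dependency ("Foo"), plus a switch for
    # the two BLAS calling conventions described above.
    set(FOO_INCLUDE_DIR "" CACHE PATH     "Directory containing Foo's headers")
    set(FOO_LIBRARY     "" CACHE FILEPATH "Full path to the Foo library")

    option(BLAS_HAS_C_INTERFACE
           "ON if your BLAS exposes cblas_dgemm() instead of Fortran-style dgemm_()"
           OFF)
    if(BLAS_HAS_C_INTERFACE)
      add_compile_definitions(MYAPP_USE_CBLAS)        # code calls cblas_dgemm(...)
    else()
      add_compile_definitions(MYAPP_USE_FORTRAN_BLAS) # code calls dgemm_(...)
    endif()

Multiply that by every dependency and every known glitch, and you get a sense of how the option list grows.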

Oh, and installers are a real pain, because they generally are OS-dependent (unless it's just a shell script and you require Cygwin), the locations to install to tend to be OS-dependent, and so forth. That's another area that will take up time -- either in building a good installer, or in supporting users in setting things up manually.

Setting up cross-compilation is, in my experience, well worth the trouble (at least for the Linux-to-Windows case; I'm not sure about Mac OS X) -- much easier than trying to keep multiple different build systems in sync.
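For the Linux-to-Windows case, most of the setup fits in one small file. A minimal sketch of a CMake toolchain file, assuming the mingw-w64 cross toolchain is installed under its usual /usr/x86_64-w64-mingw32 prefix (with autoconf, the rough equivalent is ./configure --host=x86_64-w64-mingw32):

    # mingw-w64-toolchain.cmake: describe the Windows target to CMake
    set(CMAKE_SYSTEM_NAME Windows)
    set(CMAKE_C_COMPILER   x86_64-w64-mingw32-gcc)
    set(CMAKE_CXX_COMPILER x86_64-w64-mingw32-g++)
    set(CMAKE_RC_COMPILER  x86_64-w64-mingw32-windres)

    # Search the cross root for headers and libraries, but keep running
    # build-time helper programs from the host.
    set(CMAKE_FIND_ROOT_PATH /usr/x86_64-w64-mingw32)
    set(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)
    set(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)
    set(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)

You then select it with cmake -DCMAKE_TOOLCHAIN_FILE=mingw-w64-toolchain.cmake and build as usual.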

As an alternate perspective, there's the option that the OpenFOAM project uses for their rather large C++ library, which is to distribute it along with an "approved" G++ compiler and packages for all the other components, so that they don't have to worry about different compilers and so forth. But that really only works on one OS. I guess the Windows/Mac OS X version of that is to provide pre-configured VMware images. In some cases, there's something to be said for that...


I would recommend CMake. Advantages:

  • It is very easy to use for building simple and complex projects with static libraries, dynamic libraries, executables, and their dependencies.
  • It is platform-independent and generates makefiles and/or IDE project files for most compilers and IDEs.
  • It abstracts the differences between Windows and Unix: e.g. "libShared.so" and "Shared.dll" are both referred to as "Shared" (CMake handles the naming differences for each platform). If Shared is part of your project, CMake sorts out the dependency; if not, it assumes the library is on the linker path.
  • It investigates the user's system for the compiler and any third-party libraries that are required; you can then optionally disable components when a third-party library is not available, or display an error message. (It ships with find modules for the most common third-party libraries.)
  • It can be run from the command line or from a simple GUI that lets the user change any of the parameters discovered above (e.g. the compiler, or which version of a third-party library to use).
  • It supports macros for automating common steps.
  • There is a component called CPack that enables you to create an installer; I think it's just a "make package" command-line thing (I have not used it).
  • The CTest component integrates with other unit-testing libraries like Boost.Test or Google Test. (See the sketch after this list for several of these points in action.)
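To pull several of these points together, here is a hedged sketch of a complete CMakeLists.txt; the project layout, the MYAPP_WITH_PNG option, and the MYAPP_HAVE_PNG define are hypothetical, but every command used is standard CMake:

    cmake_minimum_required(VERSION 3.10)
    project(MyApp CXX)

    # A shared library: built as libcore.so on Unix and core.dll on Windows,
    # but always referred to as plain "core".
    add_library(core SHARED src/core.cpp)
    target_include_directories(core PUBLIC include)

    add_executable(myapp src/main.cpp)
    target_link_libraries(myapp PRIVATE core)

    # Probe the system for an optional third-party library, and let the
    # user switch the component off if it isn't available.
    option(MYAPP_WITH_PNG "Build the PNG export component" ON)
    if(MYAPP_WITH_PNG)
      find_package(PNG REQUIRED)               # find module shipped with CMake
      target_link_libraries(core PRIVATE PNG::PNG)
      target_compile_definitions(core PRIVATE MYAPP_HAVE_PNG)
    endif()

    # CTest: register a unit-test executable so `ctest` can run it.
    enable_testing()
    add_executable(core_tests tests/core_tests.cpp)
    target_link_libraries(core_tests PRIVATE core)
    add_test(NAME core_tests COMMAND core_tests)

    # CPack: describe what gets installed, then build installers with
    # `cpack` (or the `package` target).
    install(TARGETS myapp core
            RUNTIME DESTINATION bin
            LIBRARY DESTINATION lib
            ARCHIVE DESTINATION lib)
    set(CPACK_PACKAGE_NAME "MyApp")
    set(CPACK_PACKAGE_VERSION "1.0.0")
    include(CPack)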

I use CMake for everything now, even simple test projects with Visual Studio.
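For what it's worth, the starting point really is tiny; a complete CMakeLists.txt for a scratch project (names hypothetical) is just:

    cmake_minimum_required(VERSION 3.10)
    project(scratch CXX)
    add_executable(scratch main.cpp)

Running cmake with the appropriate -G generator then produces a Visual Studio solution on Windows or a makefile on Unix from the same three lines.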

I have never used autotools, but a lot of other users have commented that CMake is easier to use. The KDE project moved from autotools to CMake for this reason.