Questing for the guide

I was playing some Oblivion while on the phone with a friend when something came to mind, related to my recent idea of an autotools guide. The idea came from mixing Oblivion with something that Jürgen was saying this evening.

In the game you can acquire the most important magical items in four ways: you can find them around (rarely), you can build them yourself (by hunting creatures’ souls), you can pay for them with gold, or you can get them during quests. The last are usually the most powerful, but that’s not always true. At any rate, the “gold” option is rarely used because gold is a somewhat scarce resource. You might start to wonder what this has to do with the autotools guide that I made public yesterday, but you might also have already seen where I’m going.

Since I’m the first one to know that money, especially lately, is a scarce resource, and since I’m the kind of person who’s glad to put in an effort with a market value three or four times whatever money I could afford to repay a favour, it seemed reasonable to provide a way of “payment” through technical skill and effort instead.

So here is my alternative proposal: if you can get me a piece of code that I failed to find and don’t have time to write, release it under a FOSS licence (GPLv2+ is strongly suggested; compatibility with the GPL is very important anyway), and maintain it until it’s almost “perfect”, I’ll exchange that for a comparable effort in extending the guide.

I’ll post these “quests” from time to time on the blog so you can see them and decide whether you think you can complete them; I’ll have to find a way to index them, though. For now it’s just a proposal, so I don’t think I need to do that right away. But I can drop two ideas if somebody has time and is willing to work on them; both of them relate to IMAP and e-mail messages, so you’ve been warned. I’m also quite picky when it comes to requirements.

The first is what Jürgen was looking at earlier: I need a way to delete the old messages from some GMail label every day. The idea is that I’d like to use GMail for my mailing list needs (so I have my messages always with me and so on), but since keeping the whole archive is both pointless (there are gmane, Google Groups, and the relative archives) and expensive (in terms of space used in the GMail IMAP account and of bandwidth needed to sync “All Mail” via UMTS), I’d like to always keep just the last three weeks of e-mail messages.

What I need, though, is something slightly more elaborate than just deleting the old messages. It has to be a script that I can run locally as a cron job, connecting to the IMAP server. It has to delete the messages completely from GMail, which means dropping them in the Trash folder (just deleting them is not enough, that only removes the label) and emptying that too. It also has to be configurable, on a per-label basis, for how long to keep the messages (I would empty the label with the release notifications every week rather than every three weeks), and hopefully be able to keep unread messages longer, and consider flagged messages as protected.

I don’t care much about implementation language, but I’d frown upon “exotic” things like OCaml, Smalltalk and similar, since they would require me to install their environment. Perl, Python and Ruby are all fine, and so is Java, since the thing would run just once a day and it’s not much of a slowdown to start the JVM for that. No X connection though.
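
To give an idea of the scale of the task, here is a minimal Python sketch of the kind of script described above; the label names, retention periods and the “[Gmail]/Trash” folder name are all placeholders that would need checking against the actual account, and the unread-messages rule is left out:

```python
import datetime
import imaplib

# Hypothetical per-label retention policy, in days.
RETENTION = {"lists/gentoo-dev": 21, "notifications": 7}

def cutoff(keep_days, today=None):
    """Return the IMAP-format date before which messages are expired."""
    today = today or datetime.date.today()
    return (today - datetime.timedelta(days=keep_days)).strftime("%d-%b-%Y")

def purge(server, user, password):
    """Move expired messages to the Trash, then empty it: on GMail,
    deleting from a label only removes the label, so both steps matter."""
    conn = imaplib.IMAP4_SSL(server)
    conn.login(user, password)
    for label, keep_days in RETENTION.items():
        conn.select('"%s"' % label)
        # UNFLAGGED keeps flagged messages protected, as required.
        typ, data = conn.search(None, "UNFLAGGED", "BEFORE", cutoff(keep_days))
        for num in data[0].split():
            conn.copy(num, '"[Gmail]/Trash"')
            conn.store(num, "+FLAGS", r"\Deleted")
        conn.expunge()
    # Emptying the Trash is what actually frees the space.
    conn.select('"[Gmail]/Trash"')
    typ, data = conn.search(None, "ALL")
    for num in data[0].split():
        conn.store(num, "+FLAGS", r"\Deleted")
    conn.expunge()
    conn.logout()
```

The cron job would then just call purge() with the account credentials; keeping unread messages longer could be done with an extra SEEN search and a second, longer cutoff.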

The second is slightly simpler and could be coupled with the one before: I send my database backups from the server to my GMail address, encrypted with GPG, compressed with bzip2, and then split in message-sized chunks. I need a way to download all the messages and reassemble the backups, once a week, and store them on a flash card, using tar directly on it as if it were a tape (no need for a filesystem should reduce the erase count). The e-mail messages have the number of the chunk, the series of the backup (typo or bugzilla) and the date of the backup all encoded in the subject. More points if it can do something like Apple’s Time Machine: keep a backup each day for a week, each week for a month (or two), and then one backup a month for up to two years.
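
The reassembly logic, again as a Python sketch; the subject format here is made up, since the real one only needs to match whatever the server-side script encodes:

```python
import re

# Hypothetical subject line format: "backup typo 2009-08-30 chunk 1 of 4".
SUBJECT_RE = re.compile(
    r"^backup (?P<series>\w+) (?P<date>\d{4}-\d{2}-\d{2}) "
    r"chunk (?P<num>\d+) of (?P<total>\d+)$")

def reassemble(messages):
    """Group (subject, payload) pairs by backup series and date, order
    the chunks by number, and return a {(series, date): data} mapping."""
    chunks = {}
    for subject, payload in messages:
        m = SUBJECT_RE.match(subject)
        if m is None:
            continue  # not one of our backup messages
        key = (m.group("series"), m.group("date"))
        chunks.setdefault(key, {})[int(m.group("num"))] = payload
    return {key: b"".join(parts[n] for n in sorted(parts))
            for key, parts in chunks.items()}
```

The reassembled data would then be fed to tar writing straight onto the flash card device; the Time Machine-style thinning would be a separate pass over the dates in the resulting mapping.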

So if somebody has the skill to complete these tasks and would be interested in seeing the guide expanded, well, just go for it!

Autotools Mythbuster! A guide!

I’ve been thinking about this for a while, and now I think it’s time to implement it and make it public: I’d like to write some complete and clean documentation about autotools: autoconf, automake, libtool and all the others. Something that shows more practical information about autotools than the references shipping with them, and a way to collect the good information out of my blog.

Since this kind of task is quite complex and time-consuming, though, I just can’t afford to get to it in my spare time as it is. I have little spare time, and what I have I’d rather not spend entirely on free software-related tasks, or my health would likely get bad again. I already devote a lot of my spare time to Gentoo, and at least a bit of it has to stay for myself. But, since I have been asked about this many times, I decided to take a stab at it.

Although I certainly would have loved to see it become a book, especially since that would have helped me pay for bills, hardware and everything else related to my work on Free Software and not, I’m afraid that is unlikely to ever happen. O’Reilly’s guidelines are explicit that only native English speakers are welcome to submit proposals. Alas, I’m not a native speaker (and if you’re one of the many still wondering whether I’m Spanish or whatever else, I’m Italian; yes, I know the name Diego sounds Spanish).

So my idea is to work on it on a donation basis: if people are interested in getting it written down, I’ll work on it; otherwise I’ll just add some stuff now and then and keep writing the interesting bits on my blog. My idea is to collect donations, and starting from €50 dedicate a given amount of time per week to writing (Edit: I didn’t make it clear before; the €50 needs not come from a single person at a single time, it’s just the point at which I start to write, and donations stack up over different people and times). The documentation would still be available to everybody for free, under a Creative Commons BY-NC-SA licence (but still open to relicensing for other uses if you ask me — I know there was a site that had a way to declare that explicitly, but I forgot the name; I remember it having to do with pineapples though).

But since I already have a lot of sparse documentation about autotools on this blog, to the point that I often use it as a reference for patches I submit around and bugs I open, why would you care if I were to turn it into comprehensive documentation? Well, as it is, the documentation is very sparse, which means that you have to search around the blog to find it. Even though I do embed a Google Custom Search widget on the pages, it’s not really easy to find what you need most of the time.

Also, the blog posts suffer from their nature: I don’t go around editing them; if I have made mistakes I usually correct them by posting something new. I also tend to write opinionated entries, so you can have me writing snarky remarks about KDE and CMake in a post that is supposed to provide information on porting to autoconf 2.64, without feeling bad at all: it’s my blog after all. But this also means that it’s not “professional” to link to such an entry as a reference article. At the same time I don’t think this is material for articles, because those suffer from the problem of being mostly “set in stone”, while autotools are not set in stone, and new useful tricks can be added easily.

I’m basically not asking anybody to pay me to tell you guys new useful tricks for autotools, or how to avoid mistakes, or how to fix bugs. I’m going to continue doing that; I’m going to continue posting on the blog. What I’m actually asking to be paid for is, well, the editing into a form that can be easily searched and referenced. I’m also still going to answer enquiries and help requests, so don’t worry about that, and I’m going to keep pouring the same amount of effort into what I do every day.

So where is the beef? For now it’s really just one section ported from the post I already linked above, together with an explanation of how the thing is (hopefully) going to work, and you can find it on my site. I’m going to add at least two more sections this week, compatibly with the time I have; for anything more, feel free to chip in.

And before you tell me: yes, I know that it’s a bit evil to also add Google AdSense to the page. On the other hand, if you use the AdBlock Plus extension for Firefox, or anything similar, you won’t notice it at all, since the container is set to disappear in that case. Please don’t think I make much money with that, but every bit helps, considering that out of the 14 hours a day I spend in front of the computer, in the past month I probably had an average of 11 spent on Gentoo and other Free Software work, not paid for, if not for the few guys who thought of me (thank you! once again!).

Post Scriptum: the sources aren’t available yet since I have to check one thing out first; they’ll go online later this week anyway, as there are some useful pieces of reusable DocBook trickery in there. I already fixed a bug in app-text/docbook-xsl-ns-stylesheets while preparing it. Basically, they’ll get released together with a decent style (thanks Gilles for it).

Autotools Mythbuster! Something you might not know about ./configure options

There are a lot of reasons to use autotools over custom buildsystems; most of them relate to the fact that autotools contain reusable generic code that provides users with common ways of doing the same thing across different projects. Almost all of these features are available in one form or another in the majority of buildsystems, but sometimes they are poorly documented, if at all, and they differ from project to project, which makes it very difficult to deal with them in a common, abstract way.

One of these features is options to enable and disable features and optional dependencies, to avoid automagic dependencies, which are a problem for advanced users and from-source distributors like us. While BSD makefiles have their knobs, and most software provides access through make variables or preprocessor macros to enable or disable features and other things, the configure script generated by autoconf provides both a common interface to those options and documentation for them.

There are, though, a few interesting notes about this very useful common interface, because a lot of projects either misuse it, or don’t know how deep the autoconf boilerplate is and reinvent parts of it for no good reason. Let me try to show some of the interesting bits about it.

The first thing to know about the two AC_ARG_ENABLE and AC_ARG_WITH macros is that their arguments are: name, description, action if given, action if not given. The common mistake here is to consider the last two arguments as action if enabled and action if disabled; I’ve written about that mistake already, a few years ago. This is not the case, and thus checking whether the option is enabled or disabled has to be done outside of the option declaration for completeness.

Another interesting note is that if the third parameter is omitted, autoconf will by default generate an $enable_name or $with_name variable with the content of the option specified at configure time (defaulting to yes and no for the positive and negative options when an explicit parameter is not passed). It is thus possible to get a default-enabled feature variable using code like this:

AC_ARG_ENABLE([feature], [...], , [enable_feature=yes])

AS_IF([test "$enable_feature" = "yes"], [...])

This is very handy since it avoids having to create custom variables and check for them repeatedly (again, this is already written in my automagic guide).
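
Putting the two notes together, a complete check for an optional dependency might look like the following sketch; the foo library and its pkg-config module are of course placeholders:

```m4
dnl The actual library check lives outside the declaration: the
dnl action-if-given argument runs for --without-foo just as well as
dnl for --with-foo, so testing in there would be wrong.
AC_ARG_WITH([foo],
  AS_HELP_STRING([--without-foo], [Disable support for the foo library]),
  , [with_foo=yes])

AS_IF([test "$with_foo" = "yes"],
  [PKG_CHECK_MODULES([FOO], [foo >= 1.0])])
```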

In the example above I explicitly skipped writing the documentation of the option itself, since that is another point where a lot of projects get confused. If you look at the output of ./configure --help, the default options are all well aligned and make use of as much space as is available in the terminal window you’re running it in. On the other hand, some projects’ custom options are instead badly aligned, tightened down on a side of the screen, split among multiple lines, or going over the horizontal boundary of the terminal. This is because the upstream developers tried to fake the same alignment as autoconf, without knowing that what it does is adapt to the actual output of the system.

So for instance you get stuff like this, coming from the gnumeric configure script:

                          Turn on compiler warnings
  --enable-iso-c          Try to warn if code is not ISO C
--disable-ssconvert     Do not build ssconvert (command line spreadsheet conversion tool)
--disable-ssindex       Do not build ssindex (spreadsheet indexer for beagle)
--disable-ssgrep        Do not build ssgrep (search for supplied strings in spreadsheet)
--disable-solver  Don't compile the solver
--enable-plugins="text html"  Compile only the listed plugins
  --enable-pdfdocs        Generate documentation in Portable Document Format

As you can see, there are multiple lines that are pushed entirely to the right and run over the edge, while some others try to align themselves and keep some space to their left too. The reason can be found in the configure.in file:

AC_ARG_ENABLE(ssconvert,
  [--disable-ssconvert          Do not build ssconvert (command line spreadsheet conversion tool)],
  [], [enable_ssconvert=yes])
AM_CONDITIONAL(ENABLE_SSCONVERT, test x"$enable_ssconvert" = xyes)

While this time the third and fourth arguments are correct, there is something up with the second, the description of the option: it is expanded verbatim in the configure file, spaces included. Since it would be silly to waste space and readability that way, autoconf already provides an easy way to deal with the problem, which is the AS_HELP_STRING macro (formerly AC_HELP_STRING):

AC_ARG_ENABLE(ssconvert,
  AS_HELP_STRING([--disable-ssconvert], [Do not build ssconvert (command line spreadsheet conversion tool)]),
  [], [enable_ssconvert=yes])
AM_CONDITIONAL(ENABLE_SSCONVERT, test x"$enable_ssconvert" = xyes)

which then produces:

                          Turn on compiler warnings
  --enable-iso-c          Try to warn if code is not ISO C
  --disable-ssconvert     Do not build ssconvert (command line spreadsheet
                          conversion tool)
  --disable-ssindex       Do not build ssindex (spreadsheet indexer for
                          beagle)
  --disable-ssgrep        Do not build ssgrep (search for supplied strings in
                          spreadsheet)
  --disable-solver        Don't compile the solver
  --enable-plugins="text html"
                          Compile only the listed plugins
  --enable-pdfdocs        Generate documentation in Portable Document Format

It looks nicer, doesn’t it?

Hopefully, reminding people about this will allow projects to clean up their configure.ac (or configure.in, for those still using the old naming convention), or their users to submit patches, so that the output is decently formatted and usable, even by automatic systems like zsh’s tab completion of ./configure options.

P.S.: since I’ve already changed the gnumeric script, I’m going to submit it upstream now, so no need to go fix that.

The canonical target

I’ve already written about the common mistake of using AC_CANONICAL_TARGET in software that is not intended to be used as a compiler. Since I’m now using my tinderbox extensively, I thought it might be a good idea to check how many packages that shouldn’t call it actually do.

The test is really a quick and dirty bashrc snippet:

post_src_unpack() {
    find "${S}" -name configure.ac -o -name configure.in | while read acfile; do
        acdir=$(dirname "$acfile")
        pushd "$acdir"
        autoconf --trace AC_CANONICAL_TARGET 2>/dev/null > "${T}"/flameeyes-canonical-target.log
        if [[ -s "${T}"/flameeyes-canonical-target.log ]]; then
            ewarn "Flameeyes notice: AC_CANONICAL_TARGET used"
            cat "${T}"/flameeyes-canonical-target.log
        fi
        popd
    done
}
This provides me with enough information to inspect the software, and eventually provide patches to correct the mistake. As I said, this is a very common mistake, and I’ve fixed quite a few packages for it. Not only does it waste time identifying the target system, but it also provides a totally misleading --target option to the configure script that confuses users and automatic systems alike; if we were to write software to generate ebuilds out of a source tarball with some basic common options, it would probably be confused a lot by the presence of such an option.

Since the whole build, host and target situation is often confusing, I’d like to try explaining it with some images; I think that might be a good way to show users and developers alike how the three machine definitions interact with each other. Since this is going to be a long post, in terms of size rather than content, because of the images, the extended explanation won’t be present in the feed.


To try explaining this in a very visual way, let’s say we have only three systems: a PowerBook laptop, a standard x86 computer, and a build service using x86-64 servers. The choice of the PowerBook as the smallest device was driven by the fact that it was the only decent image I could find on OpenClipart for a system that would easily be seen as having a different architecture than the other two. I would have liked an ARM board, but that was wishing for too much.

The first obvious case is having a native compiler, no cross-compiling involved at all:

In this case, gcc’s configure script, the powerpc-linux-gnu-gcc compiler and the hellow program are all executed on the same system: the laptop. This is the standard case you have on a Gentoo system when building stuff out of most ebuilds. In this case host, build and target machines are all one and the same: powerpc-linux-gnu.

Then there is a very common case for embedded developers, cross-compilers:

In this case gcc’s configure (and thus its build) is executed on a PC system, which will also run the powerpc-linux-gnu-gcc compiler, but the hellow program is still executed on the laptop. Host and build machines are i686-pc-linux-gnu while target is powerpc-linux-gnu.

The next scenario is uncommon for standalone developers but somewhat common with binary distributions for smaller systems:

In this case there is a build service that runs the build of the compiler, which will then be executed on the laptop directly. In this case the build machine is x86_64-pc-linux-gnu while both host and target are powerpc-linux-gnu.

The final scenario involves all three systems at once and shows exactly the difference between the three machine definitions:

In this case the build service prepares a cross-compiler executed on a PC that will build software to run on the laptop. The build machine is x86_64-pc-linux-gnu, the host machine is i686-pc-linux-gnu and the target is powerpc-linux-gnu.

Now, this works pretty well for compilers, but what about other software? In most cases you have just two systems involved at most: one that will run the software and one that will build it. So there is no need for a target definition; everything is handled between build and host. And this is why you should not be calling AC_CANONICAL_TARGET unless you can figure out a far-fetched scenario involving three computers with three different architectures, like in the last scenario.

Autotools Mythbuster: the channel

Although I started writing some entries trying to uncover the most common mistakes with autotools, I’m afraid it’s going to be difficult for me to cover all the mistakes and to actually make the entries useful. But since I’d rather spend my time helping people with autotools than fighting with crazy buildsystems whenever I need a new piece of software that is not in Portage, I figured out that there is one thing I can do to improve the situation: accept requests!

As I wrote before, I already take care of fixing some common autotools mistakes during my checklist routine, so it’s no worse to accept requests to look after some particular software’s buildsystem, to help autotoolizing it, or to fix the autotools, parallel make failures and stuff like that. For this reason I decided to open a channel on OFTC: #autotools-mythbuster, where you can drop in and ask questions, or request a buildsystem review. If you allow me, I’ll also write about the common problems found in that buildsystem if I think it might be useful for others to know.

If you don’t want to drop by the channel, you can also write to me and I’ll try to answer and check out the buildsystem. I’m tempted to open a mailing list on Google Groups or something, so that eventual comments might be useful to others too, but I’m not sure how much time this will take, so I don’t want to take on a bigger responsibility than I can actually handle.

I’m doing this as a volunteer, for free, to show the way, so that I don’t have to deal with more custom build systems that just don’t work (the only non-autotools build system that I find quite good to deal with is FFmpeg’s, and even that I had to tweak more than a couple of times, and it still does not work 100% properly). As such I don’t guarantee results, and I might not be able to look at your code right away (to begin with, when this entry appears on Planet I’ll probably be busy and/or sleeping, since I’m writing this from my bed on the laptop). I’ll be happy to help you, and if I can get your buildsystem up to standard, that’ll be enough of a result for me (I’m happy to receive gifts though).

I’ll wait for you on #autotools-mythbuster @ OFTC then!

Autotools Mythbuster! You don’t want to canonicalise *that* (and I’m not referring to Ubuntu)

I have to reassure all my readers who actually care about Ubuntu (I mostly don’t): I’m not going to write about Canonical Ltd or anything directly related to Ubuntu, or Debian for that matter. This is a blog post that tries to document some often misused macros in autoconf, hoping to be of general help to developers of free software projects, and not limited to them.

For one reason or another a lot of people are used to the triplet used for machine definitions, stuff like i686-pc-linux-gnu or x86_64-pc-solaris2.11 or powerpc-apple-darwin8.6. Yes, the same triplet that is used to define the CHOST value in make.conf in Gentoo, the same triplet you see in GCC, the same triplet that is passed every time to ./configure. This triplet, in autotools, is used in three machine definitions: host, build and target.

All software built with autotools has host and build: the former defines the system where the software will have to run at the end of the build process, the latter states which system is going to be used for building the software. When host and build differ, you’re cross-compiling. Compilers, and all the build tools that are specific to a particular machine definition, also need a target definition.

Since there is more than one way to define the same machine, autotools have two files, together called gnuconfig, that can reduce the triplets to some common values: config.sub and config.guess. These two files don’t enter the picture unless you ask for them, though, since they can be superfluous: if your configure never checks the $host and $build variables, because it identifies all that is needed for its build via compile and run tests, it’s pointless to do the reduction, and pointless checks have to be avoided.

So this brings us to the two macros I referred to in the introduction of this blog post: AC_CANONICAL_HOST and AC_CANONICAL_TARGET. These two macros are used to reduce the definitions of host, build and target to their common form, which can be more easily tested for. The reason why there are two macros is, if you think about it, obvious: not all software needs to have a target. Indeed, if you compare the output of ./configure --help between a package that expects a target and one that does not, you’ll see that in the latter the whole --target support is gone.

In general, if your configure file has no test on $host or $build, you should not use AC_CANONICAL_HOST; if your software is not a compiler and doesn’t need a “target” system, you don’t need AC_CANONICAL_TARGET. An easy test for the latter: would it make any sense for your software to be called mingw32-software when used under Linux to build for Windows? If not, then you don’t need a target.
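
For reference, a legitimate use sketched in a configure.ac fragment: the AC_CANONICAL_HOST call is justified only because $host is branched on right afterwards (the SHLIB_EXT variable is my own example, not a standard one):

```m4
AC_CANONICAL_HOST

dnl Branching on $host is the only reason to canonicalise it.
case $host in
  *-*-mingw*|*-*-cygwin*)
    SHLIB_EXT=.dll
    ;;
  *-*-darwin*)
    SHLIB_EXT=.dylib
    ;;
  *)
    SHLIB_EXT=.so
    ;;
esac
AC_SUBST([SHLIB_EXT])
```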

It is true that some macros do check those variables, and thus will call the canonicalisation macros themselves; luckily for us, the way autoconf works, they’ll be expanded just once, and the problem is solved there: each macro is independent and requires the macros it’ll use.

Unfortunately, not all macros are good; you might remember SDL, some time ago, enabling AC_CANONICAL_TARGET by default, causing quite a bit of a stir since it injected --target support into a huge amount of software that really didn’t need it at all. Similarly, just looking at my /usr/share/aclocal, I see what I’d consider a mistake in klibc’s macro file: it requires AC_CANONICAL_HOST and uses $build and $host to test for cross-compiling, instead of relying on AC_CHECK_TOOL, because it might accept just “klcc”. What is the problem? The problem is that they assume that klcc might not be the right cross-compiler, and they try to be smarter than autotools.

It’s quite possible that there is no prefixed name for a cross-compiling tool, because there are toolchains built in such a way that calling cc or ld runs the cross-compiler rather than the host compiler; there is also the possibility that, instead of using the canonicalised host name, the tool is using the host triplet the user passed. And on the whole it runs more checks than it should.

Unfortunately libtool still requires AC_CANONICAL_HOST to be tested, but at least you can identify packages that have overly extended checks through AC_CANONICAL_TARGET by issuing the command autoconf --trace=AC_CANONICAL_TARGET; it’ll tell you where in the configure.ac (or configure.in) file the macro is being called, and from there you’ll hopefully be able to see how to get rid of it.

Best practices with autotools

This article was originally published on

Update: This article is obsoleted by Autotools Mythbuster that goes above and beyond all of this.

The core of GNU’s compile chain – the set of tools used to build GNU software packages – is the so-called “autotools,” a term that refers to the autoconf and automake programs, as well as libtoolize, autoheader, pkg-config, and sometimes gettext. These tools let you compile GNU software on a wide variety of platforms and Unix and Unix-like operating systems, providing developers a framework to check for the presence of the libraries, functions, and tools that they want to use. While autotools are great in the hands of an experienced developer, they can be quite a handful for the first-time user, and it’s not so rare that packages are shipped with working-but-broken autotools support. This article will cover some of the most common errors people make when using autotools and ways to achieve better results.

Regardless of anyone’s opinion about them, we currently have no valid alternative for autotools. Projects such as Scons are not as portable as autotools, and they don’t embody enough knowledge to be useful yet. We have tons of automatic checks with autotools, and a lot of libraries come with an m4 library with macros to check for their presence.

The basic structure of an autotooled project is simple. Autoconf uses a configure.ac file (formerly named configure.in), written in the m4 language, to create a configure script with the help of an aclocal.m4 file (created by aclocal, using the m4 libraries on its search path and an acinclude.m4 file). For every directory there’s a Makefile.am file, used by automake to create the Makefile.in templates, which are processed and transformed into the real makefiles by the configure script. You can also avoid using automake and just write your own Makefile.in files, but this is quite complex, and you lose a few features of autotools.

In a configure.ac file you can use macros you define yourself, the default ones provided by autoconf and aclocal, or external macros provided, for instance, by other packages. In such a case aclocal will create the aclocal.m4 file by adding the library files it finds in the system’s m4 library that contain the used macros; this is a critical step in having a working autotooled project, as we’ll see in a moment.

A Makefile.am is mainly a declaration of intents: you fill some target variables with the names of the targets you want built. These variables are structured in a format like placetoinstall_TYPEOFTARGET. The place is the location in a hierarchical Unix filesystem (bin, lib, include, …), a non-used keyword that can be defined with an arbitrary path (using the keyworddir variable), or the special keyword noinst that marks the targets that need not be installed (for example private headers, or static libraries used during build). After naming the target, you can use the name (replacing dots with underscores) as the prefix for the variables that affect its build. In this way you can provide special CFLAGS, LDFLAGS, and LDADD variables used during the build of a single target, instead of changing them for all the targets. You can also use variables collected during the configure phase, if you passed them to the AC_SUBST macro in configure.ac, so that they are replaced inside makefiles. Also, though defining CFLAGS and LDFLAGS on a per-target basis seems useful, adding static flags in Makefile.am is a bad thing for portability, as you can’t tell whether the compiler you’re using supports them, or whether you really need them (-ldl put in LDFLAGS is a good example of a flag needed on Linux but not on FreeBSD); in such cases you should use configure.ac checks to add these flags.
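
To make the naming scheme concrete, here is a small hypothetical Makefile.am; the FOO_CFLAGS and FOO_LIBS variables are assumed to come from AC_SUBST calls in configure.ac:

```automake
bin_PROGRAMS = frobnicator
noinst_HEADERS = frob-private.h
noinst_LTLIBRARIES = libfrob.la

libfrob_la_SOURCES = frob.c

frobnicator_SOURCES = main.c
# Per-target variables: the target name, with dots replaced by
# underscores, becomes the prefix.
frobnicator_CFLAGS = $(FOO_CFLAGS)
frobnicator_LDADD = libfrob.la $(FOO_LIBS)
```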

The most commonly used macros in configure.ac are AC_CHECK_HEADERS, AC_CHECK_FUNCS, and AC_CHECK_LIB, which test for the presence of, respectively, some header files, some library functions, and a given library (with a specific function in it). They are important for portability, as they provide a way to check which headers are present and which are not (for example system headers that have different locations in different operating systems), to check whether a function is present in the system library (asprintf() is missing in OpenBSD, for example, while it’s present in the GNU C library and on FreeBSD), and finally to check for the presence of some third-party library, or to see whether linking to a specific library is needed to get some functions (for example the dlopen() function is in the libdl library on GNU systems, while it’s provided by the system’s C library on FreeBSD).
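
The examples from the paragraph above, spelled out as configure.ac calls:

```m4
dnl Defines HAVE_SYS_SYSINFO_H in config.h if the header is found.
AC_CHECK_HEADERS([sys/sysinfo.h])
dnl Defines HAVE_ASPRINTF if the function is in the C library.
AC_CHECK_FUNCS([asprintf])
dnl Adds -ldl to LIBS and defines HAVE_LIBDL if dlopen() is found in
dnl libdl; on FreeBSD, where dlopen() lives in libc, this particular
dnl check simply fails and nothing is added.
AC_CHECK_LIB([dl], [dlopen])
```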

Along with testing for the presence or absence of functions or headers (and sometimes libraries), you usually need to change the code path (for example to avoid the use of missing functions, or to define a drop-in replacement for them). Autoconf is commonly coupled with another tool, autoheader, which creates a config.h.in template, used by the configure script to create a config.h header in which a few preprocessor macros are defined, in the form HAVE_givenfunction or HAVE_givenheader_H, which can be tested with #ifdef/#ifndef directives inside a C or C++ source file to change the code according to the features present.

Here are some practices to keep in mind to help you use autotools to create the most portable code possible.

The config.h header file should be considered an internal header file, so it should be used just by the single package in which it’s created. You should avoid editing the config.h.in template to add your own code there, as this requires you to manually update it according to the configure.ac you’re writing.

Unfortunately a few projects, such as Net-SNMP, export this header file with the other library headers, which requires any project that uses their libraries to include it (or to provide its own copy of the internal Net-SNMP structures). This is a bad thing, as the autotools structure of a library project should be invisible to software using it (which might not use autotools at all). Also, changes in autotools behavior are anything but rare, so you can have two identical checks with different results due to changes in the way they are executed. If you need to define your own wrappers or replacements in case something is not in the environment you’re compiling for, you should do that in private headers that do not get installed (declared as noinst_HEADERS in Makefile.am files).

Always provide the m4 files you used. As autotools have been in use for years, many packages (for example libraries) that can be reused by other programs provide an m4 library file in /usr/share/aclocal that makes it possible to check for their presence (for example using the -config scripts) with a simple macro call. These files are used by aclocal to create the aclocal.m4 file, and they usually are present on the developers’ systems where aclocal is executed to create the release, but when they are for optional dependencies, they can be missing on users’ systems. While this is usually not a problem, because users rarely execute aclocal, it’s a problem for source distributions, such as Gentoo, where sometimes you need to patch a Makefile.am or the configure.ac and then re-run autoconf without having all the optional dependencies installed (or having different versions, which can be incompatible or buggy, of the same m4 file).

To avoid this problem, you should create an m4 subdirectory in your package’s directory and then put there the m4 library files you are using. You must then call aclocal with the -I m4 option so it searches that directory before the system library. You can then choose whether to put that directory under revision control (CVS, SVN, or whatever else you are using) or just create it for the releases. The latter case is the bare minimum requirement for a package. It minimizes the amount of revision-controlled code and ensures that you’re always using the latest m4 version, but has the drawback that anyone who checks out your repository won’t be able to execute autoconf without digging the m4 files out of a release tarball (and that might not work, as you may have updated the configure.ac to suit a newer macro, or added more dependencies). On the other hand, putting the m4 directory under revision control sometimes tempts developers to change the macros to suit their needs. Although this seems logical, as the m4 files are under your revision control, it will upset many package maintainers, as sometimes new versions of m4 files fix bugs or support newer options and installation paths (for example multilib setups), and having the m4 files modified makes it impossible to just replace them with updated versions. It also means that when you update an m4 file you must redo your modifications against the new original.
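Modern autotools let you declare this setup explicitly; a sketch of the standard wiring is:

```m4
dnl In configure.ac: tell autoconf where the bundled macros live.
AC_CONFIG_MACRO_DIR([m4])

dnl In the top-level Makefile.am, so regenerated builds pass the
dnl right flag to aclocal automatically:
dnl   ACLOCAL_AMFLAGS = -I m4
```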

m4 files are always a problem to work with. They must replicate almost the same code from library to library (depending on the way each needs to provide CFLAGS/LDFLAGS: with tests or with a -config script). To avoid this problem, the GNOME and FreeDesktop projects developed a tool called pkg-config, which provides both an executable binary and an m4 file to include in configure.ac files, and lets developers check for the presence of a given library (and/or package), provided that the package itself installed a pkg-config .pc data file. This approach simplifies the work of maintaining configure.ac scripts, and requires a lot less time during execution of the configure script, as it uses the information provided by the installed package itself instead of testing whether it’s present. On the other hand, this approach means that an error the developers make concerning a dependency can break the user’s program, as they just hardcode the compiler and linker flags in the data file and the configure script doesn’t actually check whether the library works. Fortunately, this doesn’t happen too often.

To check for a package with pkg-config from configure.ac, you need the PKG_CHECK_MODULES macro, contained in the pkg.m4 library. You should add that file to your m4 directory. Even if the pkg-config dependency is mandatory (the tool is run by the configure script), you can’t be sure that the m4 file on users’ systems is the same as the one you used, nor can you be sure that it does not include extra bugs, as it can be older than yours.
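A sketch of the macro’s use (the module name and version here are made up for illustration):

```m4
dnl Check for a library that ships a .pc file; on failure, stop
dnl configure with a clear error message.
PKG_CHECK_MODULES([FOO], [foo >= 1.2],
                  [],
                  [AC_MSG_ERROR([foo 1.2 or later is required])])
dnl On success the macro provides FOO_CFLAGS and FOO_LIBS, ready
dnl to be used in Makefile.am.
```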

Always check for the libraries you’re going to link to, even if you have them as mandatory dependencies. Usually autoconf macros or pkg-config data files define the prerequisite libraries that you need to successfully link to a library. Also, some functions that live in extra libraries on some systems (like dlopen() in libdl on Linux and Mac OS X) can be in the libc of another system (the same function is in libc on FreeBSD). In these cases you need to check whether the function can be found without linking to anything extra, or whether you need to use a specific library (for example to avoid linking to a non-existent libdl, which would fail where it’s not needed).
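Autoconf’s AC_SEARCH_LIBS macro handles exactly this case; a sketch:

```m4
dnl Try dlopen() with no extra library first, then with -ldl; on
dnl FreeBSD the function is found in libc and no -ldl is added.
AC_SEARCH_LIBS([dlopen], [dl],
               [],
               [AC_MSG_ERROR([dlopen() not found])])
```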

Be careful with GNU extensions. One of the things that makes portability a big pain is the use of extension functions, which are provided by GNU libc but aren’t present in other C libraries like BSD’s or uClibc. When you use such functions, you should always provide a “drop-in replacement,” a function that provides the same functionality as the library function, maybe with less performance or security, which can be used when the extension function is not present in the system’s C library. Those replacements must be protected by an #ifndef HAVE_function … #endif block, so that they don’t get compiled when the real function is already present. Make sure that these functions are not exported by the library to external users; they should be declared inside an internal header, to avoid breaking other libraries that may be doing similar tricks.

Avoid compiling OS-specific code when not needed. When a program optionally supports specific libraries or specific operating systems, it’s not rare to have entire source files that are specific to that code path. To avoid compiling them when they’re not needed, use the AM_CONDITIONAL macro inside a configure.ac file. This automake macro (usable only if you’re using automake to build the project) allows you to define if .. endif blocks inside a Makefile.am file, inside which you can set special variables. You can, for example, add a platformsrcs variable that you set to the right source file for the platform to build for, then use it in a _SOURCES variable.
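The Makefile.am side of this might look as follows (the conditional and file names are hypothetical):

```make
# Pick the platform source file through a "commodity" variable
# defined inside the conditional and used outside of it.
if PLATFORM_LINUX
platformsrcs = probe_linux.c
else
platformsrcs = probe_generic.c
endif

frobnicator_SOURCES = main.c $(platformsrcs)
```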

However, there are two common errors developers make when using AM_CONDITIONAL. The first is calling AM_CONDITIONAL in an already conditional branch (for example under an if or inside a case switch), which leads to automake complaining about a conditional defined only conditionally (AM_CONDITIONAL must be called at the global scope, outside every if block, so you must define a variable holding the status of the condition and then test it when calling AM_CONDITIONAL). The other is that you can’t change the targets’ variables directly inside a conditional; you must define “commodity” variables, set inside the conditional and used outside it, to add or remove source files and targets.
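A configure.ac sketch of the correct pattern (assuming AC_CANONICAL_HOST was called earlier so $host is populated; the conditional name is hypothetical):

```m4
dnl Record the result inside the branch...
platform_linux=no
case $host in
  *-*-linux*) platform_linux=yes ;;
esac
dnl ...but call AM_CONDITIONAL at the global scope.
AM_CONDITIONAL([PLATFORM_LINUX], [test "x$platform_linux" = xyes])
```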

Many projects, to avoid compiling specific code paths, wrap entire files in #ifdef ... #endif preprocessor conditionals. While this usually works, it makes the code ugly and error-prone, as a single statement left outside the conditional block will be compiled even where the file’s code is not needed. It also sometimes misleads users, as the source files appear to be compiled in situations where they don’t make sense.

Be smart in looking for the operating system or hardware platform. Sometimes you need to test for a specific operating system or hardware platform. The right way to do this depends on where you need the information. If you need it to enable extra tests in configure, or to add extra targets in makefiles, you must do the check in configure.ac. On the other hand, if the difference matters in a source file, for example to enable an optional asm-coded function, you should rely directly on the compiler/preprocessor, using #ifdef directives with the macros predefined on the target platform (for example __linux__, __i386__, _ARCH_PPC, __sparc__, __FreeBSD__, and __APPLE__).

Don’t run commands in configure.ac. If you need to check for the hardware or operating system in configure.ac, you should avoid using the uname command, despite it being one of the most common ways to do such a test. This is actually an error, as it breaks crosscompilation. Autotools support crosscompiling projects from one machine to another through host definitions: strings in the form “hardware-vendor-os” (actually “hardware-vendor-os-libc” when GNU libc is used), such as i686-pc-linux-gnu and x86_64-unknown-freebsd5.4. CHOST is the host definition for the system you’re compiling the software for, and CBUILD is the host definition for the system you’re compiling on; when CHOST and CBUILD differ, you’re crosscompiling.

In the examples above, the first host definition shows an x86-like system, with a pentium2-equivalent (or later) processor, running a Linux kernel with a GNU libc (usually this refers to a GNU/Linux system). The second refers to an AMD64 system with a FreeBSD 5.4 operating system. (For a GNU/kFreeBSD system, which uses FreeBSD’s kernel and GNU libc, the host definition is hw-unknown-freebsd-gnu, while for a Gentoo/FreeBSD, using FreeBSD’s kernel and libc, but with the Gentoo framework, the host definition is hw-gentoo-freebsd5.4.) By using the $host and $build variables inside a configure.ac script you can enable or disable specific features based on the operating system or hardware platform you’re compiling for or on.
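A configure.ac sketch of matching on the host definition (the feature macros here are hypothetical; AC_CANONICAL_HOST must appear earlier so $host is set):

```m4
case $host in
  x86_64-*-linux*)
    AC_DEFINE([USE_AMD64_ASM], [1], [Use amd64 assembly routines])
    ;;
  *-*-freebsd*)
    AC_DEFINE([USE_FREEBSD_SENDFILE], [1], [Use FreeBSD sendfile(2)])
    ;;
esac
```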

Don’t abuse “automagic” dependencies. One of the most useful features of autotools is the automatic check for the presence of a library, which is often used to automatically enable support for extra dependencies and such. However, abusing this feature makes the build of a package a bit of a problem. While it’s quite useful for first-time users, and although most projects with complex dependencies (such as multimedia programs like xine and VLC) use a plugin-based framework that lets them avoid most of the breakages, “automagic” dependencies are a great pain for packagers, especially those working on source-based distributions such as Gentoo and ports-like frameworks. When you build something with automagic dependencies you enable the features supported by the libraries found on the system where the configure script is run. This means that the output binaries might not work on a system that shares the same base packages but misses one extra library, for example. Also, you can’t tell the exact dependencies of a package, as some might be optional and not be built in when the libraries are not present.

To avoid this, autoconf allows you to add --enable/--disable and --with/--without options to configure scripts. With such options you can forcefully enable or disable a specific feature (such as support for an extra library), while leaving the default behavior to the automatic tests.

Unfortunately, many developers misunderstand the meaning of the two parameters of the functions used to add those options (AC_ARG_ENABLE and AC_ARG_WITH). They represent the code to execute when a parameter is passed and when one is not. Many developers mistakenly think that the two parameters define the code to execute when the feature is enabled and when it is disabled. While this usually works when you pass a parameter just to change the default behavior, many source-based distributions pass parameters also to confirm the default behavior, which leads to errors (features explicitly requested end up missing). Being able to disable optional features even when they don’t add dependencies (think of OSS audio support on Linux) is always a good thing for users, who can avoid building extra code if they don’t plan to use it, and prevents maintainers from doing dirty caching tricks to enable or disable features as their users request.
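A sketch of the correct pattern (the feature name is hypothetical): record what the user asked for first, then act on it, so that explicitly confirming the default works too:

```m4
dnl The third argument runs when a flag IS passed (whether
dnl --enable-foo or --disable-foo), the fourth when it is NOT.
AC_ARG_ENABLE([foo],
  [AS_HELP_STRING([--enable-foo], [enable foo support (default: auto)])],
  [enable_foo=$enableval],
  [enable_foo=auto])

AS_IF([test "x$enable_foo" != xno],
  [dnl run the actual checks for foo here, failing hard only when
   dnl the user explicitly requested the feature])
```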

While autotools have been a big problem for both developers and maintainers, because different incompatible versions do not get along well together (since they install in the same places, with the same names) and are used in different combinations, their use still saves maintainers from all sorts of dirty tricks to compile software. If you look at the ebuilds in Gentoo’s portage, the few that do not use autotools are the more complex ones, as they need to account for very different setups (NPTL support may or may not be present; the system can be Linux, FreeBSD, or Mac OS X; the libc can be GNU libc or another one; and so on), while autotools usually take care of that on their own. It’s also true that many patches applied by maintainers exist to fix broken autotools scripts in upstream sources, but this is a small problem compared to the chaos of special build systems that stop working at the slightest environment change.

Autotools can be quite tricky for newcomers, but when you start using them on a daily basis you find they’re a lot easier than dealing with manual makefiles or other strange build tools such as imake or qmake, or, even worse, special autotools-like build scripts that try to recognize the system they are building on. Autotools make it simple to support new OSes and new hardware platforms, and save maintainers and porters from having to learn how to custom-build a system just to fix compilation. By carefully writing the configure.ac script, developers can support new platforms without any changes at all.