Autotools Mythbuster: so why do we have three projects?

As much as I’ve become an expert on the topic, there is one question I still have no idea how to answer, and that is why on earth we have three separate projects (autoconf, automake, libtool) instead of a single Autotools project. Things get even more interesting when you think that there is the Autoconf Archive – which, by the way, references Autotools Mythbuster as best practices – and then projects such as dolt that are developed by completely separate organisations.

I do think that this is quite a big drawback of autotools compared to things like CMake: you now have to allow for combinations of different tools written in different languages (autoconf is almost entirely shell and M4, automake uses lots of Perl, libtool is shell as well), each with its own deprecation timeline, and with different distributions providing different sets of them.

My guess is that many problems lie in the different sets of developers for each project. I know for instance that Stefano, at least, was planning a separate Automake-NG implementation that did not rely on Perl at all, but used GNU make features instead, including make macros. I generally like this idea because, similarly to dolt, it removes overhead for the most common case (any Linux distribution will use GNU make by default), while not removing the option where it is indeed needed (any BSD system). On the other hand it adds one more dimension to the already multi-dimensional compatibility problem.

Having a single “autotools” package, while making things a bit more complicated on the organizational level, could make a few things fit better. For instance if you accepted Perl as a dependency of the package – since automake needs it; but remember this is not a dependency for the projects using autotools! – you could simplify the libtoolize script which is currently written in shell.

And it would probably be interesting if you could just declare in your configure.ac file whether you want a fully portable build system, or you’re okay with telling people that they need a more modern system, and drop some of the checks/compatibility quirks straight at make dist time. I’m sure that util-linux does not care about building dynamic libraries on Windows, and that PulseAudio does not really care for building on non-GNU make implementations.

Of course these musings are only personal and there is nothing to substantiate them regarding how things would turn out; I have not done any experiments with actually merging the packages into a single releasable unit, but I do have some experience with split-but-not-really software, and in this case I can’t see many advantages in the split of autotools, at least from the point of view of the average project that is using the full set of them. There are certainly reasons why people would prefer them to stay split, especially if they have been using only autoconf and snubbing automake all this time, but… I’m not sure I agree with those reasons to begin with.

Autotools Mythbuster: what’s new in Automake 1.14

So the new version of Automake is out, and is most likely going to be the last release of the first major version. The next version of Automake is going to be 2.0, not to be confused with Automake NG which is a parallel project, still maintained by Stefano, but with a slightly different target.

After the various issues in the 1.13 series, Stefano decided to take a much more conservative approach for both 1.14 and the upcoming 2.0. While a bunch of features are getting deprecated with these two versions, they will not be dropped at least until version 3.0, I suppose. This means that there should be plenty of time for developers to update their autotools files before they start failing. Users of -Werror for Automake will of course still see issues, but I’ve already written about that so I’m not going back over the topic.

There is no big deal with the new release, by the way, as its theme seems to be mostly “get things straight”. For instance, the C compilation handling has been streamlined, in anticipation of further streamlining in Automake 2.0. In particular, the next major release will get rid of the subdir-objects option… by force-enabling it, which also means that the connected, optional AM_PROG_CC_C_O is now bolted onto the basic AC_PROG_CC. What does this mean? Mostly that there is one fewer line to add to your configure.ac when you use subdir-objects — and if you don’t use subdir-objects today, you should. It also means that the compile script is now needed by all automake projects.
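To make this concrete, here’s a minimal sketch (the project name and version are made up) of a configure.ac that is ready for this change – note that there is no AM_PROG_CC_C_O next to the compiler check anymore:

AC_INIT([myproject], [123])
AM_INIT_AUTOMAKE([foreign subdir-objects])

AC_PROG_CC

AC_OUTPUT([Makefile])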

The only new feature that I think is worth the release is better support for including files within Makefile.am — this allows the creation of almost independent “module” files, so that your build rules still live with the source files, but the final result is non-recursive. The changes make Karel’s way much more practical, to the point that I’ve actually started writing documentation for it in Autotools Mythbuster.

# src/Makefile.inc

bin_PROGRAMS += myprog
man_MANS += %D%/myprog.8
myprog_SOURCES = %D%/myprog.c \
                    %D%/myprog-utils.c

The idea is that instead of hardcoding exactly which subdirectory contains the sources, you can simply use %D% (or %reldir%) and then you can move said directory around. It makes it possible to properly handle a bundled-but-opt-out-capable library so that you don’t have to fight too much with the build system. I think that’ll actually be the next post in the Autotools Mythbuster series: how to create a library project with a clear bundling path and, at the same time, the ability to use the system copy of the library itself.
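For completeness, this is roughly what the top-level Makefile.am would look like to pull in the module file above – a sketch, assuming the fragment lives in src/ and that you’re using the subdir-objects option discussed earlier; the += assignments in the fragment require the variables to be initialized here first:

bin_PROGRAMS =
man_MANS =

include src/Makefile.inc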

Anyway, let’s all thank Stefano for a likely uneventful automake release. Autotools Mythbuster is being updated; for now you can find up-to-date forward porting notes, and before coming back from vacation I’ll most likely update a few more sections.

Autotools Mythbuster: who’s afraid of libtool?

This is a follow-up to my last post introducing autotools. I’m trying to keep these posts bite-sized, both because it seems to work nicely and because this way I can avoid leaving the posts rotting in the drafts pile.

So after creating a simple autotools build system in the previous post, you might want to know how to build a library — this is where the first part of the complexity kicks in. The complexity is not, though, in using libtool, but in making a proper library. So the question is “do you really want to use libtool?”

Let’s start from a fundamental rule: if you’re not going to install a library, you don’t want to use libtool. Some projects that only ever deal with programs still use libtool because that way they can rely on .la files for static linking. My suggestion is (very simply) to avoid relying on them as much as you can. Doing it this way means that you no longer have to care about using libtool for non-library-providing projects.

But if you are building said library, using libtool is important. Even if the library is internal only, trying to build it without libtool is just going to be a big headache for the packager that looks into your project (trust me, I’ve seen such projects). Before entering the details of how you use libtool, though, let’s look into something else: what you need to make sure you think about, in your library.

First of all, make sure to have a unique prefix for your public symbols, be they constants, variables or functions. You might also want to have one for symbols that you use within your library across different translation units — my convention in this example is going to be that symbols starting with foo_ are public, while symbols starting with foo__ are private to the library. You’ll soon see why this is important.

Reducing the number of symbols that you expose is not only a good performance consideration; it also means that you avoid the off-chance of symbol collisions, which are a big problem to debug. So do pay attention.

There is another thing that you should consider when building a shared library, and that’s the way the library’s ABI is versioned, but it’s a topic that, in and by itself, takes more time to discuss than I want to spend in this post. I’ll leave that up to my full guide.
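Just as a teaser, the versioning itself boils down to a single libtool flag in the Makefile.am – the three numbers here are made up for the example, and what they actually mean is exactly the part that deserves the longer discussion:

libfoo_la_LDFLAGS = -version-info 3:0:1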

Once you have these details sorted out, you should start by slightly changing the configure.ac file from the previous post so that it initializes libtool as well:

AC_INIT([myproject], [123], [flameeyes@flameeyes.eu], [https://flameeyes.blog/tag/autotools-mythbuster/])
AM_INIT_AUTOMAKE([foreign no-dist-gzip dist-xz])
LT_INIT

AC_PROG_CC

AC_OUTPUT([Makefile])

Now it is possible to provide a few options to LT_INIT, for instance to disable the generation of static archives by default. My personal recommendation is not to touch those options in most cases. Packagers will disable static linking when it makes sense, and if the user does not know much about static and dynamic linking, they are better off getting everything by default on a manual install.
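For example, if you really did want to change the default – against my recommendation – disabling the static archives is a single keyword:

LT_INIT([disable-static])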

On the Makefile.am side, the changes are very simple. Libraries built with libtool belong to a different class from programs and static archives, so you declare them as lib_LTLIBRARIES with a .la extension (at build time this is unavoidable). The only real difference between _LTLIBRARIES and _PROGRAMS is that the former gets its additional links from _LIBADD rather than _LDADD like the latter.

bin_PROGRAMS = fooutil1 fooutil2 fooutil3
lib_LTLIBRARIES = libfoo.la

libfoo_la_SOURCES = lib/foo1.c lib/foo2.c lib/foo3.c
libfoo_la_LIBADD = -lz
libfoo_la_LDFLAGS = -export-symbols-regex '^foo_[^_]'

fooutil1_LDADD = libfoo.la
fooutil2_LDADD = libfoo.la
fooutil3_LDADD = libfoo.la -ldl

pkginclude_HEADERS = lib/foo1.h lib/foo2.h lib/foo3.h

The _HEADERS variable is used to define which header files to install and where. In this case, it goes into ${prefix}/include/${PACKAGE}, as I declared it a pkginclude install.

The use of -export-symbols-regex – further documented in the guide – ensures that only the symbols that we want to have publicly available are exported, and does so in an easy way.

This is about it for now — one thing that I didn’t mention in the previous post, but which I’ll expand on in the next iteration or the one after, is that the only command you need to regenerate the autotools files is autoreconf -fis, and that still applies after introducing libtool support.
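So, as a reminder, the whole regenerate-and-build cycle still looks like this (the prefix is just an example):

autoreconf -fis
./configure --prefix=/usr/local
make
make install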

Autotools Mythbuster: who’s afraid of autotools?

I’ve been asked over on Twitter if I had any recommendation for an easy one-stop-shop tutorial for Autotools newbies… the answer was no, but I will try to make up for it by writing this post.

First of all, the name autotools covers quite a few different tools. If you have a very simple program (not hellow-simple, but still simple), you definitely want to use at the very least two of them: autoconf and automake. While you could use the former without the latter, you really don’t want to. This means that you need two files: configure.ac and Makefile.am.

The first of the two files (configure.ac) is processed to produce a configure script which the user will be executing at build time. It is also the bane of most people because, if you look at one for a complex project, you’ll see lots of content (and logic) and next to no comments on what things do. Lots of it is cargo-culting and I’m afraid I cannot help but just show you a possible basic configure.ac file:

AC_INIT([myproject], [123], [flameeyes@flameeyes.eu], [https://flameeyes.blog/tag/autotools-mythbuster/])
AM_INIT_AUTOMAKE([foreign no-dist-gzip dist-xz])

AC_PROG_CC

AC_OUTPUT([Makefile])

Let me explain. The first two lines are used to initialize autoconf and automake respectively. The former is being told the name and version of the project, the place to report bugs, and a URL for the package to use in documentation. The latter is told that we’re not a GNU project (seriously, this is important — you wouldn’t believe how many tarballs I find with 0-sized files just because they are mandatory in the default GNU layout; I even found at least one crazy package lately that wanted to have a 0-sized NEWS file), and that we want a .tar.xz tarball and not a .tar.gz one (which is the default).

After initializing the tools, you need to, at the very least, ask for a C compiler. You could have asked for a C++ compiler as well, but I’ll leave that as an exercise to the reader. Finally, you have to tell it to output Makefile (configure will generate it from Makefile.in, which automake in turn creates from the Makefile.am we’re about to write).

To build a program, you need then to create a Makefile.am similar to this:

bin_PROGRAMS = hellow

dist_doc_DATA = README

Here we’re telling automake that we have a program called hellow (whose sources are, by default, hellow.c) which has to be installed in the binary directory, and a README file that has to be distributed in the tarball and installed as a documentation piece. Yes, this is really enough as a very basic Makefile.am.

If you were to have two programs, hellow and hellou, and a convenience library between the two you could do it this way:

bin_PROGRAMS = hellow hellou

hellow_SOURCES = src/hellow.c
hellow_LDADD = libhello.a

hellou_SOURCES = src/hellou.c
hellou_LDADD = libhello.a

noinst_LIBRARIES = libhello.a
libhello_a_SOURCES = lib/libhello.c lib/libhello.h

dist_doc_DATA = README

But then you’d have to add AC_PROG_RANLIB to the configure.ac calls. My suggestion is that if you want to link things statically and it’s just one or two files, just go for building them twice… it can actually make the build faster (one less serialization step), and with the new LTO options it should very well improve the optimization as well.
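In other words, the configure.ac from the previous example would just grow one line:

AC_INIT([myproject], [123], [flameeyes@flameeyes.eu], [https://flameeyes.blog/tag/autotools-mythbuster/])
AM_INIT_AUTOMAKE([foreign no-dist-gzip dist-xz])

AC_PROG_CC
AC_PROG_RANLIB

AC_OUTPUT([Makefile])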

As you can see, this is really easy when you start from the basics… I’ll keep writing a few more posts with easy solutions, and probably next week I’ll integrate all of this in Autotools Mythbuster and update the ebook with an “easy how to” as an appendix.

Autotools Mythbuster: automagically disabled dependencies

One question that I’ve been asked before, and to which I didn’t really have a good answer until now, is: should configure scripts fail when a dependency is enabled explicitly but can’t be found? This is the automagic dependency problem, but on the other branch.

With proper automatic dependencies, if the user does not explicitly request whether to enable something or not, it’s customary that the dependency is checked for and, if found, the feature it’s connected to is enabled. When the user has no way to opt out of it (which is bad), we call it an automagic dependency. But what happens if the user has requested it and the feature is not available?

Unfortunately, there is no standard for this, and I myself have used both the “fail if asked for and not found” and “warn if asked for and not found” approaches. But the recent trouble between ncurses and freetype made me think that it’s important to actually make the point that there is a correct way to deal with this.

Indeed what happens is that right now, I have no way to tell you all that the tinderbox has found every single failure caused by sys-libs/ncurses[tinfo] even after the whole build completed: it might well be that a particular package, unable to link to ncurses, decided to disable it altogether. The same goes for freetype. Checking for all of that would be nice, but I have honestly no way to do it.

So to make sure that the user really gets what they want, please always verify that you’re proceeding the way the user requested. This makes sure that even in packaging, there won’t be any difference when a dependency is updated or changed. In particular, with pkg-config, the kind of setup you should have is the following:

AC_ARG_WITH([foobar],
  AS_HELP_STRING([--with-foobar], [Enable the features needing foobar library]))

AS_IF([test "x$with_foobar" != "xno"], [
  PKG_CHECK_MODULES([FOOBAR], [foobar >= 123], [
    AC_DEFINE([HAVE_FOOBAR], [1], [Found foobar library])
  ], [
    AS_IF([test "x$with_foobar" = "xyes"], [
      AC_MSG_ERROR([$FOOBAR_PKG_ERRORS])
    ])
  ])
])
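If the Makefile.am needs to know the outcome as well – say, to build an optional plugin – a sketch of a companion conditional, relying on the fact that PKG_CHECK_MODULES sets FOOBAR_LIBS on success, could look like this:

AM_CONDITIONAL([HAVE_FOOBAR], [test "x$FOOBAR_LIBS" != "x"])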

I’ll be discussing this and proposing this solution in the next update to Autotools Mythbuster (which is due anytime soon, including the usual eBook update for Kindle users). This would hopefully make sure that in the future, most configure scripts will follow this approach.

The odyssey of making an eBook

Please note, if you’re reading this post on Gentoo Universe, that this blog is syndicated in its full English content, including posts like this one, which covers, at this point, the status of a project that I have to call commercial. So don’t complain that you read this on the “official Gentoo website”, as Universe is quite far from being an official website. I could understand the complaint if it was posted on Planet Gentoo.

I mused last week about the possibility of publishing Autotools Mythbuster as an eBook — after posting the article I decided to look into which options I had for self-publishing, and, long story short, I ended up putting it for sale on Amazon and on Lulu (which nowadays handles eBooks as well). I’ve actually sent it to Kobo and Google Play as well, but they haven’t finished publishing it yet; Lulu is also taking care of iBooks and Barnes & Noble.

So let’s first get the question out of the way: the pricing of the eBook has been set to $4.99 (or equivalent) on all stores; some stores apply extra taxes (Google Play would apply 23% VAT in most European countries; books are usually at 4% VAT here in Italy, but eBooks are not!), and I’ve been told already that at least from the Netherlands and the Czech Republic, the Kindle edition almost doubles in price — that is suboptimal for both me and you all, as when that happens, my share is reduced from 70% to 35% (after expenses, of course).

Much more interesting than this, though, is the technical aspect of publishing the guide as an eBook. The DocBook stylesheets I’ve been using (app-text/docbook-xsl-ns-stylesheets) provide two ways to build an ePub file: one is a pure XSLT path that bases itself on the XHTML5 output and only creates the files (leaving it to the user to zip them up); the other is a one-call-everything-done Ruby script. The two options produce quite different files, in ePub 3 and ePub 2 format respectively. It’s possible to produce an ePub 3 book that is compatible with older readers, as an interesting post from O’Reilly delineates, but doing so with the standard DocBook chain is not really possible, which is a bummer.

In the end, while my original build was with ePub 3 (which was fine for both Amazon and Google Play), I had to re-build it again for Lulu, which requires ePub 2 — it might be worth noting that Lulu says this is because their partners, the iBookstore and the Nook store, would refuse the invalid file, as they check it with epubcheck version 1… but as O’Reilly says, iBooks is one of the best implementations of ePub 3, so it’s mostly an artificial limitation, most likely caused by their toolchain or BN’s. So I think from the next update forward I’ll stick with ePub 2 for a little while more.

On the other hand, getting these two to work also got me a working upgrade path to XHTML 5, which had failed for me last time. The method I had been using to mark exactly which chapters and sections to break onto their own pages in the output was manual explicit chunking through the chunk.toc file — this is not available for XHTML5, but it turns out there is a nicer method: just including the processing instructions in the main DocBook files, which works with both the old XHTML1 and the new XHTML5 output, as well as ePub 2 and ePub 3. While the version of the stylesheet that last generated the website is not using XHTML5 yet, it will soon do so, as I’m working on a few more changes (among which the overdue Credits section).

One of the things that I had to be more careful with, for ePub 2, was the “dangling links” to sections I planned but haven’t written yet. There are a few in both the website and the Kindle editions, but they are gone from the Lulu (and Kobo, whenever they make it available) editions. I’ve been working a lot last week to fill in these blanks and extend the sections, especially for what concerns libtool and pkg-config. This week I’ll work a bit more on the presentation as well, since I still lack a real cover (which is important for an eBook at least), and there are a few things to fix in the published XHTML stylesheet as well. Hopefully, before next week there will be a new update for both website and ebooks that will cover most of this, and more.

The final word has to clarify one thing: both Amazon and Google Books put the review on hold the moment they found the content available already online (mostly on my website and at Gitorious), and asked me to confirm how that was possible. Amazon unlocked the review just a moment later, and published by the next day; Google is still processing the book (maybe it’ll be easier when I make the update and it’s an ePub 2 everywhere, with the same exact content and a cover!). It doesn’t seem like Lulu is doing anything like that, but it might just have noticed that the content is published on the same domain as the email address I registered with, who knows?

Anyway, to finish it off, once again, the eBook version is available at Amazon and Lulu — both versions will come with free updates: I know Amazon allows me to update it on the fly and just requires a re-download from their pages (or devices); I’ll try to get them to notify the buyers, otherwise I’ll just notify people here. Lulu also allows me to revise a book, but I have no idea whether they will warn the buyers or provide the update… if that’s not the case, just contact me with the Lulu order identifier and I’ll set things up so that you get the updates.

The future of Autotools Mythbuster

You might have noticed, after yesterday’s post, that I have done a lot of visual changes to Autotools Mythbuster over the weekend. The new style is just a bunch of changes over the previous one (even though I also made use of sass to make the stylesheet smaller), and is there, for the most part, to give the guide something recognizable.

I need to spend another day or two working on the content itself at the very least, as the automake 1.13 porting notes are still not correct, due to further changes done on the Automake side (more on this in a future post, as it’s a topic of its own). I’m also thinking about taking a few days off Gentoo Linux maintenance, Munin development, and other tasks, and just working on the content in all my non-work time, as it could use some documentation of install and uninstall procedures, for instance.

But leaving the content side alone, let me address a different point first. More and more people lately have been asking for a way to have the guide available offline, either as an ebook (ePub or PDF) or packaged. Indeed I was asked by somebody if I could drop the NonCommercial part of the license so that it could be packaged in Debian (at some point I was actually asked why I’m not contributing this to the main manuals; the reason is that I really don’t like the GFDL, and furthermore I’m not contributing to automake proper because copyright assignment is becoming a burden in my view).

There’s an important note here: while you can easily see that I’m not pouring into it the amount of time needed to bring this to book quality, it does take a lot of time to work on it. It’s not just a matter of gluing together the posts that talk about autotools from my blog; it’s a whole lot of editing, which is indeed a whole lot of work. While I do hope that the guide is helpful, as I wrote before, it’s much more work than I can pour into it in my free time, especially in-between jobs like now (and no, I don’t need to find a job — I’m waiting to hear from one, and have a few others lined up if it falls through). While Flattr helps, it seems to be drying up, at least for what concerns my content; even Socialvest is giving me some grief, probably because I’m no longer connecting from the US. Besides that, the only “monetization” (I hate that word) strategy I have for the guide is AdSense – which, I remind you, kicked my blog out for naming an adult website in a post – and making the content available offline would defeat even the very small returns of that.

At this point, I’m really not sure what to do; on one side I’m happy to receive more coverage just because it makes my life easier to have fewer broken build systems around. On the other hand, while not expecting to get rich off it, I would like to know that the time I spend on it is at least partly compensated – token gestures are better than nothing as well – and that precludes a simple availability of the content offline, which is what people at this point are clamoring for.

So let’s look into the issues more deeply: why the NC clause on the guide? Mostly I want to have a way to stop somebody else exploiting my work for gain. If I drop the NC clause, nothing can stop an asshole from picking up the guide, making it available on Amazon, and getting the money for it. Is it likely? Maybe not, but it’s something that can happen. Given the kind of sharks that infest Amazon’s self-publishing business, I wouldn’t be surprised. On the other hand, it would probably make it easier for me to accept non-minor contributions and still be able to publish it at some point, maybe even in real paper, so it is not something I’m excluding altogether at this point.

Getting the guide packaged by distributions is also not entirely impossible right now: Gentoo generally doesn’t have the same kind of issues as Debian regarding NC clauses, and since I’m already using Gentoo to build and publish it, making an ebuild for it is tremendously simple. Since the content is also available on Git – right now on Gitorious, but read on – it would be trivial to do. But again, this would be cannibalizing the only compensation I get for the time spent on the guide. Which makes me very doubtful about what to do.

About the sources, there is another issue: while at the time I started all this Gitorious was handier than GitHub, over time Gitorious’ interface didn’t improve, while the latter improved a lot, to the point that right now it would be my choice to host the guide: easier pull requests, and easier coverage. On the other hand, I’m not sure if the extra coverage is a good thing, as stated above. Yes, it is already available offline through Gitorious, but GitHub would make it effectively easier to get it offline than to consult it online. Is that what I want to do? Again, I don’t know.

You probably also remember an older post of mine from one and a half years ago where I discussed the reasons why I hadn’t published Autotools Mythbuster at least through Amazon; the main reason was that, at the time, Amazon had no easy way to update the book for the buyers without having them buy a new copy. Luckily, this has changed recently, so that obstacle has actually fallen. With this in mind, I’m considering making it available as a Kindle book for those of you who are interested. To do so I first have to create it as an ePub though — so it would solve the question that I’ve been asked about eBook availability… but at the same time we’re back to the compensation issue.

Indeed, if I decide to set up ePub generation and start selling it on the Kindle store, I’d be publishing the same routines in the Git repository, making it available to everybody else as well. Are people going to buy the eBook, even if I priced it at $0.99? I’d suppose not. Which brings me to not being sure what the target would be, on the Kindle store: price it down, so that the convenience of just buying it from Amazon outweighs the work of rolling your own ePub – or googling for a copy – considering that just one person rolling the ePub can easily make it available to everybody else; or price it at a higher point, say $5, hoping that a few, interested users would fund the improvements? Either bet sounds bad to me honestly, even considering that Calcote’s book is priced at $27 at Amazon (hardcopy) and $35 at O’Reilly (eBook) — obviously, his book is more complete, although it is not a “living” edition like Autotools Mythbuster is.

Basically, I’m not sure what to do at all. And I’m pretty sure that some people (who will comment) will feel disgusted that I’m trying to make money out of this. On the whole, I guess one way to solve the issue is to drop the NC clause, stick it into a Git repository somewhere, maybe keep it running on my website, maybe not, but not waste energy on it anymore… the fact that, with its much more focused topic, it has just 65 flattrs is probably an indication that there is no need for it — which explains why I couldn’t find any publisher interested in having me write a book on the topic before. Too bad.

Autotools Mythbuster: automake pains

And we start the new year with more Autotools mythbusting — although in this case it’s not with the help of upstream, who actually seemed to make things more difficult. What’s going on? Well, there have been two releases already, 1.13 and 1.13.1, and the changes are quite “interesting” — or, to use a different word, worrisome.

First of all, there are two releases because the first one (1.13) removed two macros (AM_CONFIG_HEADER and AM_PROG_CC_STDC) that were not deprecated in the previous release. After a complaint from Paolo Bonzini related to a patch to sed to get rid of the old macros, Stefano decided to re-introduce the macros as deprecated in 1.13.1. What does this tell me? Well, two things mainly: the first is that this release was rushed out without enough testing (the beta for it was released on December 19th!). The second is that there is still no proper process for the deprecation of features, with clear deadlines for when they are to disappear.

This impression is further strengthened by some of the deprecations that appear in this new release, and some of the removals that did not happen at all.

This release was supposed to mark the first one not supporting the old-style name of configure.in for the autoconf input script — if you have any project still using that name you should update now. For some reason – none of which has been discussed on the automake mailing list, unsurprisingly – it was decided to postpone this to the next release. It is still a perfectly good idea to rename the file now, but you can be forgiven for getting pissed if you felt pressured into getting ready for the new release, only for the requirement to be dropped without further notice.

Another removal that was supposed to happen with this release was the three-parameter AM_INIT_AUTOMAKE call, which substitutes for the parameters of AC_INIT instead of providing the automake options. This use of the macro is still common for packages that calculate their version number dynamically, such as from the git repository itself, as it’s not possible to pass a variable version to AC_INIT. Now, instead of just marking the feature as deprecated but keeping it around, the situation is that the syntax is no longer documented but is still usable. Which means I have to document it myself, as I find it extremely stupid to have a feature that is not documented anywhere but is found in the wild. It’s exactly for bad decisions like this that I started Autotools Mythbuster.
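So here is that documentation, in sketch form: the version is a shell variable expanded when configure runs, which is the whole reason this syntax survives – the old-style single-argument AC_INIT, the git invocation, and the fallback value are just examples:

AC_INIT([Makefile.am])
MY_VERSION=$(git describe --always 2>/dev/null || echo 123)
AM_INIT_AUTOMAKE([myproject], [$MY_VERSION])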

This is not much different from what happened with the AM_PROG_MKDIR_P macro, which was supposed to be deprecated/removed in 1.12, with its variables kept around for a little longer. Instead it ended up completely messed up in 1.12, to the point that the first two releases of that series dropped the variables which were supposed to stay around, and the removal of the macro (but not of the variables) is now scheduled for 1.14 because, among others, GNU gettext is still using it — the issue has been reported, and I also think it has been fixed in git already, but there is no new release, nor a date for when it will be fixed in a release.

All of this is already documented in Autotools Mythbuster even though there is more work to do.

Then there are things that changed, or were introduced, in this release. First of all, silent rules support is no longer optional — this basically means that the silent-rules option to the automake initialization is now a no-op, and the generated makefiles all have the silent rules harness included (but not enabled by default, as usual). For me this meant a rewrite of the related section, as now you have one more variant of automake to support. Then there finally is support in aclocal for picking up the macro directory selected in configure.ac — unfortunately this meant I had to rewrite another section of my guide to account for it, and now both the old and the new method are documented in there.
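For reference, the two methods look like this, assuming the customary m4/ directory – the first line goes in the top-level Makefile.am (old method), the second in configure.ac (new method):

ACLOCAL_AMFLAGS = -I m4

AC_CONFIG_MACRO_DIR([m4])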

There are more notes in the NEWS file, and more things that are scheduled to appear in the next release, and I’ll try to cover them in Autotools Mythbuster over the next week or so — I expect this time I’ll need to get into the details of Makefile.am, which I have tried to avoid up to now. It’s quite a bit of work, but it might be what makes the difference for the many autotools users out there, so I really can’t avoid the task at this point. In the mean time, I welcome all support, be it through patches, suggestions, Flattr, Amazon, or whatever else — the easiest way is to show the guide around: not only will it reduce the headaches for me and the other distribution packagers to have people actually knowing how to work with autotools, but the more people know about it, the more contributions are likely to come in. Writing Autotools Mythbuster is far from easy, and sometimes it’s not enjoyable at all, but I guess it’s for the best.

Finally, a word about the status of automake in Gentoo — I’m leaving it to Mike to bump the package in tree; once he’s done that, I’ll prepare to run a tinderbox with it — hopefully just running through the reverse dependencies of automake will be enough, thanks to autotools.eclass. By the time the tinderbox is running, I hope I’ll have all the possible failures covered in the guide, as it’ll make the job of my Gentoo peers much easier.

Autotools Mythbuster: being a foreigner is not a bad thing

This was a leftover post in my drafts list. I just decided to post it as it is, even though a few things are slightly out of date. Please bear with me.

Have you ever noticed that many projects ship in their tarballs or, even worse, in their source repositories, files that are either empty or simply placeholders saying “look at this other file”? Most of the time these files are NEWS, ChangeLog, COPYING and INSTALL. In some corner cases, I have even found packages with files called INSTALL.real.

So what’s going on with this? Well, the problem comes from automake, and its ties to the GNU project it belongs to. The idea behind it is that the default settings of automake have to fit with the GNU projects. And GNU projects have a long list of coding styles, best practices, and policies that might sound silly (and some are) but are consistently followed by official projects.

These policies not only mandate the presence of a stable set of files (including those noted above, and a couple more), but also that the portability warnings be enabled, as the resulting Makefiles are supposed to be usable with non-GNU make implementations. So basically, by default, automake will mandate the presence of those files and the activation of some warning classes, and that’s the reason why people create those files even if they are not going to be used (either they are left zero-sized or, worse, they get a single line referring to another file — I say worse because zero-sized files can be kept from being installed with simple checks, while single-line references require human intervention).

So how do you fix this? Well, it’s actually easy: you just have to pass the foreign option to the AM_INIT_AUTOMAKE macro — this way you’re telling automake that your project does not have to follow the GNU rules, which means that the files no longer have to be there, and that if you want portability warnings you have to enable them explicitly. Which is very likely what you want.
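Concretely, it’s one keyword in the initialization, and you can turn the portability warnings back on right there if you do care for them:

AM_INIT_AUTOMAKE([foreign -Wportability])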

Do note that the fact that the files are no longer mandatory does not mean that you can no longer use them. I’d actually suggest you keep most of them in your project, and install them properly. But trust me, you want to be a foreigner, in GNU land.

For details on AM_INIT_AUTOMAKE and the various automake flavours, you can see my guide which I also have to expand a little bit over the weekend.

Autotools Mythbuster: On parallel testing

A “For A Parallel World” crossover!

Now that the tinderbox is actually running pretty well and the logs are getting through just fine, I’ve decided to spend some more time expanding the Autotools Mythbuster guide with more content, in particular in areas such as porting to automake 1.12 (and 1.13).

One issue, though, which I’ll have to discuss in that guide soon, and which I’m posting about already right now, is parallel testing: it’s something that is not really well known, and something that, at least for Gentoo, involves the EAPI=5 discussion.

Build systems using automake have a default target for testing purposes called check. This target is designed to build and execute testcases, in a pretty much transparent way. Usually this involves two main variables: check_PROGRAMS and TESTS. The former defines the binaries to build for the testcases, the latter which testcases to run.

This is counter-intuitive and might actually sound silly, but in some cases you want to build test programs as binaries, yet list scripts that call them as the tests. This is often the case when you test a library, as you want to compare the output of a test program with a known-good output.
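A sketch of such a setup – all the names here are made up, and the script is expected to run the binary and compare its output against the stored known-good file:

check_PROGRAMS = foo-test

foo_test_SOURCES = tests/foo-test.c
foo_test_LDADD = libfoo.la

TESTS = tests/foo-compare.sh
EXTRA_DIST = tests/foo-compare.sh tests/foo-expected.out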

Now, up to automake 1.12, if you run make -j16 check, what is parallelized is only the building of the binaries and targets; you can for instance make use of this with check_DATA to preprocess some source files (I do that for unpaper, which only ships the original PNG files of the test data in the repository), but if your tests take time, and you have little that needs to be built, then running make -j16 check is not going to be a big win. This, added to the chance that the tests might just not work in parallel, is why the default up to now in Gentoo is to run the tests in series.
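To give an idea of the check_DATA trick, here is a sketch – the conversion command and file names are made up; the point is that the rule runs during the parallelized build step of make check, rather than inside the tests themselves:

check_DATA = tests/input.pbm

tests/input.pbm: tests/input.png
	convert $< $@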

But that’s why recent automake versions introduced the parallel-tests option, which is actually going to be the default starting from 1.13. In this configuration, the tests are executed through a driver script, which launches multiple copies of them at once, and then proceeds to collect the results. Note that this is just an alternative default test harness, and automake actually supports custom harnesses as well, which may or may not run in parallel.
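Enabling the new harness explicitly before 1.13, and telling it how to run script-based tests, looks like this – the first line goes in configure.ac, the rest in Makefile.am:

AM_INIT_AUTOMAKE([foreign parallel-tests])

TEST_EXTENSIONS = .sh
SH_LOG_COMPILER = $(SHELL)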

Anyway, this is something that I’ll have to write about in more detail in my guide — please be patient. In the mean time you can see unpaper as an example, as I just updated the git tree to make the best use of the parallel tests harness (it actually saved me some code).