Yes, we still need autotools

One of the most common refrains that I hear lately, particularly when people discover Autotools Mythbuster, is that we don’t need autotools anymore.

The argument goes like this: since Autotools were designed for portability to ancient systems that nobody really uses anymore, and since most modern operating systems share a common interface, whether that is POSIX or C99, the reasons to keep Autotools around are minimal.

This could be true… if your software never does anything platform specific. Which is indeed possible, but quite rare. unpaper, for instance, has a fairly limited amount of code in its configure.ac, because the lowest-level thing it does is read and write files; I could have easily used anything else for its build system.

But on the other hand, if you’re doing anything more specific, which usually includes network I/O, you end up with a bit more of a script. Furthermore, if you don’t want to pull a systemd and decide that the latest Linux version is all you want to support, you end up having to figure out alternatives, or at least conditionals for what you can and cannot use. You may not want to go as far as VLC, which supports anything between OS/2 and the latest Apple TV, but there is space between those extremes.
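
To give a flavour of what those conditionals look like in practice, here is a minimal sketch; the headers, functions and conditional names are purely illustrative and not taken from any specific project:

dnl configure.ac: probe for optional interfaces instead of assuming a recent Linux
AC_CHECK_HEADERS([sys/epoll.h sys/event.h])
AC_CHECK_FUNCS([epoll_create1 kqueue accept4])

dnl let Makefile.am pick the right backend sources
AM_CONDITIONAL([HAVE_EPOLL], [test "x$ac_cv_func_epoll_create1" = xyes])
AM_CONDITIONAL([HAVE_KQUEUE], [test "x$ac_cv_func_kqueue" = xyes])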

If you’re a library, this is even more important. Because while it might be that you’re not interested in any peculiar systems, it might very well be that one of your consumers is. Going back to the VLC example, I have spent quite a bit of time in the past weekends of this year helping the VLC project by fixing (or helping to fix) the build system of new libraries that are made a dependency of VLC for Android.

So while we have indeed overcome the difficulties of porting across many different UNIX flavours, we still have portability concerns. It is probably true that we should reconsider what Autoconf tests for by default, and in particular some tests no longer fit modern systems: the endianness tests, for instance, were an obvious failure when MacIntel arrived, as the same build would then target both big endian (PPC) and little endian (Intel) — though on the other hand even that concern does not matter much anymore, as universal binaries are already out of style.

So yes, I do think we still need portability, and I still think that not requiring a tool that depends on XML RPC libraries is a good side of autotools…

Autotools Mythbuster: so why do we have three projects?

As much as I’ve become an expert on the topic, there is one question I still have no idea how to answer, and that is why on earth we have three separate projects (autoconf, automake, libtool) instead of a single Autotools project. Things get even more interesting when you think that there is the Autoconf Archive – which, by the way, references Autotools Mythbuster as best practices – and then projects such as dolt that are developed by completely separate organisations.

I do think that this is a quite big drawback of autotools compared to things like CMake: you now have to allow for combinations of different tools written in different languages (autoconf is almost entirely shell and M4, automake uses lots of Perl, libtool is shell as well), with their own deprecation timelines, and with different distributions providing different sets of them.

My guess is that many problems lie in the different sets of developers for each project. I know for instance that Stefano at least was planning a separate Automake-NG implementation that did not rely on Perl at all, but used GNU make features, including make macros. I generally like this idea, because similarly to dolt it removes overhead for the most common case (any Linux distribution will use GNU make by default), while keeping the option around where it is indeed needed (any BSD system). On the other hand it adds one more dimension to the already multi-dimensional compatibility problem.

Having a single “autotools” package, while making things a bit more complicated on the organizational level, could make a few things fit better. For instance if you accepted Perl as a dependency of the package – since automake needs it; but remember this is not a dependency for the projects using autotools! – you could simplify the libtoolize script which is currently written in shell.

And it would probably be interesting if you could just declare in your configure.ac file whether you want a fully portable build system, or you’re okay with telling people that they need a more modern system, and drop some of the checks/compatibility quirks straight at make dist time. I’m sure that util-linux does not care about building dynamic libraries on Windows, and that PulseAudio does not really care for building on non-GNU make implementations.

Of course these musings are only personal, and there is nothing to substantiate how things would actually turn out; I have not done any experiment with actually merging the packages into a single releasable unit. But I do have some experience with split-but-not-really software, and in this case I can’t see many advantages in the split of autotools, at least from the point of view of the average project that uses the full set of them. There certainly are reasons why people would prefer them to stay split, especially if they have been using only autoconf and snubbing automake all this time, but… I’m not sure I agree with those reasons to begin with.

Autotools Mythbuster: what’s new in Automake 1.14

So the new version of Automake is out, and is most likely going to be the last release of the first major version. The next version of Automake is going to be 2.0, not to be confused with Automake NG which is a parallel project, still maintained by Stefano, but with a slightly different target.

After the various issues in the 1.13 series, Stefano decided to take a much more conservative approach for both 1.14 and the upcoming 2.0. While a bunch of features are getting deprecated with these two versions, they will not be dropped until at least version 3.0, I suppose. This means that there should be plenty of time for developers to update their Autotools use before it starts failing. Users of -Werror for Automake will of course still see issues, but I’ve already written about that so I’m not going back over the topic.

There are no big changes in the new release, by the way, as its theme seems to be mostly “get things straight”. For instance, the C compilation handling has been streamlined, in anticipation of further streamlining in Automake 2.0. In particular, the next major release will get rid of the subdir-objects option… by force-enabling it, which also means that the connected, formerly optional AM_PROG_CC_C_O is now bolted onto the basic AC_PROG_CC. What does this mean? Mostly that there is one fewer line to add to your configure.ac when you use subdir-objects, and if you don’t use subdir-objects today, you should. It also means that the compile script is now needed by all automake projects.
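
To make this concrete, here is a hedged sketch of a configure.ac for a project using subdir-objects today (the project name is made up); the extra macro can stay around until automake 1.14 is everywhere:

AC_INIT([myproject], [1.0])
AM_INIT_AUTOMAKE([foreign subdir-objects])

AC_PROG_CC
AM_PROG_CC_C_O dnl needed with subdir-objects before automake 1.14; redundant afterwards

AC_CONFIG_FILES([Makefile])
AC_OUTPUT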

The one new feature that I think is worth the release by itself is better support for including files within Makefile.am — this allows the creation of almost independent “module” files, so that your build rules still live next to the source files, but the final result is non-recursive. The changes make Karel’s way much more practical, to the point that I’ve actually started writing documentation for it in Autotools Mythbuster.

# src/Makefile.inc

bin_PROGRAMS += myprog
man_MANS += %D%/myprog.8
myprog_SOURCES = %D%/myprog.c \
                 %D%/myprog-utils.c

The idea is that instead of having to know exactly which subdirectory contains the sources, you can simply use %D% (or %reldir%), and then you can move said directory around. It makes it possible to properly handle a bundled-but-opt-out-capable library, so that you don’t have to fight too much with the build system. I think that’ll actually be the next post in the Autotools Mythbuster series: how to create a library project with a clear bundling path and, at the same time, the ability to use the system copy of the library itself.
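
For context, here is a minimal sketch of the kind of top-level Makefile.am such a fragment plugs into; the only requirement is that the variables the fragments append to are initialised first:

# Makefile.am (top level)
AUTOMAKE_OPTIONS = subdir-objects

bin_PROGRAMS =
man_MANS =

include src/Makefile.inc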

Anyway, let’s all thank Stefano for a likely uneventful automake release. Autotools Mythbuster is being updated; for now you can find up-to-date forward porting notes, but before coming back from vacation I’ll most likely update a few more sections.

Autotools Mythbuster: who’s afraid of autotools?

I’ve been asked over on Twitter whether I had an easy, one-stop-shop tutorial for Autotools newbies… the answer was no, but I will try to make up for it by writing this post.

First of all, the name autotools covers quite a few different tools. If you have a very simple program (not hellow-simple, but still simple), you definitely want to use at the very least two of them: autoconf and automake. While you could use the former without the latter, you really don’t want to. This means that you need two files: configure.ac and Makefile.am.

The first of the two files (configure.ac) is processed to produce a configure script which the user will be executing at build time. It is also the bane of most people because, if you look at one for a complex project, you’ll see lots of content (and logic) and next to no comments on what things do. Lots of it is cargo-culting and I’m afraid I cannot help but just show you a possible basic configure.ac file:

AC_INIT([myproject], [123], [flameeyes@flameeyes.eu], [], [https://flameeyes.blog/tag/autotools-mythbuster/])
AM_INIT_AUTOMAKE([foreign no-dist-gzip dist-xz])

AC_PROG_CC

AC_OUTPUT([Makefile])

Let me explain. The first two lines initialize autoconf and automake respectively. The former is told the name and version of the project, the place to report bugs, and a URL for the package to use in documentation. The latter is told that we’re not a GNU project (seriously, this is important — you wouldn’t believe how many tarballs I find with 0-sized files just because they are mandatory in the default GNU layout; I even found at least one crazy package lately that wanted to have a 0-sized NEWS file), and that we want a .tar.xz tarball and not a .tar.gz one (which is the default).

After initializing the tools, you need to, at the very least, ask for a C compiler. You could ask for a C++ compiler as well, but I’ll leave that as an exercise for the reader. Finally, you have to tell it to output Makefile (it’ll use Makefile.in, but we’ll create Makefile.am instead soon).

To build a program, you need then to create a Makefile.am similar to this:

bin_PROGRAMS = hellow

dist_doc_DATA = README

Here we’re telling automake that we have a program called hellow (whose sources are, by default, hellow.c) which has to be installed in the binary directory, and a README file that has to be distributed in the tarball and installed as a piece of documentation. Yes, this really is enough for a very basic Makefile.am.

If you were to have two programs, hellow and hellou, and a convenience library between the two you could do it this way:

bin_PROGRAMS = hellow hellou

hellow_SOURCES = src/hellow.c
hellow_LDADD = libhello.a

hellou_SOURCES = src/hellou.c
hellou_LDADD = libhello.a

noinst_LIBRARIES = libhello.a
libhello_a_SOURCES = lib/libhello.c lib/libhello.h

dist_doc_DATA = README

But then you’d have to add a call to AC_PROG_RANLIB to configure.ac. My suggestion is that if you want to link things statically and it’s just one or two files, just go for building them twice… it can actually make the build faster (one less serialization step), and with the new LTO options it should improve the optimization as well.
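
A minimal sketch of that build-it-twice alternative, reusing the file names from the example above (and no AC_PROG_RANLIB is needed, since no static archive is built):

bin_PROGRAMS = hellow hellou

hellow_SOURCES = src/hellow.c lib/libhello.c lib/libhello.h
hellou_SOURCES = src/hellou.c lib/libhello.c lib/libhello.h

dist_doc_DATA = README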

As you can see, this is really easy when you start from the basics… I’ll keep writing a few more posts with easy solutions, and probably next week I’ll integrate all of this into Autotools Mythbuster and update the ebook with an “easy how-to” as an appendix.

Autotools Mythbuster: automake pains

And we start the new year with more Autotools Mythbusting — although in this case it’s not with the help of upstream, which actually seemed to make things more difficult. What’s going on? Well, there have been two releases already, 1.13 and 1.13.1, and the changes are quite “interesting” — or to use a different word, worrisome.

First of all, there are two releases because the first one (1.13) removed two macros (AM_CONFIG_HEADER and AM_PROG_CC_STDC) that had not been deprecated in the previous release. After a complaint from Paolo Bonzini, related to a patch to sed to get rid of the old macros, Stefano decided to re-introduce the macros as deprecated in 1.13.1. What does this tell me? Well, two things mainly: the first is that this release has been rushed out without enough testing (the beta for it was released on December 19th!); the second is that there is still no proper process for deprecating features, with clear deadlines for when they are to disappear.

This impression is further strengthened by some of the deprecations that appear in this new release, and by some of the removals that did not happen at all.

This release was supposed to mark the first one not supporting the old-style name configure.in for the autoconf input script — if you have any project still using that name you should rename it now. For some reason – none of which has been discussed on the automake mailing list, unsurprisingly – it was decided to postpone this to the next release. It is still a perfectly good idea to rename the file now, but you can understandably get pissed if you felt pressured into getting ready for the new release, only for the requirement to be dropped without further notice.

Another removal that was supposed to happen with this release was the three-parameter AM_INIT_AUTOMAKE call, the form that takes the place of AC_INIT’s parameters instead of providing the automake options. This form is, though, still common in packages that calculate their version number dynamically, such as from the git repository itself, as it’s not possible to pass a variable version to AC_INIT. Now, instead of just marking the feature as deprecated but keeping it around, the situation is that the syntax is no longer documented but is still usable. Which means I have to document it myself, as I find it extremely stupid to have a feature that is not documented anywhere but is found in the wild. It’s exactly for bad decisions like this that I started Autotools Mythbuster.
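
For reference, this is roughly what that usage looks like in the wild; consider it a sketch (the project name and the version command are made up), not an endorsement:

dnl old-style: AC_INIT only gets a sanity-check file, while package name and
dnl version go to automake, so the version can come from a command run at
dnl configure time
AC_INIT([src/myproject.c])
RELEASE_VERSION=`git describe --tags 2>/dev/null || echo 9999`
AM_INIT_AUTOMAKE([myproject], [$RELEASE_VERSION])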

This is not much different from what happened with the AM_PROG_MKDIR_P macro, which was supposed to be deprecated/removed in 1.12, with the variables kept around for a little longer — first it ended up completely messed up in 1.12, to the point that the first two releases of that series dropped the variables that were supposed to stay around, and the removal of the macro (but not of the variables) is now scheduled for 1.14 because, among others, GNU gettext is still using it — the issue has been reported, and I also think it has been fixed in git already, but there is no new release, nor a date for when a release with the fix will appear.

All of this is already documented in Autotools Mythbuster even though there is more work to do.

Then there are things that changed, or were introduced, in this release. First of all, silent rules support is no longer optional — this basically means that the silent-rules option to the automake initialization is now a no-op, and the generated makefiles all include the silent-rules harness (though it is not enabled by default, as usual). For me this meant rewriting the related section, as now there is one more variant of automake to support. Then there finally is support in aclocal for picking up the macro directory selected in configure.ac — unfortunately this meant I had to rewrite another section of my guide to account for it, and now both the old and the new method are documented in there.
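
To make the last point concrete, the two methods look like this (with m4 being the customary directory name); projects that need to support older automake keep both around:

dnl configure.ac: picked up by aclocal starting with automake 1.13
AC_CONFIG_MACRO_DIR([m4])

# Makefile.am: the older method, still needed for older automake and for libtool
ACLOCAL_AMFLAGS = -I m4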

There are more notes in the NEWS file, and more things that are scheduled to appear in the next release, and I’ll try to cover them in Autotools Mythbuster over the next week or so — I expect this time I’ll need to get into the details of Makefile.am, which I have tried to avoid up to now. It’s quite a bit of work, but it might be what makes the difference for the many autotools users out there, so I really can’t avoid the task at this point. In the mean time, I welcome all support, be it through patches, suggestions, Flattr, Amazon, or whatever else — the easiest way is to show the guide around: not only will it reduce the headaches for me and the other distribution packagers to have people actually knowing how to work with autotools, but the more people know about it, the more contributions are likely to come in. Writing Autotools Mythbuster is far from easy, and sometimes it’s not enjoyable at all, but I guess it’s for the best.

Finally, a word about the status of automake in Gentoo — I’m leaving it to Mike to bump the package in the tree; once he’s done that, I’ll prepare to run a tinderbox with it — hopefully just rebuilding the reverse dependencies of automake will be enough, thanks to autotools.eclass. By the time the tinderbox is running, I hope to have all the possible failures covered in the guide, as that will make the job of my Gentoo peers much easier.

Autotools Mythbuster: being a foreigner is not a bad thing

This was a leftover post in my drafts list… I just decided to post it as it is, even though there are a few things that are slightly out of date. Please bear with me.

Have you ever noticed that many projects ship, in their tarball or, even worse, in their source repositories, files that are either empty or simply placeholders saying “look at this other file”? Most of the time these files are NEWS, ChangeLog, COPYING and INSTALL. In some corner cases, I have even found packages with files called INSTALL.real.

So what’s going on with this? Well, the problem comes from automake, and its ties to the GNU project it belongs to. The idea behind it is that the default settings of automake have to fit with the GNU projects. And GNU projects have a long list of coding styles, best practices, and policies that might sound silly (and some are) but are consistently followed by official projects.

These policies not only mandate the presence of a fixed set of files (including those noted above, and a couple more), but also that portability warnings be enabled, as the resulting Makefiles are supposed to be usable with non-GNU make implementations. So basically, by default, automake will mandate the presence of those files and the activation of some warning classes, and that’s the reason why people create those files even when they are not going to be used (either they are left zero-sized or, worse, they get a single line referring to another file — I say worse because zero-sized files can be kept from being installed with simple checks, while single-line references require human intervention).

So how do you fix this? Well, it’s actually easy: you just have to pass the foreign option to the AM_INIT_AUTOMAKE macro — this way you’re telling automake that your project does not have to follow the GNU rules, which means that those files no longer have to be there, and that if you want the portability warnings you have to enable them explicitly. Which is very likely what you want.
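
In practice it is a one-word change to a line you already have; and if you do want the portability warnings back, you can request them explicitly in the same place:

AM_INIT_AUTOMAKE([foreign -Wportability dist-xz no-dist-gzip])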

Do note that the fact that the files are no longer mandatory does not mean that you can no longer use them. You’re actually encouraged to keep most of them in your project, and to install them properly. But trust me, you want to be a foreigner in GNU land.

For details on AM_INIT_AUTOMAKE and the various automake flavours, you can see my guide which I also have to expand a little bit over the weekend.

GNU software (or, gnulib considered harmful and other stories)

The tinderbox right now seems to be having fun trying to play catch-up with changes in GNU software: GCC, Automake, glibc and of course GnuTLS. Okay it’s true that compatibility problems are not a prerogative of GNU, but there are some interesting problems with this whole game, especially for what concerns inter-dependencies.

So, there is a new C library in town, which, if we ignore the whole x32 dilemma, has support for ISO C11 as its major new feature. And what is the major change in this release? The gets() function has finally been removed. This is good, as it was a very nasty thing to use, and nobody in his sane mind would use it…

tbamd64 ~ # scanelf -qs -gets -R /bin /opt /sbin /lib /usr
gets  /opt/flexlm/bin/lmutil

Okay, never mind those who actually use it; the rest of the software shouldn’t be involved, should it? You wish. What happens is that gnulib no longer only carries replacements for GNU extensions, but also includes code that is not present in glibc itself, and extra warnings about the use of deprecated features; it now comes with its own re-declaration of gets(), there to emit a prominent warning if it is used. And of course, that re-declaration fails badly against the new glibc.

Obviously, this has been fixed in gnulib already, since the removal of gets() was planned, but it takes quite a bit of time for a fix in gnulib to trickle down to the packages using it, which is one of my long-standing complaints about it. Which means that Gentoo will have to patch the same code over and over again in almost all GNU software, since almost all of it uses gnulib.

Luckily for me, only two packages in herds I belong to have been hit (by this problem, at least): GnuTLS and libtasn1, which is a dependency of it. The former is fixed in version 3, which is masked but which I’m testing as well, while the latter is fixed in the current ~arch (I really don’t care about 2.16 in stable yet!), so there is nothing to patch there. The fact that GCC 4.6 itself fails to build with this version of glibc is obviously a different problem altogether, and so is the fact that we need Boost 1.50 for a number of packages to work with the new glibc/gcc combination, as 1.49 is broken with the new C library and 1.48 is broken with the new compiler.

Now to move on to something different: Automake 1.12 was released a couple of months ago and is now in ~arch, causing trouble, although not as much as it could have. Interestingly enough, one of the things they did change in this version was removing $(mkdir_p), as I wrote in my guide — but that seems to have been a mistake.

What should have been removed in 1.12 was the default use of AM_PROG_MKDIR_P, while the mkdir_p variable should have been kept around until version 1.13. Stefano said he’s going to revert that change in automake 1.12.2, but I guess it’s better if we deal with it right away instead of waiting for 1.13 to hit us….
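
On the Makefile.am side, dealing with it mostly means spelling it the new way: the $(MKDIR_P) variable, provided by autoconf’s AC_PROG_MKDIR_P and by automake itself, is the one that survives both 1.12 and 1.13. A hedged sketch, with a made-up local install rule:

install-data-local:
	$(MKDIR_P) $(DESTDIR)$(pkgdatadir)/examples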

Of course there is a different problem with the new automake as well: GNU gettext hasn’t been updated to support the new automake versions, so using it causes deprecation warnings with 1.12 (and will fail with 1.13 if it’s not updated). And of course a number of projects are now using -Werror for automake, as if it weren’t enough trouble to use it on the source code itself.

And of course the problem with Gentoo, according to somebody, is the fact that my tinderbox bugs (those dozen a day) are filed with linked build logs instead of attached ones. Not the fact that the same somebody commits new versions of critical packages without actually doing anything to test them.

Gentoo Linux Health Report

Maybe I should start doing this monthly, with the tinderbox’s results at hand.

So, as I said, the tinderbox is logging away, although it’s still coughing from time to time. What I want to write about here is some of the insights the current logs give me.

First of all, as you might expect considering my previous efforts, I’m testing GCC 4.7. It’s the least I can do, of course, and it’s definitely important to proceed with this if we want to have it unmasked in decent time, instead of the 4.6-like æons (which were caused in part by the lack of continuous testing). The new GCC version by itself doesn’t seem to be too much of a break-through anyway; there is the usual set of new warnings that cause packages insisting on using -Werror to fail; there is some more header cleanup that causes software using C and POSIX interfaces in C++ to fail because it doesn’t include the system headers declaring the functions it uses; there is also a change in the C++ specs that requires this-> to be prefixed on some accesses to inherited attributes/methods, though I’m not sure of the exact rule.

The most disruptive issue with the new GCC, though, seems to be another round of tightening on invalid command-line parameters passed to the compiler. So, for instance, the -mimpure-text option that is supported on SunOS is no longer usable on Linux — guess which kind of package fails badly because of that? Java native libraries, of course. But it’s not just them; one or two packages failed due to the use of -mno-cygwin, which is also gone from the Linux version of GCC.

So while this goes on, what other tests are being put in place? Well, I’ve been testing GnuTLS 3, simply because this version no longer bundles an antique, unmaintained configuration file parsing library. To be honest, though, I should be working a bit more on GnuTLS, as there is a dependency on readline I’d like to remove… for now I only fixed parallel build, which means you can now build the ebuild at full speed.

Oh, and how can I forget that I’m testing the automake 1.12 transition, which I’m also trying to keep documented in Autotools Mythbuster — although I’m still missing the “see also” references; I hope I’ll be able to work on them in the next few days. Who knows, maybe I’ll be able to work on them on the plane (given that next time, no matter what, I’m going to get the “premium economy” extra — tired of children screaming; hopefully there are few families ready to pay for the extra).

The most interesting part of this transition is that we get failures for things that really don’t matter to us at all, such as the dist-lzma removal. Unfortunately we still have to deal with those. The other thing is that there are more packages than I expected that relied on what is called “automatic de-ANSI-fication” (the ansi2knr feature that converted ANSI C sources to K&R C for ancient compilers), which is also gone in this automake version. Please remember that if you don’t want to spend too much time on fixing this you can still restrict WANT_AUTOMAKE=1.11 in your ebuild for the moment — but at some later point you’ll have to get upstream to support the new version.
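
In the ebuild that restriction is just a variable set before the eclass is inherited; a minimal sketch:

WANT_AUTOMAKE=1.11
inherit autotools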

I still have some trouble with PHP extensions not installing properly. And I have no idea why.

Now, let’s leave aside the number of complaints I could make about developers who commit outright broken packages without taking care of them, or about three (or four) year old bugs still open and still present… I’d like for once to thank those developers who are doing an excellent job by responding promptly to new bugs.

So, thank you Justin (jlec), Michael (kensington), Alfredo Tupone and Hans (graaff) among others. These are just the guys who received most of the bugs… and fixed them quickly enough for me to notice — “Hey did I just file a bug about that? Ah it’s fixed already.”

Autotools Mythbuster: On parallel testing

A “For A Parallel World” crossover!

Since the tinderbox is now running pretty well and the logs are getting through just fine, I’ve decided to spend some more time expanding the Autotools Mythbuster guide with more content, in particular in areas such as porting to automake 1.12 (and 1.13).

One issue, though, which I’ll have to discuss in that guide soon, and which I’m already posting about right now, is parallel testing, because it’s something that is not really well known, and something that, at least for Gentoo, ties into the EAPI=5 discussion.

Build systems using automake have a default target for testing purposes called check. This target is designed to build and execute testcases, in a pretty much transparent way. Usually this involves two main variables: check_PROGRAMS and TESTS. The former defines the binaries to build for the testcases, the latter which testcases to run.

This is counter-intuitive and might actually sound silly, but in some cases you want to build test programs as binaries, yet list scripts that drive them as the tests. This is often the case when you test a library, as you want to compare the output of a test program with a known-good output.
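
A minimal sketch of that split, with all file names made up for illustration:

check_PROGRAMS = test-render
test_render_SOURCES = tests/test-render.c

# the harness runs the script; the script runs the binary and compares its output
TESTS = tests/render-compare.sh
EXTRA_DIST = tests/render-compare.sh tests/render-expected.txt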

Now, up to automake 1.12, if you run make -j16 check, what is parallelized is only the building of the binaries and other targets; you can for instance make use of this with check_DATA to preprocess some source files (I do that for unpaper, which only ships the original PNG files of the test data in the repository), but if your tests take time and you have little that needs to be built, then running make -j16 check is not going to be a big win. This, added to the chance that the tests might simply not work in parallel, is why the default up to now in Gentoo has been to run the tests serially.

But that’s why recent automake introduced the parallel-tests option, which is actually going to be the default starting from 1.13. In this configuration, the tests are executed through a driver script, which launches multiple copies of them at once and then collects the results. Note that this is just an alternative default test harness, and automake actually supports custom harnesses as well, which may or may not run in parallel.
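
Opting in today is a one-line change, plus, if your TESTS are scripts, telling the new harness how to run them; a sketch assuming the test scripts end in .sh:

dnl configure.ac
AM_INIT_AUTOMAKE([foreign parallel-tests])

# Makefile.am
TEST_EXTENSIONS = .sh
SH_LOG_COMPILER = $(SHELL)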

Anyway, this is something that I’ll have to write about in more details in my guide — please be patient. In the mean time you can see unpaper as an example, as I just updated the git tree to make the best use of the parallel tests harness (it actually saved me some code).

For A Parallel World: Parallel building is not passé

It’s been a while since I last wrote about parallel building. This has only to do with the fact that the tinderbox hasn’t been running for a long time (I’m almost set up with the new one!), and not with the many people who complained to me that spending time in getting parallel build systems to work is a waste of time.

This argument has been helped by the presence of a --jobs option in Portage, with people insisting that the future will have Portage building packages in parallel, so that the whole process takes less time, rather than shortening the single build time. I said before that I didn’t feel it was going to help much, and now I definitely have some first-hand experience to tell you that it doesn’t help at all.

The new tinderbox is a 32-way system; it has two 16-core CPUs, and enough RAM for each of them; you can easily build with 64 processes at once, but I’m actually trying to push it further by using the unbound -j option (this is not proper, I know, but still). While this works nicely, we still have too many packages that force serial building due to broken build systems, and a few that break under these conditions but would very rarely break on systems with just four or eight cores, such as lynx.

I then tried, during the first two rebuilds of world (one to set my choices in USE flags and packages, the other to build it hardened), running with five jobs in parallel… between the issue of the huge system set (yes, that’s a 4.24-years-old article) and the fact that it’s much more likely to have many packages depending on one rather than one depending on many, this still does not saturate the CPUs if you’re still building each package serially.

Honestly, seeing such a monstrous system take as long as my laptop, which has a quarter of the cores and a quarter of the RAM, to build the basic system was a bit… appalling.

The huge trouble seems to be with packages that don’t use make, but that could, under certain circumstances, perform parallel building. The main problem is that we still don’t have a variable that tells us exactly how many build jobs to start, instead relying on the MAKEOPTS variable. Some ebuilds actually try to parse it to extract the number of jobs, but that fails with configurations such as mine. I guess I should propose that addition for the next EAPI version… then we might actually be able to make use of it in the Ruby eclasses to run tests in parallel, which would make testing so much faster.
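
To illustrate the kind of parsing I mean (a sketch, not code from any actual eclass) and why it breaks down with an unbound -j:

# naive extraction of the job count from MAKEOPTS; with MAKEOPTS="-j" there is
# no number to find, so the fallback kicks in and the build ends up serialised
jobs=$(printf '%s\n' "${MAKEOPTS}" | grep -Eo -- '-(j|-jobs[= ]?)[0-9]+' | grep -Eo '[0-9]+' | tail -n 1)
: "${jobs:=1}"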

Speaking about parallel testing, the next automake major release (1.13 — 1.12 was released but it’s not in the tree yet, as far as I can tell) will execute tests in parallel by default; this was optional starting with 1.11 and now it’s going to be the default (you can still opt out, of course). That’s going to be very nice, but we’ll also have to change our src_test defaults, which still use emake -j1 and thus force serialisation.

Speaking of which, even if your package does not support parallel testing, you should use parallel make, at least with automake, to call make check; the reason is that the check target also builds the tests’ utilities and units, and that build can be sped up a lot by running it in parallel, especially for test frameworks that rely on a number of small units instead of one big executable.
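
One way to get the best of both worlds in an ebuild, sketched here (the TESTS= override builds the check programs without running anything, since the harness only iterates over $(TESTS)):

src_test() {
	# build test programs and data in parallel first…
	emake check TESTS=
	# …then run the actual test suite serially
	emake -j1 check
}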

Thankfully, for the day there are two more packages fixed to build in parallel: Lynx (which goes down from 110 to 46 seconds to build!) and Avahi (which I fixed so that it will install in parallel fine).