Dell XPS 13, problems with WiFi

A couple of months ago I bought a Dell XPS 13. I’m still very happy with the laptop, particularly given the target use that I have for it, but I have started noticing a list of problems that do bother me more than a little bit.

The first problem is something that I have spoken of in the original post and updated a couple of times: the firmware (“BIOS”) update. While the firmware is actually published through LVFS by Dell, either Antergos or Arch Linux has some configuration issue with EFI and the System Partition that causes the EFI shim not to find the right capsule. I have ended up just running the update manually twice now, since I didn’t want to spare the time to fix the packaging of the firmware updater, and testing against different firmware updates is not easy.

Also, while the new firmware updates made the electrical whining noise effectively disappear, making the laptop very nice to use in quiet hotel rooms (not all hotel rooms are quiet), they seem to have triggered more WiFi problems. Indeed, it got to the point that I could not use the laptop at home at all. I’m not sure what exactly the problem was, but my Linksys WRT1900ACv2 seems to trigger known problems with the WiFi card on this model.

At first I thought it would be a problem with using Arch Linux rather than Dell’s own Ubuntu image, which appeared to have separate Qualcomm drivers for the ath10k card. But it turns out the same error pops up repeatedly in Dell forums and on Launchpad too. A colleague with the same laptop suggested just replacing the card, getting rid of the whole set of problems introduced by the ath10k driver. Indeed, even looking around the Windows users’ websites, the recommendation appears to be the same: just replace your card.

The funny bit is that I only really noticed this when I came back from my long August trips, because up to that point I hadn’t spent more than a few days at home since buying the laptop. I have been in Helsinki, Vancouver and Seattle, and used the laptop in airports, lounges, hotels and cafes, as well as my office. None of those places had any issue with my laptop. I used it extensively to livetweet SREcon Europe over the USENIX wireless at the hotel, and it had no problem whatsoever.

My current theory for this is that there is some mostly-unused feature that is triggered by a high-performance access point like the one I have at home, which runs LEDE, and as such is not something you’ll encounter in the wild. This would also explain why the Windows sites that I found referencing the problem suggest replacing the card: your average Windows user is unlikely to know how to do so, or to be interested in a solution that does not involve shipping the device back to Dell. And to be fair, they probably have a point: why on earth is Dell selling laptops with crappy WiFi cards?

So anyway, my solution to this was to order an Intel 8265 wireless card, which offers the same 802.11ac dual-band support plus Bluetooth 4.2, and comes in the same form factor as the ath10k card the laptop ships with. It feels a bit strange having to open up a new laptop to replace a component, but since this is one of Dell’s serviceable models, it was not a horrible experience (my Vostro laptop still has a terrible 802.11g 2.4GHz-only card in it, and that one I can’t replace easily).

Moving on to something else, the USB-C dock is working great, although I found out the hard way that if you ask Plasma (or whatever else it was that I ended up asking) not to put the laptop to sleep when the lid is closed while on AC power, which I need so that I can use the laptop “docked” in my usual work-from-home setup, it also does not go to sleep if the power is subsequently disconnected. So the short version is that I now usually run the laptop without the power connected unless it’s already running low, and I can easily get through a whole day at a conference without charging, which is great!
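
For what it’s worth, if I ever wanted the lid handling to be done by logind rather than by Plasma, the knobs in /etc/systemd/logind.conf that look relevant are HandleLidSwitch= and HandleLidSwitchExternalPower=, the latter covering exactly the on-AC distinction; this is just an untested note to self, since the desktop environment may well inhibit logind’s handling anyway:

# /etc/systemd/logind.conf (excerpt); untested, and only effective if the
# desktop environment does not grab the lid switch itself
HandleLidSwitch=suspend
HandleLidSwitchExternalPower=ignore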

Speaking of charging, it turns out that the Apple 65W USB-C charger also works great with the XPS 13. Unfortunately it comes without a cable, and particularly with Apple USB-C cables your mileage may vary. It seems to be fine with the Google Pixel phone cable though. I have not tried measuring how much power it draws or which power mode it uses, among other things because I wouldn’t know how to query the USB-C controller to get that information. If you have suggestions I’m all ears.
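
The closest I can get for now is reading the battery’s charge rate from sysfs, which tells me how fast the battery is charging but nothing about the negotiated USB-PD profile; the exact attribute names depend on the battery driver, and some expose power_now instead of current_now:

cat /sys/class/power_supply/BAT0/status        # Charging / Discharging / Full
cat /sys/class/power_supply/BAT0/voltage_now   # microvolts
cat /sys/class/power_supply/BAT0/current_now   # microamps (or power_now, in microwatts)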

Otherwise the laptop appears to be working great for me. I only wish I could wake it up from sleep without opening the lid when using it docked, but that’s a minor thing.

The remaining problems are software. For instance, Plasma sometimes crashes when I dock the laptop and the external monitor comes online. And I can’t reboot while docked, because the external keyboard (connected through the USB-C dock) is not able to type in the password for the full-disk encryption. Again, this is a bother but not a big deal.

New laptop: Dell XPS 13 9360 “Developer Edition”

Since, as I announced some time ago, I’m moving to London in a few months, I’ve been spending the past few weeks organizing the move, deciding what I’ll be bringing with me and what I won’t. One of the things I decided to do was to try to figure out which hardware I would want with me, as I keep collecting hardware both for my own special projects and just out of curiosity.

I decided that having as many laptops as I have right now is a bad idea, and that it is due time to consolidate on one or two machines if possible. In particular, my ZenBook has been showing its age, with only 4GB of RAM, and my older Latitude, which is now over seven years old, no longer has a working battery (although with 8GB of RAM it would actually still be quite usable!), plus it’s way too bulky for me to keep travelling with, given my usual schedule. Indeed, to have something I could play with on the road, I ended up buying an IdeaPad last year.

So, thanks to the lowered value of Sterling (which I won’t be particularly happy about once I start living there), I decided to get myself a new laptop. I settled on the Dell XPS 13, which is not quite an Ultrabook but is still quite handy and small. The killer feature for me is the USB-C connector and the ability to charge through it: my work laptop is an HP Chromebook 13, which also charges over USB-C, and that gives me the ability to travel with a single power brick.

I ordered it from Dell UK, had it delivered to Northern Ireland and then reshipped to me, and it arrived this past Friday. The configuration I bought is the i7, 16GB, QHD (3200×1800) display with Ubuntu (rather than Windows 10). I turned it on at the office, as I wanted to make sure it was all in one piece and working, and the first surprise was the musical intro it started up with. I’m not sure whether it’s Ubuntu’s or Dell’s, but it’s annoying. I couldn’t skip it with the Esc key, and I didn’t figure out how to make it shut the heck up (although that may have been because I hadn’t yet realised that the function keys default to their special meanings).

I also found myself confused by the fact that Dell only provides the BIOS (well, EFI) update file in MS-DOS/Windows format. It turns out that not only can the firmware itself read the file natively (after all, EFI uses PE itself), but Dell is also providing the firmware through the LVFS service, which you may remember from Richard Hughes’s blog. The latest firmware for this device is not currently available there, but it should be relatively soon.

Update (2017-07-26): The new firmware was released on LVFS and I tried updating with the fwupd tool. Unfortunately the Arch Linux package does not work at all on my Antergos install. I’m not sure whether that’s because the Antergos install changes some subtle parameter compared to a plain Arch Linux EFI install, or because the package is completely broken. In particular, it looks like the expected paths within the EFI System Partition (ESP) are completely messed up, and fwupd does not appear to identify them dynamically. Sigh.
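
For reference, the fwupd flow that is supposed to apply the update, once the packaging (or my ESP layout) is sorted out, is roughly the following:

fwupdmgr refresh        # fetch the latest metadata from LVFS
fwupdmgr get-updates    # list devices with pending firmware updates
fwupdmgr update         # stage the capsule on the ESP and reboot to apply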

The hardware of the laptop is pretty impressive, although I’m not a fan of the empty space near the top, which looks to me like an easy catch for cables and ties and makes me afraid for its integrity. The device is also quite a bit denser than I was expecting: it’s noticeably heavier than the ZenBook, although it packs much more of a punch. The borderless screen is gorgeous, but it also means the webcam sits in the bottom corner of the screen rather than at the top, likely making videocalls awkward. The keyboard takes a bit of getting used to, because it’s not quite as good as the ZenBook’s, but it’s still of fairly good quality.

By the way, one of the first things I did was replace the Ubuntu install with Antergos (which is effectively Arch Linux with an easier installer). This did mean disabling Secure Boot, but I guess I’ll have to live with that until we get a better idea of how to do Secure Boot properly on Linux together with full-disk encryption.

Once I got home, I did what I do with my work laptop too: I connected it to my Anker USB-C dock, and it seemed to work alright, except for some video corruption here and there, particularly in Konsole. Then I started noticing the lack of sound — but that turned out to be a red herring. The answer is that both the on-board speakers and the HDMI audio output are wired through the same sound interface and just appear as different “profiles”.

It wasn’t until I had already been compiling webkitgtk for an hour that I noticed the laptop wasn’t actually charging, and I thought the problem was with the dock. Instead, the answer turned out to be that the HP Chromebook 13 charger is not compatible with the XPS 13, while the Chromebook Pixel charger works fine. Why the difference? I’m not sure; I guess I need to find a way to inspect what is negotiated on the USB-C bus to understand what the problem is with that charger. It should not be a matter of wattage, as both the HP charger and the original Dell charger provided with the laptop are 45W.

Speaking of the USB-C dock, there is a funny situation: if the laptop boots with it connected and the lid closed, it does not appear to turn the monitor on (all is fine if it boots with the dock disconnected). Also, it looks like the default display manager provided by Antergos only shows the login page on the laptop’s screen, making it hard to log in at all. And in the usual mess that multi-screen support is on modern Linux desktops, Plasma needs to be killed and restarted to switch between the two monitors. Sigh!

As for the screen corruption I noted earlier, it seems to be fixed by one of two things: upgrading to Linux 4.12 (from the Arch Linux testing repository) or changing the compositor’s setting from OpenGL 2.0 to OpenGL 3.1. I think it may be the latter, but I have no intention of verifying that just yet.

It looks like I’ll be very happy with this laptop; I just need to figure out some new workflows so that I don’t feel out of place not having Gentoo on my “workstation”.

Also, to keep with my usual Star Trek host naming, this new laptop is named everett after the (non-canon) USS Everett, which is the same class (Nova-class) as my previous laptop, which was named equinox.

Virtual rewiring, part two: the EC

In the previous post I explained what I want: to be able to use the Caps Lock key as Fn, at least for the arrow keys, so that they can provide Page Up/Down, Home and End (the navigation keys).

After that post, I was provided with a block schematic of my laptop, identifying the EC in the system as an ITE IT8572. This is a bit unfortunate, because ITE is not known for sharing their datasheets easily, but at least I know that the EC is based on the Intel 8051 (also known as MCS-51), with a 64KiB flash ROM.

Speaking of the ROM, it’s possible to extract the EC firmware from the ASUS-provided update files using (unmodified) UEFITool. Within the capsule, the EC firmware is the first padding entry (the non-empty one); extract it with the tool and you have the actual ROM image file. That part is easy.

I was also pointed at Moravia Microsystems’ MCU 8051 IDE, which is a fully-functional IDE for developing for 8051 MCUs. I submitted an ebuild for it while at 33C3, so that you can just emerge mcu8051ide to have a copy installed. It supports some optional runtime dependencies that I have not actually made optional yet. The IDE supports converting binary files to Intel HEX (why on Earth Intel HEX is still considered a good idea, I’m not sure), disassembling binaries, and it comes with its own (Tcl/Tk) assembler.

Unfortunately, this has not brought me quite as close as might be expected, given that I have the firmware, a disassembler and an assembler. The reasons are not quite obvious either.

The first problem is that the IDE is unable to re-assemble the code it itself produces. Since disassembly (unlike decompilation) should be a lossless procedure, that was the first thing I tried, and it failed. There appear to be at least two big problems: the first is that the IDE does not have a configuration for an 8051 with a 64KiB ROM (even though that is the theoretical maximum ROM size for the device), and the other is that, since it has no way to mark which parts of the ROM are data and which are code, it disassembles the data in the ROM as instructions that are not actually valid for the base 8051 instruction set.

So, I decided to look into other options; unfortunately I found only a DJGPP-era disassembler – which produces what looks like a valid assembly file, but one that can’t be re-assembled – and an apparently promising Python-based one that failed to even execute due to a Python syntax error.

I have thus started working on writing my own, because why not, it’s fun, and it wouldn’t be the first time I go parsing instructions manually — though the last time, I was in high school and I wrote a very dumb 8086 emulator to try my homework out without having to wait in the queue at the lab for the horrible Rube Goldberg Machine we were using. This was some 15 years ago by now.

But back to present: to be able to write a proper disassembler that does not suffer the problems I noted above, I need to make sure I have a test that checks that re-assembling the disassembled code produces the same binary ROM as the source. Luckily, there is an obvious way to do so incrementally: you just emit every single byte of the ROM as a literal byte value. It’s not too difficult.
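
The check itself is trivial to script; the file names below are just placeholders for my work-in-progress tools:

./disasm ec212.bin > ec212.asm                     # hypothetical skeleton disassembler
# re-assemble ec212.asm (with the IDE or any other 8051 assembler) into ec212-rebuilt.bin
cmp ec212.bin ec212-rebuilt.bin && echo "round-trip OK"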

Except, which syntax do you use for those literal bytes? The disassembler didn’t use any literal bytes (it instead emitted extended instructions for bytes that would not otherwise map to the base ISA), so I spent some time googling for 8051 syntax, and I found a few decent pointers but nothing quite right. From what I can tell, the SDCC assembler should accept the same syntax as Alan Baldwin’s assembler suite, except for some of the more sophisticated instructions, as SDCC forked an earlier version of the same software. Even just opening the website should make it clear we’re talking serious vintage code here!

This syntax is also significantly different from the syntax used by MCU 8051 IDE, though. Admittedly, I was hoping to use the SDCC assembler for this (Baldwin’s is not quite obvious to build at first, as it effectively only provides .bat files for that) since that can be more easily scripted. The IDE is a Tcl/Tk full environment, and its assembler is very slow from what I can tell. Unfortunately, I have yet to find a way for the SDCC-provided assembler to produce any binary file. It’s all hidden behind flags and multi-level object files, sigh!

So I decided to at least make a file that assembles with the IDE. According to this page, the syntax should be quite simple:

LABEL: DB 2EH

The DB pseudo-instruction defines a literal byte or bytes. And that sounds exactly like what I need! So I just made my skeleton disassembler emit every byte with this syntax, and… it fails to assemble. It looks like the IDE assembler only supports DB with decimal numbers, which makes them harder to read and to match against the hexdump -C output I’ve been using to compare the binaries. Even after fixing that, things still did not build right, but I have yet to look deeper into it.
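
As a stop-gap, generating that decimal-only DB listing straight from the ROM is a one-liner (assuming GNU od), modulo whatever header or END directive the IDE assembler wants around it; it at least gives me a known-good baseline to compare against:

od -An -v -tu1 ec212.bin | awk '{for (i = 1; i <= NF; i++) print "DB " $i}' > ec212-bytes.asm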

Given that I’m at 33C3, and there was a talk about radare2 already (although I have not seen it yet, I’ll watch it at home), I decided to try using that, as it also already supports 8051, at least in theory. I say in theory because:

% radare2 -a 8051 ec212.bin
[0x00000000]> pd
*** invalid %N$ use detected ***
zsh: abort      radare2 -a 8051 ec212.bin

This is a known problem which is still unfixed, and that has been de-prioritized already, so if I want it fixed, I’ll have to fix it myself.

At this point, I don’t have much to work with. I started a very skeletal version of a disassembler, so I can start building the parsing I need. I have not done the paperwork to release it yet, but I hope to do so soon, and to develop it in the open as usual. I will also have to do some paperwork to submit a few fixes for MCU 8051 IDE, to support at least the basics of the ITE controller I have, guessed from the firmware itself rather than from the datasheet, to which I have no access as of yet.

If anybody knows anything I don’t and can point me to useful documentation, I’d really be happy to hear it.

Virtually rewiring laptop keyboards

You may remember I had problems with my laptop a few months ago, because it refused to boot until I unplugged the CMOS battery. This, by the way, happened again, to the point that I need to remember to buy a new CMOS battery the next time I’m in the States (the European prices are insane, and I’ll be back reasonably soon). This is the start of a story about the same laptop, but it has nothing to do with the CMOS battery this time.

I have recently replaced my work laptop, the MacBook Pro I was using, with an HP Chromebook. If you’re curious about my reasons, they boil down to traveling too much and the MBP being too heavy. I briefly considered an Air, but given the direction they are going, the Chromebook works better for my work needs.

If you didn’t know, Chromebooks don’t come (by default) with a Caps Lock key. Maybe it’s a public service, making it more difficult to shout on the Internet; maybe it’s because whoever designed the keyboards was nostalgic for the Control key in place of Caps Lock, I’m not sure. Instead of moving the Control key, they introduced a new Search button, which triggers the search box as well as functioning as an “Fn” modifier, to access features such as Page Up/Down, Home and End. I liked the approach and it’s actually fairly handy. Unfortunately it means that I now have a third way (in addition to the Asus and the Dell keyboards) to access these functions, which makes my muscle memory suffer badly. It also meant I kept typing all-caps on my Asus laptop when I tried using the modifier (and failed), and that was pissing me off.

On Apple USB and Bluetooth keyboards there is an Fn button, but it’s handled entirely in software. Indeed, if you have one of those keyboards, particularly the 60% version (the ones without a numpad and a separate island of movement keys), and you want to use it on Linux, you need to enable a kernel module to implement the correct emulation. I know that because it bit me when they first introduced it, as I was using a full-size Apple keyboard instead, and the numlock emulation was making me unable to type.

This is, give or take, the way it works on the Chromebook, mostly out of the necessity of sharing the Fn modifier with the Search button. And it allows you to change which key is Search/Fn in software, which is handy. Why can’t I do that with my Asus laptop? Well, I can at least disable Caps Lock and replace it with Control like so many people already do; after all I use Emacs, and they tell me it’s much better to use Emacs that way (I’m not sure about that: I tried it briefly, but my muscle memory works better with the pinky Control). But that’s not exactly what I want.

I could try remapping Ctrl+arrows to behave the same way as Fn+arrows, but that’s not quite what I want either, because then I lose the skip-ahead/forwards behaviour that I want from Ctrl+arrows. So I need to come up with alternatives. Much as I wish this were going to be a step-by-step procedure to fix it, it’s not; it’s instead a musing on what may or may not work.

The first option would be to implement the Fn modifier in software, either at the kernel, X11 or libinput level. This could actually be interesting as a way to make the Fn behaviour of Apple keyboards generic. I don’t really know where to start with that one, because between systemd, libinput and Wayland the input stack has changed so much that I’m completely lost.

The other option is more daring and possibly more interesting: rewiring the laptop keyboard by changing what the keys actually send over the PS/2 bus. As Hector suggested over Twitter, the keyboard is handled as part of the Embedded Controller (EC) firmware, and modifying a laptop’s EC is not unheard of, although a quick search doesn’t turn up anyone doing so on an Asus laptop to change the keyboard scancodes.

Does it mean I can do it? Does it mean I will? I’m not sure yet. Part of the problem is that playing around with an EC is the kind of thing that can easily brick your laptop, and this is currently my only Linux environment in which I do actual work. I could try to re-target my HTPC to be a workstation, and then hack on this laptop like it’s disposable, but the truth is that I spend enough time in the air that I really want to have a laptop, at least as a secondary system.

The first problem is figuring out how to run the update, and the first step for that is figuring out where the EC firmware is. In Matthew’s posts, he found a promising area of the update file within the image, based on its size and the (known) EC firmware version. In my case I don’t have that luck, since the only version I can see from the Linux host is the BIOS revision, which is 219. On the other hand, looking at the Asus download page, versions 212 and 216 explicitly mention an EC firmware update, so if I am to guess which area of the firmware image is the EC firmware itself, that would at least make it easy to verify whether my guess is right.

But it might be easier than that. UEFITool supports reading these update files, as they are AMI Aptio capsules, and it should then be possible to extract a listing of the object tree with checksums, which would tell you what actually changed between two versions. That would only tell you what changed and not how, but it’s a starting point. Unfortunately, the documentation of the tool itself already points out that many AMI features are not implemented because of the author’s NDA. Of course, the moment you search for the Aptio capsule format you find a post by Nikolaj about the AFU utility.
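
A rough way to get that list, assuming you have already dumped both capsules into directories (for example with UEFITool), is to just checksum everything and diff; the directory names here are made up:

(cd fw212.dump && find . -type f -exec sha256sum {} + | sort -k2) > fw212.sums
(cd fw216.dump && find . -type f -exec sha256sum {} + | sort -k2) > fw216.sums
diff fw212.sums fw216.sums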

This may be a throw-in post just to give out a random idea, or I may follow up with more details, and maybe some code to get the list of changed files in the capsule, but I have not started on this yet and I’m not sure I will. The tools are out there, and it would be an interesting game to play; the problem, nowadays, is mostly time.

Of the two options, implementing a second Fn key (without changing the one that is already there) is obviously the one with the most potential to be useful: if it can be made generic enough, it could be used on any keyboard, laptop or not, and might allow simplifying the Fn key handling for Apple keyboards by moving it away from an Apple-specific driver. So if someone has ideas of where this should fit nowadays, I’m happy to hear about them.

Saving a non-booting Asus UX31A laptop

I have just come back from a long(ish) trip through the UK and the US, and decided it’s time for me to go back to some simple OSS tasks, while I finally convince myself to talk about the doubts I’ve been having lately.

To start on that, I tried to turn on my laptop, the Asus UX31A I got three and a half years ago. It didn’t turn on. This happened before, so I just left it to charge and tried again. No luck.

Googling around, I found a number of people with all kinds of problems with it, one of them being something getting stuck at the firmware level. Given that I had found a random problem with PCIe settings in my previous laptop, which would make it reboot every time I turned it off (but only if the power was still plugged in), I was not completely surprised. Unfortunately, following the advice I read (take out the battery and power it over AC) didn’t help.

I knew it was not the (otherwise common) problem with the power plug, because when I plugged the cable in, the YubiKey NEO-n would turn on, which means power was reaching the board fine.

Then I remembered two things: one of the suggestions was about the keyboard, and the keyboard itself has had problems before (the Control key would sometimes stop working for half an hour at a time). Indeed, once I re-seated the keyboard’s ribbon cable, it turned on again, yay!

But here’s the other problem: the laptop would turn on, light up the caps-lock LED and stay there. And even letting the main battery run out would not be enough to return it to working condition. What to do? Well, I had a hunch, and it turned out to be right.

One of the things I had tried before was to remove the CMOS battery — either I hadn’t kept it out long enough to properly clear the settings, or something else went wrong, but it turned out that removing the CMOS battery allowed the system to start up. That, however, would mean no RTC, which is not great if you start the laptop without an Internet connection.

The way I solved it was as follows:

  • disconnect the CMOS battery;
  • start up the laptop;
  • enter “BIOS” (EFI) setup;
  • make any needed change (such as time);
  • “Save and exit”;
  • let the laptop boot up;
  • connect the CMOS battery.

Yes, this does involve running the laptop without the bottom plate for a while, so be careful about it; on the other hand, it did save my laptop from being stomped into the ground out of sheer rage.

Inspecting and knowing your firmware images

Update: for context, here’s the talk I was watching while writing this post.

Again posting about the Enigma conference. Teddy Reed talked about firmware security, in particular based on pre-boot EFI services. The video will be available at some point; the talk goes into detail about osquery (which I’d like to package for Gentoo), but it also has a lower-key announcement of something I found very interesting: VirusTotal is now (mostly) capable of scanning firmware images from various motherboard manufacturers.

The core of this implementation leverages two open-source tools: uefi_firmware by Teddy himself, and UEFITool by Nikolaj Schlej. They are pretty good, but since this is still in the early stages, there are a few things left to iron out.

For instance, when I first scanned the firmware of my home PC it was reported with a clear malware marker, which made me suspicious – and indeed got ASUS to take notice and look into it themselves – but it looks like it was a problem with parsing the file; Teddy’s looking into it.

On the other hand, sticking with ASUS, my ZenBook shows in its report the presence of CompuTrace — luckily for me I don’t run this on Windows.

This tool is very interesting from many different points of view, because (maybe in due time, as firmware behaviour analysis improves) it will provide information about possibly-known malware (such as CompuTrace) in a firmware upgrade not only before you apply it, but even before you buy the computer.

And this is not just about malware. The information that VirusTotal provides (or, to be precise, that the tools behind it provide) includes details about certificates, which for instance told me that my home PC would allow me to install Ubuntu under SecureBoot, since the Canonical certificate is present — or, according to Matthew Garrett, it will allow an Ubuntu-signed bootloader to boot just about anything, defeating SecureBoot altogether.

Unfortunately this only works for manufacturers that provide raw firmware updates right now. ASUS and Intel both do that, but for instance Dell devices will provide the firmware upgrade only as a Windows (or DOS) executable. Some old extraction instructions exist, but they are out of date. Thankfully, Nikolaj pointed me at a current script that works at least for my E6510 laptop — which by the way also has CompuTrace.

That script, though, fails with my other Dell laptop, a Vostro 3750 — in that case, you can get your hands on the BIOS image by simply executing the updater with Wine (it will fail with an obscure error message) and then fetching the image from Wine’s temporary folder. Neither approach works with the updater for more recent machines such as the XPS 13 (which I’m considering buying to replace the ZenBook), so I should possibly look into extending the script if I can manage to get it to work, although Nikolaj, with much more experience than me, tried and failed to get a valid image out of it.

To complete the post, I would like to thank Teddy for pointing the audience to Firmware Security — I know I’ll be reading a lot more about that soon!

Dell was a definite mistake, and an expensive one

This week, my newly-bought laptop arrived; as I noted I was looking for a computer that had a Trackpoint device, to avoid touching the touchpad while I’m writing. This brought me to exactly two viable alternatives:

  • Lenovo, with their ThinkPads, had a good track record with Linux; they also usually have a decent price (that doesn’t mean they are cheap, but that they are worth their price tag) and in general they were my first choice;
  • Dell, with the Latitude E65xx series, was the second choice; they didn’t have as good a track record, but most people didn’t complain much about them, apart from minor annoyances.

Again, as I said above, Lenovo doesn’t sell directly in Italy, who knows why; unfortunately this is also the reason their prices are quite a bit higher in Italy in general. Also, I don’t have many options for Lenovo resellers in my area; the nearest one (50km from here) sent me the price for “the model I asked for” a week after I asked for it: after VAT it was €300 more than the same model from Lenovo UK. I put “the model I asked for” in quotes because it really wasn’t!

Not only did they still force me to take a Windows license (okay, not too nice but I can live with it), but they also forgot to upgrade the RAM to 4GiB as I asked and, worse, they didn’t provide me with an integrated smartcard reader. Indeed, the first time around, the guy at the reseller’s sales department asked me whether I was sure I wanted a BTO for that, given that all T510s already had (translated, but literally) a “reader of smartcards of SD type, those for photo cameras”. When the price came, they were actually proposing that I buy a T510 plus a smartcard reader, made by Gemalto and Lenovo-branded. Not an ExpressCard reader though: a USB reader. I guess you can tell what the problem is with that; if not, the problem is that I already have a USB card reader, it’s just difficult to use on, say, a train.

So I surveyed the Dell options; the website didn’t allow me to forgo Windows 7, so I called the hotline and asked for a sales person to help me out. They refused right away to switch my keyboard for a US one, which was unfortunate but bearable, then they told me they’d let me know whether it was possible to avoid Windows; two days later they came back with an offer… for the E6510 with Ubuntu Linux rather than Windows 7, base options (no 4GiB of memory, which I asked for, and no webcam, fingerprint reader or RFID reader, which I didn’t ask for and didn’t/don’t need)… but at a €500 premium over the same laptop from the website with Windows 7. For the same price, I could get the highest-level configuration, with a Core i7, all the extras and so on.

So the Dell arrived, and I started installing Gentoo on it; besides a series of quirks, which I googled up, such as the smartcard reader needing patched OpenSC/OpenCT to work, or the fact that the RFID and fingerprint readers do not work under Linux (with Daniel confirming that there is realistically no hope of getting the new ones to work on Linux any time soon), everything seemed to go fine. The network card is definitely stable and fast; the wireless card required me to create a new firmware ebuild, because among the many ucode ebuilds we have in Portage mine was missing, but that was a matter of minutes.

The problem started when it came down to running Xorg. The nVidia card works quite fine with the drivers, so that’s not a problem, but the touchpad, oh the touchpad. At first, for apparently no reason, the touchpad was being recognised as a simple mouse; and contrary to what most people told me about the ThinkPad laptops, the touchpad and the trackpoint in Dell’s hardware are not separate (hardware) devices; they appear as separate devices in Linux because the ALPS driver in the kernel splits them after parsing their different protocols. Unfortunately, both the E6400 and E6500 Latitude models have a new Alps Electric combined GlidePoint device with a protocol that is, as of now, unsupported by the Linux kernel; Ubuntu submitted some patches for it to the kernel, but none that work yet. Right now the device is seen as a standard PS/2 mouse and handled entirely by the BIOS, without special features or settings.

The second problem came when it was time to turn off the laptop: halting the system causes it to reboot instead. I thought the problem was related to ACPI, so I looked up whether there were BIOS updates, and lo and behold, two were around. Unfortunately, applying them is not, on a laptop, the usual matter of running flashrom. Dell used to develop Linux tools for their laptops, including firmware update software; as far as I know they were the first vendor doing so. Unfortunately, they either stopped, or this model is not covered, and the only BIOS updates are provided bundled within the Windows-based flasher (which most likely is not the real flasher at all, given that the system reboots itself before flashing both the BIOS and the firmware of the Embedded Controller).

Now, not everything goes badly, luckily for me. I mailed Matthew Garrett to ask him for pointers on what I could try to get it at least to halt properly, and he suggested a git tree to try, which I’m now building. He also pointed out that there is at least some work going on to solve the Alps touchpad problem (which makes me hopeful that this will be properly solved before the end of the year). And of course he didn’t have to tell me why the external monitor button prints a ‘p’, since I knew the unfortunate reasons already.

On the bright side, the hardware itself looks tremendously nice: the keyboard, while not as good as the Apple aluminium ones, is quite solid and nice to write on, the touchpad is not as invasive as on the MacBook Pro, the monitor is gorgeous, and the laptop has all kinds of expansion ports, including eSATA. The battery lasts a long time even on Linux, even without setting up the governors properly, and so on and so forth. I just hope the few problems will smooth themselves out soon.

Bye Fedora

I’m going to say goodbye to my current Fedora 12 laptop; yes, the one for which I wrote that post about Fedora 10 at the time, which I then updated for Fedora 11. This is not because the laptop broke down, but rather because I ended up getting my MacBook Pro fixed, and that is again my main laptop. While I did want to have a laptop running Linux alongside the MBP running Mac OS X, I finally decided it’s pretty pointless for me.

There are multiple reasons for that; some have nothing to do with Fedora, but a few have. Marginally, maybe, but they have. The first problem is, once again, the video card. While it’s not like it has been easy with Yamato’s new one, I have got to say that two and a half months later I’m definitely glad I got it: KMS with 2.6.32 (and a git userland — I need to check whether that’s still needed, but I guess so for a while still) works like a charm, I’m able to use Compiz without a glitch, and it’s perfectly stable. With the on-board nVidia card of that laptop, it’s a totally different story. The nvidia binary driver for that card is not (yet?) available for Fedora 12, and the nouveau driver is… useless. It’s not just a matter of lacking 3D acceleration; it’s also totally broken for suspension, which at least worked fine with the proprietary driver.

But it goes beyond hardware support; you have probably all heard about the thunderstorm around Fedora’s original decision to allow any user with console access to install new packages without the root password. I actually think that for Fedora’s target, that’s a pretty good move: it limits itself to installing and upgrading signed packages, which has limited security implications, and it’s just a default. For most users, having console access is as good as having root’s password, so it shouldn’t really matter; for desktop usage, that’s pretty much true already. Smarter, more security-paranoid users can easily change that setting. At any rate, the thunderstorm (or crapfest if you prefer) got to them so much that they changed the default again; too bad. Unfortunately, it seems I got a different problem instead: my PackageKit interface is totally broken and I cannot use it at all, so I have to use yum to upgrade my box, which is definitely not so nice.

At first I thought it had to be related either to the fact that I upgraded from F11 or to my use of RPM Fusion, but it turns out that the PackageKit interface is just as broken on a box that a customer of mine set up for me last week, so that I could install a toolchain chroot for them. I ended up using yum there as well; no clue what the problem is with that.

And since upgrading to F12 I have found another problem as well: I already ranted about the fact that I couldn’t get Bluetooth dial-up to work with my Nokia phone and had to use the cable instead; following Adam’s suggestion I also got the JoikuSpot application, which turns the phone into an (ad-hoc) hotspot so I can use it via WLAN without configuring anything. The latter approach is, unfortunately, only viable if you’ve got your phone’s power adapter at hand, since it lasts about an hour on my E75, and the other day (at my customer’s office) I didn’t have it available. I did have the cable, though, left in the bag since the last time I used it; unfortunately, when I tried to connect with that, exactly like I did in F11, NetworkManager decided to fail. And of course neither DUN nor PAN seems to be available via Bluetooth in F12, just as in F11.

So I’m considering whether I need that laptop or not: the MBP starts up in less than two seconds, thanks to the fact that I always leave it suspended to RAM (and that’s faster than Google’s Chrome OS… I wonder why people seem to chase start-up time rather than fixing suspension support, bah); the MBP lasts more than four hours on its battery; the MBP has a much sleeker design, which makes it handier, and I don’t have to go around with the clunky power supply (not only because the MBP’s is smaller, but also because I have my mom’s supply downstairs if I’m running low on battery); and the MBP (with OS X at least) can connect properly, via Bluetooth, to the phone and thus the Internet (most of the time at least). So in the end, I’m not going to use the Compaq for much.

I’ll create a Fedora 12 virtual machine on Yamato for testing my projects there, where most of the previous notes about stuff not working properly will be moot points.

*Post scriptum: I wrote the draft for this article a couple of days ago, and in the meantime I set up the Fedora 12 virtual machine I mentioned in the last paragraph; it was that way, by trying out virtio, that I found the n-th qemu/kvm quirk that made me drop the “proper” qemu. Unfortunately, with that new install (from scratch, not an upgrade) I found another set of problems.*

*The remote desktop support in GNOME is totally broken: I can see with tcpdump that the request arrives, but no reply is ever given. If you set a hostname in three parts (say, fedora12.qemu.local), Avahi will advertise fedora12.local instead. system-config-services is not installed by default, and the first time I installed it I had to reboot, otherwise I would only get crashes. One default cron job causes SELinux to report invalid accesses to /var/lib… all in all, it seems to me like Fedora 11 was way more polished!*

First impressions with Fedora Core^W 10

I know I shouldn’t be writing my impressions based on the current state of FC10; I should wait for FC11 to be released and try that one, which will probably address most of the problems I’m currently experiencing, as well as give me a (probably) working nouveau driver instead of the damned nvidia proprietary binary crap.

First I’d like to answer the question a few people asked me when I said that I put Fedora on this laptop; as I have said before, the laptop is likely crappy at a hardware level. It sure was cheap, which is all that I was looking for in a laptop besides the presence of a useful keyboard. Putting Gentoo on it wasn’t impossible, but it was more work than I could afford; using Mac OS X on the MacBooks is mostly a pragmatic act of not wanting to waste an enormous amount of time.

I could have used any distribution at all, but I chose Fedora on both technical and political grounds. Technical, because I know quite a few people hacking on RedHat and Fedora, and I trust them to be pretty nice guys, capable and driven more by technical than political reasoning when choosing what to do (okay, there are exceptions here as well as anywhere else, but still…). Political, because I dislike Ubuntu’s policies and the Debian way of not upstreaming patches enough.

The installation went fine, although without X because of the nvidia card; I configured the card manually with the RPM Fusion packages and set up a few extra things. The installed system is pretty nice, fast, responsive and well-configured. Suspension works! That I really didn’t expect.

Unfortunately there are a few problems with a couple of programs: Pidgin does not seem to be able to reconnect after losing the connection (no idea why), and xchat-gnome does not seem to complete authentication at all. Besides these two, which I’ll check again once FC11 is released, I didn’t have much to note on the software side.

Hardware-side, there is one feature that I’m missing: while the laptop has a button that I guess is supposed to turn the touchpad on and off, it doesn’t seem to work. It’s not a showstopper, but if anyone knows how to fix it or work around it, I’d be glad to hear it, given that I’m already writing a lot on this laptop.

I could use a package for smuxi, which would allow me to set up a single IRC gateway on my home network; I guess I should look into Fedora packaging one day; who knows, it might turn out handy for other projects as well. Speaking of smuxi and single gateways, I’m starting to hate Pidgin and its OTR: not only is it a mess because, if somebody switches to a client that does not request OTR by default, it does not notify you of messages at all (I’ve lost count of how many times that happened to me), but it also goes positively nuts when two clients are connected to the same account (like I’m doing now with GTalk). I think I’ll just turn it off by default on both clients and only use it on request. And since there is no easy way to connect two clients to the same “Windows Live Messenger” account (MSN for the old friends), I also have another kind of obnoxious problem if I turn off one Pidgin when I go to chat from the other.

Anyway that’s a topic for another post.

A new saga begins

So, as I wrote a few months ago, my last MacBook Pro is currently broken, and I have neither the time nor the money to repair it (the cost of repairing it is almost as much as a new basic MacBook); unfortunately this caused me quite a few problems because, lacking a laptop, the idea of relaxing in front of the TV, watching a movie and writing down a blog post or article was just not possible.

I thought about getting one of those shiny “netbooks”, but given that I want to write a lot on it, not just take a few spare notes, I didn’t want to go that way, as most of the keyboards I’ve seen are quite sucky. The only netbooks I had seen with decent keyboards started to be way out of budget for a 10” screen: over €400, or even over €1000 for the smaller, nicer Sony versions. I know this because I spent the whole of yesterday afternoon looking at notebooks and netbooks with a friend of mine who was looking for a small unit (small size mattered for him); in the end, while he didn’t find what he was looking for, I came home with a new laptop.

Saying that it’s “not very good” would be quite a compliment for this laptop, but it gets the job done: it’s a cheap (€450) Compaq/HP notebook, with a 15.6” screen, a 160GB hard drive and 4GB of RAM, a Sempron CPU and (unfortunately) an nVidia graphics card. It came with Vista, but that didn’t even last long enough for me to skim through the settings; I replaced it not with Gentoo (not wanting to waste too much time on a laptop that might not outlast its warranty) but with Fedora Core 10 (I would have liked to put FC11 on it, but for now I skipped that; I’ll upgrade, or reinstall, once it’s out, and at that point I’ll also switch from the nvidia proprietary crap to the open nouveau driver). Besides the graphics card, which needed some fighting with, the laptop itself seems to work quite decently: the ath5k wireless card, Bluetooth and audio were all recognized, Fedora installed fine, and even suspension works, to some extent.

At any rate, with this I should finally be able to watch TV and still write some stuff, so that I don’t waste all my time. Which is actually quite good, because I don’t tend to like staying put in front of the TV screen alone (it’s different when I’m at the cinema, but even that I am just getting used to; before last November I had never had the pleasure of going with friends, only with school, and that, man, was boring). So await more blog posts from me in the next few days, hopefully.