EFF’s Panopticlick at Enigma 2016

One of the things I was most interested to hear about at Enigma 2016 was news about EFF’s Panopticlick. For context, here is the talk from Bill Budington:

I wrote before about the tool, but it has recently been reworked and rebranded as a platform for promoting the EFF’s Privacy Badger, which I don’t particularly care for. Luckily for my purposes they still provide the detailed information, and this time around they make it more prominent that they rely on the fingerprintjs2 library for it. Which means I could actually try to extend it.

I tried to bring up one of my concerns at the post-talk Q&A at the conference (the Q&A was not recorded), so I thought it would be nice to publish a few comments about the tool as it stands right now.

The first comment is this: neither Panopticlick nor Privacy Badger considers the idea of server-side tracking. I have said it before, and I will repeat it now: there are plenty of ways to identify a particular user, even across sites, just by tracking behaviours that can be observed passively on the server side. Bill Budington’s answer to this at the conference was that Privacy Badger allows cookies only if there is a policy in place from the site, counting on this policy being binding for the site.

But this does not mean much — Privacy Badger may stop the server from setting a cookie, but there are plenty of behaviours that can be observed without the help of the browser, or, even more interestingly, with the help of Privacy Badger, uBlock, and other similar “privacy conscious” extensions.

Indeed, not allowing cookies is already a piece of trackable information. And that’s where the problem with self-selection, which I already hinted at before, comes in: when I ran Panopticlick on my laptop earlier, it told me that one out of 1.42 browsers has cookies enabled, that is, roughly 70% of browsers, leaving about 30% with cookies disabled. While I don’t have access to hard statistics on this, I do not think that is a realistic number for the browser population at large.

If you connect this to the comments NSA’s Rob Joyce made at the closing talk, which unfortunately I was not present for, you could say that the fact that Privacy Badger is installed, and fetches a given path from a server trying to set a cookie, is a good way to figure out information about a person, too.

The other problem is more interesting. In the talk, Budington briefly introduces the concept of Shannon entropy, although not by that name, and gives an example of the different amounts of entropy provided by knowing someone’s zodiac sign versus knowing their birthday. He also points out that these two pieces of information are not independent, so you cannot sum their entropies together, which is indeed correct. But there are two problems with that.
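The non-additivity is easy to see with a quick back-of-the-envelope calculation. Here is a minimal sketch, assuming (unrealistically) uniform distributions over signs and birthdays:

```python
import math

def bits(outcomes: int) -> float:
    """Entropy in bits of a uniformly distributed value with `outcomes` possibilities."""
    return math.log2(outcomes)

zodiac = bits(12)     # about 3.58 bits
birthday = bits(365)  # about 8.51 bits

# The zodiac sign is fully determined by the date of birth, so learning
# both reveals only the birthday's entropy, not the sum of the two.
combined = bits(365)  # still about 8.51 bits, not 3.58 + 8.51
```

Adding the two figures together would overstate what the tracker learns by more than three bits, which is exactly the mistake described below.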

The first is that the Panopticlick interface does seem to treat all the information it gathers as at least partially independent, and indeed shows a number of entropy bits higher than its single highest entry. But it is definitely not the case that all entries are independent. Even leaving aside browser-specific things such as the type of images requested, for many languages (though not English) there is a timezone correlation: the vast majority of Italian users will report the same timezone, either +1 or +2 depending on the time of year; sure, there are expats and geeks, but they are definitely not as common.

The second problem is that there is a more interesting approach to take when you are handed key/value pairs of information that should not be independent, but are submitted independently. Going back to the example of date of birth and zodiac sign: the entropy calculation in that example starts from facts, where people cannot lie — whereas in any database of registered users, I’m sure January 1st is skewed, with many more than 1/365th of the users claiming it.

But what happens if the information is gathered separately? If you ask a user both their zodiac sign and their date of birth separately, they may lie. And when (not if) they do, you may have a more interesting piece of information. Because if you have a network of separate social sites/databases in which only one user ever selects being born on February 18th yet being a Scorpio, you have a very strong signal that it might be the same user across them.
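To illustrate, here is a small sketch with entirely made-up records: an inconsistent (birthday, sign) pair that shows up exactly once on each of two sites is a strong linking signal, while the truthful pair, shared by several users, is not:

```python
# All usernames and records below are hypothetical.
site_a = {
    "alice":   ("02-18", "Aquarius"),  # truthful pair, shared with others
    "anna":    ("02-18", "Aquarius"),
    "mallory": ("02-18", "Scorpio"),   # a lie: February 18th is not Scorpio
}
site_b = {
    "bob":     ("02-18", "Aquarius"),
    "bert":    ("02-18", "Aquarius"),
    "m4llory": ("02-18", "Scorpio"),
}

def unique_pairs(db):
    """Map each (birthday, sign) pair occurring exactly once to its user."""
    seen = {}
    for user, pair in db.items():
        seen.setdefault(pair, []).append(user)
    return {pair: users[0] for pair, users in seen.items() if len(users) == 1}

# A pair that is unique on both sites links the two accounts.
a, b = unique_pairs(site_a), unique_pairs(site_b)
links = [(a[p], b[p]) for p in a if p in b]
```

The common, consistent pair provides no link at all; the lie does, precisely because it is so improbable.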

This is the same situation I described some time ago, of people changing their User-Agent string to try to hide, only to create unique (or nearly unique) signatures of their passage.

Also, while Panopticlick will tell you if the browser is doing anything to avoid fingerprinting (how?), it still does not seem to tell you if any of your extensions are making you more unique. And since it’s hard to tell whether some bit of JavaScript is trying to load a higher-definition picture, or hide pieces of the UI for your small screen, rather than telling the server about your browser setup, it is not as if the server cares that you disabled your cookies…

For a more proactive approach to improving users’ privacy, we should ask more browser vendors to do what Mozilla did six years ago and sanitize their User-Agent content. Currently, Android mobile browsers report both the device type and the build number, which makes them much easier to track, even though the suggestion up to now has been to use mobile browsers because they look more like each other.
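What protects you here is the size of your anonymity set, that is, how many other users report exactly the same string. A toy sketch with made-up User-Agent values shows the difference between a frozen, sanitized string and one that leaks the device build:

```python
from collections import Counter

# Hypothetical strings: three users behind a frozen, generic UA, and one
# whose browser leaks device model and build number.
user_agents = [
    "Mozilla/5.0 (Android 6.0; Mobile; rv:44.0) Gecko/44.0 Firefox/44.0",
    "Mozilla/5.0 (Android 6.0; Mobile; rv:44.0) Gecko/44.0 Firefox/44.0",
    "Mozilla/5.0 (Android 6.0; Mobile; rv:44.0) Gecko/44.0 Firefox/44.0",
    "Mozilla/5.0 (Linux; Android 6.0; Nexus 5X Build/MDB08L) Chrome/46.0",
]

anonymity_set = Counter(user_agents)
# The generic string is shared by three users; the leaky one is unique,
# making that session trivially trackable on its own.
```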

And we should start wondering how much a given browser extension adds to or subtracts from the uniqueness of a session. Because I think most of them are currently adding to the entropy, even those designed to “improve privacy.”

Inspecting and knowing your firmware images

Update: for context, here’s the talk I was watching while writing this post.

Again posting about the Enigma conference. Teddy Reed talked about firmware security, in particular pre-boot EFI services. The video will be available at some point; the talk goes into detail about osquery (which I’d like to package for Gentoo), but it also carried a lower-key announcement of something I found very interesting: VirusTotal is now (mostly) capable of scanning firmware images from various motherboard manufacturers.

The core of this implementation leverages two open-source tools: uefi_firmware by Teddy himself, and UEFITool by Nikolaj Schlej. They are pretty good, but since this is still in the early stages, there are a few things left to iron out.

For instance, when I first scanned the firmware of my home PC, it was reported as clearly containing malware, which made me suspicious – and indeed got ASUS to take notice and look into it themselves – but it turned out to be a problem with parsing the file, which Teddy’s looking into.

On the other hand, sticking with ASUS, my ZenBook shows in its report the presence of CompuTrace — luckily for me, I don’t run Windows on it.

This tool is very interesting from many different points of view, because it will (maybe in due time, as firmware behaviour analysis improves) provide information about possibly-known malware (such as CompuTrace) in a firmware upgrade not just before you apply it, but even before you buy the computer.

And this is not just about malware. The information that VirusTotal provides (or, to be precise, the tools behind it do) includes information about certificates, which for instance told me that my home PC would allow me to install Ubuntu under SecureBoot, since the Canonical certificate is present — or, according to Matthew Garrett, it will allow an Ubuntu-signed bootloader to boot just about anything, defeating SecureBoot altogether.

Unfortunately this only works for manufacturers that provide raw firmware updates right now. ASUS and Intel both do that, but Dell, for instance, provides firmware upgrades only as Windows (or DOS) executables. Some old extraction instructions exist, but they are out of date. Thankfully, Nikolaj pointed me at a current script that works at least for my E6510 laptop — which, by the way, also has CompuTrace.

That script, though, fails with my other Dell laptop, a Vostro 3750. In that case you can get your hands on the BIOS image by simply executing the updater with Wine (it will fail with an obscure error message) and then fetching the image from Wine’s temporary folder. Neither approach works with the updater for the more recent XPS 13 (which I’m considering buying to replace the Zenbook): the script fails, and this time Wine is of no help either. I should possibly look into extending the script if I can get it to work, although Nikolaj, who has much more experience than me, tried and failed to get a valid image out of it.

To complete the post, I would like to thank Teddy for pointing the audience to Firmware Security — I know I’ll be reading a lot more about that soon!

Usable security: the sudo security model

Edit: here’s the talk I was watching while writing this post, for context.

I started writing this while at Enigma 2016, listening to the usable security track. I think it’s a perfectly good time to start talking publicly about my experience trying to bring security to a Unix-based company I worked for.

This is not a story of a large company; the one I worked for was fairly small, with five people in it at the time I was there. I’ll use “we”, but I will point out that I’m no longer at that company and this is all in the past; I hope and expect they have improved their practices. When I joined, the company was working on a new product, which meant we had a number of test servers running within the office and only one real “production” server for it running in a datacenter. In addition, a number of servers for a previous product were in production, along with a couple of local testing servers for it.

While there was no gaping security hole in the company (otherwise I wouldn’t even be talking about it!), the security hygiene was abysmal. We had an effective sysadmin at the office for the production server, and an external consultant to manage the network, but the root password (yes, singular) of all devices was also known to the owner of the company, who then complained when I told them I wouldn’t share my password.

One of the few things I wanted to set up there was stronger authentication, and to stop people from accessing everything with root privileges. As a stepping stone I ended up using sudo, at least for the test servers (I never managed to put this into proper production).

We have all laughed at sudo make me a sandwich, but the truth is that it’s still a better security model than running as root, if used correctly. In particular, I asked the boss what they wanted to run as root, and after getting rid of the need for root for a few actions that could be done unprivileged, I set up a whitelist of commands that their user could run without a password. They were mostly happy not to have to log in as root, but it was still not enough for me.
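As a sketch of what such a whitelist looks like, here is a hypothetical sudoers fragment (the username and command are placeholders, not what I actually deployed). Note that sudoers applies the last matching rule, so the NOPASSWD entry has to come after the general one:

```
# Hypothetical /etc/sudoers.d/ fragment; username and command are placeholders.
# Anything as root is allowed, but prompts for the user's own password...
owner ALL=(root) ALL
# ...except this whitelisted routine task, which needs no password at all.
owner ALL=(root) NOPASSWD: /usr/sbin/service testapp restart
```

This keeps the root password out of circulation entirely: the user authenticates with their own credentials, and the safe routine tasks don’t even need that.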

My follow-up ties into the start of this article, in particular the fact that I started writing this while listening to Jon Oberheide. What I wanted to achieve was an effective request for privilege escalation to root — that is, if someone were to actually find the post-it with the owner’s password, they wouldn’t get root access on any production system, even though they might be able to execute some (safe) routine tasks. At the time, my plan involved using Duo Security and a customized duo_unix, so that a sudo request for any non-whitelisted command (including sudo -i) would require confirmation on the owner’s phone. Unfortunately, at the time this hit two obstacles: the pull request with the code to handle PAM authentication for sudo was originally rejected (I’m not sure what the current state of that is; maybe it can be salvaged if it’s still not supported), and the owners didn’t want to pay for the Duo license – even just for the five of us, let alone providing it as a service to customers – even though my demo had them quite happy about the idea of only ever needing their own password (or ssh key, but let’s not go there for now.)

This is just one of many things that were wrong at that company, of course, but I think it shows that even in system administration work, security and usability sometimes do go hand in hand, and a usable solution can make even a small company more secure.

And for those wondering: no, I’m in no way affiliated with Duo; I just find it a good technology, and I’m glad Dug pointed me at it a while back.