Yes, we still need autotools

One of the most common refrains I hear lately, particularly when people discover Autotools Mythbuster, is that we don’t need Autotools anymore.

The argument goes as such: since Autotools were designed for portability to ancient systems that nobody really uses anymore, and most modern operating systems share a common interface, whether that is POSIX or C99, the reasons to keep Autotools around are minimal.

This could be true… if your software does nothing that is platform-specific. Which is indeed possible, but quite rare. unpaper, for instance, has a fairly limited amount of platform-specific code: the lowest-level code it has is for reading and writing files. I could easily have used almost anything else for the build system.

But on the other hand, if you’re doing anything more specific, which usually includes network I/O, you end up with quite a bit more build logic. Furthermore, if you don’t want to pull a systemd and decide that the latest Linux version is all you want to support, you end up having to figure out alternatives, or at least conditionals for what you can and cannot use. You may not want to go as far as VLC, which supports everything between OS/2 and the latest Apple TV, but there is space between those extremes.

If you’re a library, this is even more important. Because while it might be that you’re not interested in any peculiar systems, it might very well be that one of your consumers is. Going back to the VLC example, I have spent quite a bit of time in the past weekends of this year helping the VLC project by fixing (or helping to fix) the build systems of new libraries that are made dependencies of VLC for Android.

So while we have indeed overcome the difficulties of porting across many different UNIX flavours, we still have portability concerns. It is probably true that we should reconsider what Autoconf tests for by default, and in particular there are some tests that are no longer meaningful on modern systems. For instance, the endianness tests were an obvious failure when MacIntel arrived, as the same build would then target both big endian (PPC) and little endian (Intel) — on the other hand, even these concerns are not important anymore, as universal binaries are already out of style.

So yes, I do think we still need portability, and I still think that not requiring a tool that depends on XML-RPC libraries is a good side of Autotools…

TEXTRELs (Text Relocations) and their impact on hardening techniques

You might have seen the word TEXTREL thrown around in security or hardening circles, or used in Gentoo Linux installation warnings, but one thing that is clear is that the documentation around this term is not very useful for understanding why text relocations are a problem. So I’ve been asked to write something about it.

Let’s start with taking apart the terminology. TEXTREL is jargon for “text relocation”, which is once again more jargon, as “text” in this case means “code portion of an executable file.” Indeed, in ELF files, the .text section is the one that contains all the actual machine code.

As for “relocation”, the term relates to dynamic loaders. It is the process of modifying the data loaded from the file to suit its placement within memory. This might also require some explanation.

When you build code into executables, any named reference is translated into an address instead. This includes, among others, variables, functions, constants and labels — and also some unnamed references such as branch destinations on statements such as if and for.

These references fall into two main categories: relative and absolute references. This is the easiest part to explain: a relative reference takes some address as a “base” and then adds to or subtracts from it. Indeed, many architectures have a “base register” which is used for relative references. In the case of executable code, particularly for references to labels and branch destinations, relative references translate into relative jumps, which are relative to the current instruction pointer. An absolute reference is instead a fully qualified pointer into memory, or at least into the address space of the running process.

While absolute addresses are kinda obvious as a concept, they are not very practical for a compiler to emit in many cases. For instance, when building shared objects, there is no way for the compiler to know which addresses to use, particularly because a single process can load multiple objects, and they all need to be loaded at different addresses. So instead of writing the actual final (unknown) address to the file, what gets written by the compiler first – and by the link editor afterwards – is a placeholder. It might sound ironic, but an absolute reference is then emitted as a relative reference based upon the loading address of the object itself.

When the loader takes an object and loads it into memory, it’ll be mapped at a given “start” address. After that, the absolute references are inspected, and each relative placeholder resolved to its final absolute address. This is the process of relocation. Different types of relocation (or displacements) exist, but they are not the topic of this post.
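As a toy illustration of the process (a simplified model, not the actual ELF relocation machinery or its record format), imagine each absolute reference stored on disk as an offset from the start of the object, which the loader turns into a real address once the base is known:

```python
# Toy model of load-time relocation: the on-disk image stores, for each
# "absolute" reference, only an addend relative to the object's start.
# The loader patches each slot once the final base address is known.
def relocate(image, relocations, base):
    image = list(image)  # work on a copy, as the loader works on mapped pages
    for offset, addend in relocations:
        image[offset] = base + addend  # resolve placeholder to absolute address
    return image

# A five-"byte" object whose slot at offset 2 should point at offset 4.
image = [0, 0, 0, 0, 0xAA]
relocations = [(2, 4)]

loaded = relocate(image, relocations, base=0x7F0000)
assert loaded[2] == 0x7F0004  # now an absolute address within the mapping
```

The same object loaded at a different base would get a different patched value, which is exactly why the pages containing the slots must be written to at load time.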

Relocations as described up until now can apply to both data and code, but we single out code relocations as TEXTRELs. The reason for this is to be found in mitigation (or hardening) techniques. In particular, what is called W^X, NX or PaX. The basic idea of this technique is to disallow modification to executable areas of memory, by forcing the mapped pages to either be writable or executable, but not both (W^X reads “writable xor executable”.) This has a number of drawbacks, which are most clearly visible with JIT (Just-in-Time) compilation processes, including most JavaScript engines.

But besides the JIT problem, there is also the problem of relocations happening in the code section of an executable. Since the relocations need to be written to, it is not feasible (or at least not easy) to keep those pages exclusively writable or executable. Well, there are theoretical ways to produce that result, but they complicate memory management significantly, so the short version is that, generally speaking, TEXTRELs and W^X techniques don’t go well together.

This is further complicated by another mitigation strategy: ASLR, Address Space Layout Randomization. In particular, ASLR fully defeats prelinking as a strategy for dealing with TEXTRELs — theoretically on a system that allows TEXTREL but has the address space to map every single shared object at a fixed address, it would not be necessary to relocate at runtime. For stronger ASLR you also want to make sure that the executables themselves are mapped at different addresses, so you use PIE, Position Independent Executable, to make sure they don’t depend on a single stable loading address.

Usage of PIE was for a long while limited to a few select hardened distributions, such as Gentoo Hardened, but it’s getting more common, as ASLR is a fairly effective mitigation strategy even for binary distributions where otherwise function offsets would be known to an attacker.

At the same time, SELinux also implements protection against text relocation, so you no longer need to have a patched hardened kernel to provide this protection.

Similarly, Android 6 is now disallowing the generation of shared objects with text relocations, although I have no idea if binaries built to target this new SDK version gain any more protection at runtime, since it’s not really my area of expertise.

Impressions of Android Wear in everyday life

All readers of this blog know by now that I’m a gadgeteer. I have been buying technogizmos at the first chance, money permitting, and I was thus an early adopter of ebooks back in the day. I have, though, ignored wearables for various reasons.

Well, that’s not strictly true — I did try Google Glass, in the past year. Twice, to be precise. Once the “standard” version, and once a version with prescription lenses – not my lenses though, so take it with a grain of salt – and neither time did it excite me. In particular, the former wouldn’t be an option due to my need for prescription glasses, and the latter is a terrible option because I have the impression that the display obstructs too much of the field of vision in that configuration.

_Yes, I know I could wear contact lenses, but I’m scared of them so I’m not keeping them in mind. I’m also saving myself the pain in the eye for when smart contact lenses will tell me my blood glucose levels without having to prick myself every day._

Then smartwatches became all the rage, and a friend of mine actually asked me whether I was going to buy one, since I seemed to be fond of accessories… well, the truth is that I’m not really that fond of them. It just looks that way because I always have a bag on me and I like hats (yup, even fedoras, not trilbies, feel free to assassinate my character for that if you want.)

_By the way, the story of how I started using satchels is fun: when I first visited London, I went with some friends of mine, and one of the things we intended to do was visit the so-called Gathering Hall that Capcom set up for players of Monster Hunter Freedom Unite. My options for carrying the PSP around were trouser pockets or a cumbersome backpack — one of my friends had just bought a new bag at a Camden Town stall which fit the PSP perfectly, and he had space to make the odd buy and not worry where to stash it. I ended up buying the same model in a different colour._

Then Christmas came and I got a G Watch as a gift. I originally wanted to just redirect it to my sister — but since she’s an iPhone user that was not an option, and I ended up trying it out myself. I have to say that it’s an interesting gadget, which I wouldn’t have bought by myself but I’m actually enjoying.

The first thing you notice when starting to use it is that its main benefit is stopping you from turning on your phone display — because you almost always do that for two reasons: to check the time and to check your notifications, both things you can now do by flicking your wrist. I wonder if this counts as security, as I’ve been “asked the time” plenty of times around Dublin by now and I would like to avoid a repeat.

Of course during the day most of the phone’s notifications are work-related: email asking me to do something, reminders about meetings, alerts when I’m oncall, … and in that the watch is pretty useful, as you can silence the phone and rather have the watch “buzz” you by vibrating — a perfect option for the office where you don’t want to disturb everybody around you, as well as the street where the noise would make it difficult to hear the notification sounds — even more when you stashed the phone in your bag as I usually do.

But the part that surprised me the most in terms of usefulness is using it at home — even though things got a bit trickier there, as I can’t get full coverage of the (small) apartment I rent. On the other hand, if I leave the phone on the coffee table from which I’m typing right now, I get full coverage in the kitchen, which is what makes it so useful at home for me: I can set a timer when cooking, and I have not burnt anything since I got the watch — yes, I’m terrible that way.

Before, I would have to either use Google Search to set an alarm on one of the computers, or use the phone to set it — the former tends to be easily forgotten and is annoying to stop when focusing on a different tab/window/computer, while the latter requires me to unlock the phone to set up the timer, and while Google Now on any screen should be working, it does not seem to stick for me. The watch can be enabled by a simple flick of the wrist, responds to voice commands mostly correctly (I still make the mistake of saying «set timer to 3 minutes», which gets interpreted as «set timer 23 minutes»), and is easy to stop (just palm it).

I also started using my phone to play Google Play Music on the Chromecast so I can control the playback from the phone itself — which is handy when I get a call or a delivery at the door, or whatever else. It does feel like living in the future if I can control whatever is playing over my audio system from a different room.

One thing that I needed to do, though, was replace the original plastic strap. The reason is very much personal, but I think it might be a useful suggestion to others to know that it is a very simple procedure — in my case I just walked into a jeweller’s and asked for a leather strap; half an hour later they had my watch almost ready to go, they just needed to measure my wrist to punch the right holes in it. Unlike the G Watch R – which honestly looks much better both in pictures and in real life, in my opinion much better than the Moto 360 too, as the latter appears too round to me – the original G Watch has a standard 22mm strap connector, which makes it trivial for a watch repair shop to replace.

With the new strap, the watch is almost weightless to me, partly because the leather is lighter than the plastic, partly because it does not stick to my hair and pull me every which way. Originally I wanted a metal strap, honestly, because that’s the kind of watches I used to wear — but the metal interferes with Bluetooth reception and that’s poor already as is on my phone. It also proves a challenge for charging as most metal straps are closed loops and the cradle needs to fit in the middle of it.

Speaking of reception, I have been cursing hard about the bad reception even at my apartment — this somehow stopped the other day, and only two things happened when it improved: I changed the strap and I kicked the Pear app — mostly because it was driving me crazy as it kept buzzing me that the phone was away and back while just staying in my pocket. Since I don’t think, although I can’t exclude, that the original strap was cause for the bad reception, I decided that I’m blaming the Pear app and not have it on my phone any more. With better connectivity, better battery life came, and the watch was able to reach one and a half full days which is pretty good for it.

I’m not sure if wearables are a good choice for the future — plenty of things in the past looked like they were here to stay and weren’t. This is by far not the first try at making a smart watch, of course; I remember the ones that would sync with a PC using video interference. We’ll see what it comes down to. For the moment I’m happy with the gift I received — but I’m not sure I would have bought it myself.

(Short) Book Review: Xamarin Mobile Application Development for Android

You probably read by now that I’ve been thinking of building either an Android application or a Chrome one as either a companion to or a replacement for the glucometer utilities which I’ve been writing in Python for the past few months.

Packt has been nice enough to let me review Xamarin Mobile Application Development for Android, and so I decided to take into consideration the option of actually building the app in C#, so that it can be shared across various platforms.

The book goes into the details of what Android applications can and should do and provides nice examples, mostly around a points-of-interest application. It’s hard to say much when I don’t have anything to complain about, so I’ll just say: give it a go, if you don’t plan to make your apps open source (which I think you should). As the book points out, being able to share your backend libraries (but not frontend/UI ones!) across operating systems and platforms (phone, tablet, computer) is a godsend, so I think Xamarin did build a very good tool for the job.

On the other hand, I’m definitely not going to pursue this — while C# is a language I like, and Xamarin for Android allows you to use JNI extensions such as the one Prolific releases for their USB-to-serial adapter, I find having the tool open source more important than any of this.

Diabetes control and its tech, take 6: multiple devices, multiple interfaces

In my previous post on the topic I complained about Abbott not being forthcoming with specifications for their devices. I’ve since then started looking into ways to sniff the protocol while keeping the device talking with the Windows software but I haven’t really had any time to proceed with the plan.

On the other hand, when I was providing another person a link to LifeScan’s full protocol specifications, they told me that my device is now completely obsolete, and they offered to send me a newer one. I filled in the form and I’m now waiting for the device itself. This means that I’ll probably soon be writing a second device driver for my glucometer utilities, for the OneTouch UltraMini. The protocol itself is quite different: this time it’s a fully binary protocol, where dates need not so much be parsed as formatted (they are reported as UNIX timestamps), although it loses a few features in the process, namely the comment/meal options. Not extremely important, but it’s a bit sad — not that I ever used those features besides for testing.
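Just to sketch what “formatting rather than parsing” means in practice, here is an illustration of decoding a little-endian 32-bit UNIX timestamp from a binary payload; this is not the actual UltraMini wire format, only the general idea:

```python
import struct
from datetime import datetime, timezone

# Hypothetical sketch: with a binary protocol, a date arrives as a
# little-endian 32-bit UNIX timestamp that only needs formatting.
def decode_timestamp(payload: bytes) -> datetime:
    (seconds,) = struct.unpack('<I', payload)
    return datetime.fromtimestamp(seconds, tz=timezone.utc)

# 0 seconds after the epoch:
assert decode_timestamp(struct.pack('<I', 0)).year == 1970
```

Compare that with a text protocol, where the same date would arrive as a string to be tokenised and validated field by field.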

While the new meters no longer allow you to provide a before/after-meal indication, the OneTouch software does have heuristics to provide that information. I’m not sure exactly how they do it, but I would venture a guess that they use the time of day and the relative timing of measurements. So if you have two measurements at, let’s say, 7 and 9 in the morning, it will count them as before and after breakfast, while a measurement at 11 is going to be considered as before lunch. One of the entries in my infinite TODO list is to actually implement time-based heuristics in my utilities.
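A minimal sketch of what such a heuristic could look like; the cutoff hours are my own guesses, not whatever LifeScan actually implements:

```python
from datetime import datetime

# Guessed time-of-day cutoffs for a meal heuristic; entirely illustrative.
def guess_meal(when: datetime) -> str:
    hour = when.hour
    if hour < 8:
        return 'before breakfast'
    if hour < 10:
        return 'after breakfast'
    if hour < 13:
        return 'before lunch'
    if hour < 16:
        return 'after lunch'
    if hour < 20:
        return 'before dinner'
    return 'after dinner'

assert guess_meal(datetime(2014, 1, 1, 7, 0)) == 'before breakfast'
assert guess_meal(datetime(2014, 1, 1, 9, 0)) == 'after breakfast'
assert guess_meal(datetime(2014, 1, 1, 11, 0)) == 'before lunch'
```

A smarter version would look at the spacing between consecutive readings rather than fixed cutoffs, which is probably closer to what the official software does.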

Now, while my utility works fine as a CLI tool, it does not really help to keep track of diabetes over time, and does not have even a fraction of the features of LifeScan’s or Abbott’s software. One of the things that I plan on working on is a way to store the data downloaded from the meter in a local database, such as SQLite, and then show it over time with a half-decent UI. This is going to be my target for next year at least.
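Something along these lines is what I have in mind for the storage layer: a rough sketch using only the standard library, with made-up table and column names.

```python
import sqlite3

# Rough sketch of local storage for downloaded readings; the schema
# (table and column names) is made up for illustration.
def store_readings(db_path, readings):
    """readings: iterable of (ISO timestamp string, value in mg/dL)."""
    with sqlite3.connect(db_path) as db:
        db.execute('CREATE TABLE IF NOT EXISTS readings '
                   '(timestamp TEXT PRIMARY KEY, value_mgdl REAL)')
        db.executemany('INSERT OR REPLACE INTO readings VALUES (?, ?)',
                       readings)

def readings_over_time(db_path):
    """Return all readings sorted by time, ready for charting."""
    with sqlite3.connect(db_path) as db:
        return db.execute('SELECT timestamp, value_mgdl FROM readings '
                          'ORDER BY timestamp').fetchall()
```

The primary key on the timestamp also makes repeated downloads idempotent: dumping the meter twice stores each reading only once.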

Also, in a previous post I noted how I’d like to have an Android app to download the data on the fly. I’ve not started working on that area at all; on the other hand, I was able to secure not one but two devices supporting USB OTG. Unfortunately, neither supports the PL2303 serial adapter that LifeScan uses in their cables. Since mobile devices are not my speciality, I’d love to hear from somebody who has a clue about them: does it make sense to write a userland, libusb-based implementation of the PL2303 protocol so as to use it over OTG, or would the time be better spent on devising a Bluetooth adapter?

On the Bluetooth adapter side, the LifeScan devices – at least the old ones but, as far as I could tell from the website, the new ones as well – use a minijack interface similar to the osmocom serial cable, but not compatible with it, so don’t waste your time with those cables. The OneTouch cables have Rx/Tx swapped, so that Rx is at the tip, and Tx at the sleeve. On the other hand, if I’m not mistaken, it should be possible to have a device (small or big is beside the point for development) that interfaces with the meter as a “dock” and provides a Bluetooth serial port that can work with a paired computer — what I have no idea about is whether a mobile device could use a Bluetooth connection as a serial port.

At any rate, these are my current musings. When I have some more details, and especially if somebody out there can give me suggestions on the direction to take, I’ll post more. And in the meantime, if you have a glucometer, a cable, and the protocol, you can write a driver for it and send me a pull request on the GitHub repository, and I’ll be happy to review and merge it!

Diabetes control and its tech, take 4: glucometer utilities

This is one of the posts I lost due to the blog problems with draft autosaving. Please bear with the possibly missing pieces that I might be forgetting.

In the previous post on the subject I pointed out that, thanks to a post on a forum, I was able to find out how to talk to the OneTouch Ultra 2 glucometers I have (the two of them) — the documentation assumes you’re using HyperTerminal on Windows and thus does not work when using either picocom or PySerial.

Since I had the documentation from LifeScan for the protocol, starting to write a utility to access the device was the obvious next step. I’ve published what I have right now in a GitHub repository, and I’m going to write a bit more about it today, after a month of procrastination and other tasks.

While writing the tool, I found another issue with the documentation: every single line returned by the glucometer ends with a four-digit (hex) checksum, but the documentation does not describe how the checksum is calculated. By comparing some strings with the checksums I knew, I originally guessed it might be what I found called “CRC16-Syck” — unfortunately that also meant that the only library implementing it was a GPL-3 one, which clashed with my idea of a loose copyleft license for the tools.

But after introducing checksum verification, I found out that the checksum does not really match. After more looking around on Google and in forums, I was told that the checksum is a 16-bit variation of Fletcher’s checksum, calculated in 32-bit but dropping the upper half… and indeed it would then match, but when looking at the code I found out that “32-bit Fletcher reduced to 16-bit” is actually “a modulo-16-bit sum of all the bytes”. It’s the most stupid and simple checksum.
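If my understanding from those forum posts is right, the whole verification boils down to something like this (a sketch, not the exact code from my repository):

```python
# A 16-bit "Fletcher-reduced" checksum that is really just the sum of
# all the bytes, truncated to 16 bits and rendered as four hex digits.
def line_checksum(line: bytes) -> str:
    return '%04X' % (sum(line) & 0xFFFF)

assert line_checksum(b'\x01\x02\x03') == '0006'
# Truncation only matters once the plain sum exceeds 16 bits:
assert line_checksum(b'\xff' * 300) == '2AD4'
```

Verifying a received line is then just recomputing this over the payload and comparing it with the trailing four hex digits.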

Interestingly enough, the newer glucometers from LifeScan use a completely different protocol: it’s binary-based and uses a standard CRC16 implementation.

I’ve been doing my best to design the utility so that there is a workable library as well as a command-line tool (so that a graphical interface can be built on top of it), and at the same time I tried making it possible to have multiple “drivers” that implement access to the glucometer commands. The idea is that this way, if somebody knows the protocol for other devices, they can implement support without rewriting, or worse duplicating, the tool. So if you own a glucometer and want to add support for it to my tool, feel free to fork the repository on GitHub and submit a merge request with the driver.
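The shape of the driver interface I have in mind is roughly this; the names are illustrative, not necessarily the ones in the repository:

```python
from abc import ABC, abstractmethod

# Illustrative sketch of a pluggable driver interface: the CLI (or a
# future GUI) only ever talks to this surface, never to the hardware.
class GlucometerDriver(ABC):
    @abstractmethod
    def get_serial_number(self) -> str:
        """Identify the connected meter."""

    @abstractmethod
    def get_readings(self):
        """Yield (timestamp, value) tuples from the meter's memory."""

class FakeDriver(GlucometerDriver):
    """A no-hardware driver, handy for testing the shared tooling."""
    def get_serial_number(self):
        return 'FAKE0001'
    def get_readings(self):
        return iter([])

assert FakeDriver().get_serial_number() == 'FAKE0001'
```

A new meter then only needs a module implementing this surface, and the rest of the tool picks it up unchanged.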

A final note I want to leave about possible Android support. I have been keeping in mind the option of writing an Android app to be able to dump the readings on the go. Hopefully it’s still possible to build Android apps for the market in Python, but I’m not sure about it. At the same time, there is a more important problem: even though I could connect my phone (Nexus 4) to the glucometer with a USB OTG cable and the cable LifeScan sent me, that cable has a PL2303 in it, and I doubt that most Android devices would support it anyway.

The other alternative I can think of is to find a userland implementation of PL2303 that lets me access it as a serial port without the need for a kernel driver. If somebody knows of any software already written to solve this problem, I’ll be happy to hear about it.

Diabetes control and its tech, take 2: panic buttons

So, as I said in my previous post, my diabetes problem is not really type 2. According to the specialist I went to see this week, it’s actually much more similar to type 1, but it’s neither, and is strictly related to my pancreatitis.

Anyway, the doctor put me on insulin (which I was actually expecting); this is no big deal, even though my mother started fretting as soon as she heard of it (why do I still keep her up to date? I really shouldn’t, at this point). I mean, there are kids out there managing their own insulin injections, why should I be worried about this? And it’s not like I’m scared of needles at this point.

But of course, they had to warn me about the dangers of hypoglycemia (also known as low blood sugar), and how to treat it. Since I live alone, they were even more concerned: in the (remote, given my blood sugars) chance I should have an episode of hypoglycemia, I have to rely on somebody actually checking on me. During the week it’s easy: I work at an office, so I asked my colleagues that if I don’t arrive and I’m not answering, they please check on me. But what about the weekends? What about bank holidays and vacations?

Well, I started looking for an Android app to do that, as that seemed to be the obvious solution to the problem. Unfortunately, I could not find anything that fits the bill as I intend it. The main problem is that most “dead man’s trigger” apps (some of which are named this way or variants thereof) are designed for a different situation: they are meant to get rid of the data on your phone if you either died or your phone got stolen or lost, which means they have features such as password protection and wiping, but they don’t have a “call with a pre-recorded message” feature, which would be much more useful to me.

Indeed, what I’m looking for is:

  • scheduler: I don’t want to have to ack the trigger during the night;
  • list of randomly-selected people to text or call;
  • ability for the person that has been called to ack/nack the request (so that somebody else can be contacted if, say, the contact is away from the city);
  • an escalation procedure so that if nobody can be reached (and I still haven’t answered the trigger, which should keep ringing), an absolute emergency contact can be used — in my case that would be my employer’s security office;
  • a way to provide a quick broadcast of the location of the phone, so that if I’m not home somebody can actually find me.
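To make the wishlist above concrete, the contact-selection and escalation logic I imagine is roughly the following; every name and detail here is made up for illustration:

```python
import random

# Sketch of the escalation idea: shuffle the normal contacts, keep the
# absolute emergency contact (e.g. an employer's security office) last.
def contact_order(contacts, emergency_contact):
    order = list(contacts)
    random.shuffle(order)
    return order + [emergency_contact]

def escalate(order, can_ack):
    """Walk the list until somebody acks; return who did, or None."""
    for person in order:
        if can_ack(person):
            return person
    return None
```

In the real app each `can_ack` step would be a text or a call with a pre-recorded message, with a timeout before moving on to the next person.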

If I don’t quickly find a solution to this I might as well just decide to write my own, but I’d rather avoid that since I have barely the time to live, lately.

Kindle Fire and Games

Yes, there goes another post writing about my flashed Kindle Fire. If you’re bored just skip it.

When I had Amazon’s operating system I tried quite a number of games, mostly “Free apps of the day” from Amazon’s appstore, or a few free (ad-supported) games — even though I did buy Rovio’s Amazing Alex as I liked the demo quite a bit. The only game that was really unplayable on the device was Jetpack Joyride (which is free). Even the Google Play version, with CyanogenMod, stutters enough that I don’t want to play it there, while on the other hand it works perfectly fine on my iPad and iPod Touch.

Since I haven’t even tried installing the Amazon App Store after flashing CyanogenMod on the device, I haven’t played Amazing Alex in a long time. On the other hand, I played Fieldrunners HD (link goes to Amazon), which I bought on Google Play instead, and had played on the Desire HD before. This worked like a charm (and if you like tower defense games, this is a terrific game, and you should give it a try!).

The first games I bought on the newly flashed Kindle Fire were Eve of Genesis and Dark Gate (latter link goes to Google Play), thanks to Caster’s suggestion. These are classic Japanese RPGs, likely re-made from older 8- and 16-bit systems for Android and iOS, exactly what I like for the few moments I spend playing on it. They play quite nicely, even if they do stutter sometimes as well.

But the problem starts with the most recent (at the time of writing) Humble Bundle with Android 5, which I bought in the hope of playing Dungeon Defenders on the tablet at least, since my Dell laptop does not play it smoothly on Windows, and my Zenbook has an HD4000 “videocard” with a bug that was not fixed yet, as far as I can tell. Ryan would know better.

Unfortunately, trying to get Dungeon Defenders to play on that tablet is a bad idea: the moment you have to bring up the input method to type your name, it crashes completely. Other games in the bundle are no better. Splice crashes just after loading, for instance, and so did Solar 2. While Crayon Physics works, it will complain that it doesn’t have enough memory if even a single other application is running, and it’s probably correct about that.

Among the games that work, Crayon Physics is definitely worth it — I’m going to try Sword & Sworcery EP and see if that one works as well. Dynamite Jack is not my cup of tea, but it works great (and you can tell it was well designed and written by how much faster it starts up than most apps).

Of course these are only some examples, but they show two main problems: the first is that it really is necessary to put requirements on software, and to try to spare as much memory as possible without making the application unusable, if you want to be compatible; the other is that if you want to create a gateway app, like Humble Bundle did, you need to make sure you check the requirements before allowing the user to install the games. In this case, the tablet is obviously not supported, as I flashed an experimental, unofficial ROM myself, but I’m pretty sure that most of the Chinese tablets that I’ll find at the local Mediaworld (Italian brand of Mediamarkt) will have even less memory than the Fire.

Oh well, hopefully I’ll soon be able to play these games on a real gaming PC, be it with Linux or Windows, thanks to Steam, and then it won’t matter that the Fire is not that powerful.

Browsers on the Kindle Fire

A few days ago I talked about Puffin Browser, with the intent of discussing in more detail the situation with browsers on the Kindle Fire tablet I’m currently using.

You might remember that at the end of last year, I decided to replace Amazon’s firmware with a CyanogenMod ROM so to get something useful on it. Beside the lack of access to Google Play, one of the problems I had with Amazon’s original firmware was that the browser that it comes with is flakey to the point of uselessness.

While Amazon’s AppStore does include many of the apps I needed or wanted – including SwiftKey Tablet, which is my favourite keyboard for Android – they made it impossible to install them on their own firmware. I’ve been tempted to install their AppStore on the flashed Kindle Fire and see if they would allow me to install the apps then; it would be quite a laugh.

Unfortunately, while the CM10 firmware actually allows me to make a very good use of the device, much more than I could ever have reached with the original firmware, the browsing experience still sucks big time. I’ve currently installed a number of browsers: Android’s stock browser – with its non-compliant requests – Google Chrome, Firefox, Opera and the aforementioned Puffin. There is no real winner on the lot.

The Android browser has a terrible network implementation and takes way too much time requesting and rendering pages. Google Chrome is terrible on the whole, probably because the Fire is too underpowered to run it properly, which makes it totally useless as an app. I only keep it around for testing purposes, until I get a better Android tablet.

Firefox has the best navigation support but every time I click on a field and SwiftKey has to be brought up, it takes a full minute. Whether this is a bug in SwiftKey or Firefox, I have no idea. If someone has an idea who to complain about it to, I’d love to report it and see it fixed.

The best option you get, beside Firefox, is Opera. While slightly slower than Firefox at rendering, it does not suffer from the SwiftKey bug. I’m honestly not sure at this point whether the version of Opera I’m using right now renders with their own Presto engine or with WebKit, which they announced they are moving to — if it’s the latter, it’s going to be a loss for me, I guess, since the two surely WebKit-based browsers are not behaving nicely for me here.

Now, from what I said about Puffin, you’d expect it to behave properly enough. Unfortunately, that is not the case. I don’t know if it’s a problem with my local bandwidth being too limited, but in general its responsiveness is worse than Opera’s, although not as bad as Chrome’s. The end result is that even the server-side rendering does not make it usable.

More reviews of software running on the Fire will follow, I suppose, unless I decide to get a newer tablet in the next weeks.

The revenge of the artificial regions

This is the third post in a series, as it happens — part 1 and part 2 are both available.

Let’s see how I’m currently set up — I’m still in Italy for less than 30 days; I have bank accounts in the US and in Italy with their associated cards, and I own four “mobile devices” — two tablets (an iPad and a Kindle Fire with CM10.1 so that it works), a cellphone running CM7, and an iPod Touch. The two iOS devices are associated with an American iTunes account (since that’s the only way I could buy and watch TV series in English), and thus get apps for the US region. The cellphone and the Kindle Fire are similarly associated with an account with a US billing address for a little while longer, but then it seems like the Play Store restrictions apply depending on the SIM currently in use in the cellphone. I then have one Italian SIM and one US SIM that I can switch between — the latter does not even associate with the network, because there is no roaming coverage on that contract.

This turned out quite interesting, as the Starbucks application is not available with an Italian SIM, and my (Italian) bank’s application is not available with a US SIM. And this is what I complained about earlier in the series.

Now I’m getting ready to move to Dublin. Among the things I’m looking into, I’ve got to understand the way the buses work… the Dublin Bus website sports a badge on the homepage saying that a mobile application (an App) is available on both Apple’s AppStore and on the Play Store. Unfortunately, the latter (which is the one I would care about) is not compatible with any of my devices. A similar situation happened with a cab company app that a friend suggested to me. Luckily, it seems like getting a SIM in Ireland is quick and easy, so then I should have access to these two apps — probably losing access to some of the Italian apps I have installed.

Can somebody tell me why applications like these are limited to regions, when they are very useful for tourists, and for preparation? Sigh!