Did Apple lose its advantage?

Long-time readers of my blog probably already know that I've been an Apple user over the years. What is less obvious is that I have scaled down my (personal) Apple usage over the past two years, mostly because of my habits, and partly because of Android and Linux getting better and better. One component, though, is that some of the advantages of using Apple started to disappear for me.

I think that, for me, the start of the problems can be traced to the release of iOS 7. Besides my distaste for the new flashy UI, I found that it did not perform as well as previous releases, and I think others have had the same experience. The biggest problem with it for me had to do with the way I started using my iPad while in Ireland. Since I now have access to a high-speed connection, I started streaming more content. In particular, thanks to my multiple trips to the USA over the past year, I got access to more video content on the iTunes store, so I wanted to watch some of the new TV series through it.

It turned out that for a few versions, and I mean a few months, iOS was keeping the streamed content in the cache, not accounting for it anywhere, and never cleaning it up. The result was that after streaming half a series, I would get errors telling me the iPad storage was full, but there was no way from the device itself to clear the cache. Either you had to do a factory reset, dropping all the content off the device, or you had to use a Windows application to remove the cache files manually. Not very nice.

Another very interesting problem with streaming content: it can be slow. Not always, but it can. One night I wanted to watch The LEGO Movie, since I did not see it at the cinema. It's not available on the Irish Netflix, so I decided to rent it off iTunes. It took the iPad four hours to download it. It made no sense. And no, the connection was not hogged by something else, and running a SpeedTest from the tablet itself showed it had all the network capacity it needed.

The iPad is not, though, the only Apple device I own; I also bought an iPod Touch back in LA when my Classic died, even though I was not really happy with downgrading from 80GB to 64GB. But it's mostly okay, as my main use for the iPod is to listen to audiobooks and podcasts when I sleep — which recently I have been doing through Creative D80 Bluetooth speakers, which are honestly not great but at least don't force me to wear earphones all night long.

I had no problem switching the iPod from one computer to the next before, as I moved from the iMac to a Windows disk on my laptop. But when I decided to just use iTunes on the one Windows desktop I keep around (mostly to play games), a few things stopped working as intended. It might have been related to me dropping the iTunes Match subscription, but I'm not sure about that. What happens is that only a single track from each of the albums was being copied onto the iPod, and nothing else.

I tried a factory reset; I tried cable and wireless sync; I tried deleting the iTunes data on my computer to force it to treat the iPod as new. The situation I've ended up in is only partially working: the audiobooks have been synced, but without cover art and without the playlists — some of the audiobooks I have are part of a series, or are split into multiple files if I bought them before Audible started providing single-file downloads. This is of course not very good when the audio only lasts three hours, and then I start having nightmares.

It does not help that I can't listen to my audiobooks with VLC for Android, because it thinks that the chapter art is a video stream, and thus pauses the stream as soon as I turn off the screen. I should probably write a separate rant about the lack of proper audiobook tools for Android. Audible has an app, but it does not allow you to sideload audiobooks (i.e. stuff I ripped from my original CDs, or that I bought on iTunes), nor does it allow you to build a playlist of books, say for all the books in a series.

As I write this, I have asked iTunes again to sync all the music to my iPod Touch as 128kbps AAC files (as otherwise it does not fit on the device); iTunes is now copying 624 files. I'm sure my collection contains more than 600 albums — and I would venture to say more than half of them I own on physical media, mostly because no store allows me to buy metal in FLAC or ALAC. And before somebody suggests Jamendo or other similar services: yes, great, I actually bought lots of jazz on Magnatune before it became a subscription service and I loved it, but that is not a replacement for mainstream content. Also, Magnatune has terrible security practices, don't use it.

Sorry Apple, but given these small-but-not-so-small issues with your software recently, I'm not going to buy any more devices from you. If either of the two devices I have fails, I'll just get someone to build me decent audiobook software one way or the other…

Again on mobile phone protection

After my previous post on the matter I found out, first of all, that iOS does support complex passwords; and second, I had an experience that strengthened my feeling that Apple's move toward TouchID is a good one.

So around 24 hours later, I guess the shock itself is wearing off, even though the scene is still extremely fuzzy in my mind.

Last night, after saying bye to [Andrea] and [Fabio], I was trying to get a cab on the South Side, and after two drivers called to say that they couldn't pick me up (so why on earth did you accept the ride on [HAILO]?), I decided to cross the river — on the pedestrian bridge next to O'Connell (the Ha'Penny).

Just before I finished crossing, some scumbag yanked my phone out of my hands (while I was calling another cab). I should have just left them the phone and called Security to have it locked and tracked down, but I got into fight-or-flight mode — and, as it turns out, fight mode in particular.

I ran after the guy, who was trying to cross Ormond Quay; thanks to him trying to avoid getting run over, I caught up and grabbed him by the chest. He dropped my phone, I'm not sure whether hoping I'd let go or because he struggled; his partner then punched me in the face and screamed for me to let him go, my glasses and my hat fell on the sidewalk, and the two guys ran away.

I picked up the glasses, put my hat back on and checked on the phone; it was ringing, it was the cab driver. One passer-by actually stopped to ask me if I was okay and if I got my phone back; I'm afraid I ended up being rude to him, but I was quite shocked. The cab driver was the most understanding: I walked away from him instead of toward him, but he caught up with me and got me safely back home. I probably should have reported this, but at the time I couldn't think, and now it would be useless.

Speaking with Security tonight, I realized how stupid it was of me to run after the guy; I should have just turned back and asked Andrea to call them to pick me up and track the phone. The thieves could have had a knife, a rock, or any other blunt object.

I got lucky, again… it's not the first time, and I sure hope it's not going to be the last (although I'd like not to need to be lucky). But I sure don't want to stray to the North Side too often.

There is no need to tell me I was totally stupid and irresponsible, I know that. On the other hand, I can say now that I’m happy Apple decided to address the phone theft problem in a non-obvious way.

No, TouchID is not better than a PIN. No, it does not resist even shallow targeted attacks. No, it does not protect you against police forces — why should it?

But it's more convenient than a PIN, and people who wouldn't even use a PIN (let alone a stronger password) because of the inconvenience are likely to consider using TouchID instead. And while, again, this will not protect you against self-indictment (again, why should it? — yes, if it wasn't clear enough, I usually trust the police more than your average paranoid does), the standard city thief wouldn't have much use for a locked phone, besides as parts.

As the news spreads that phones can't be unlocked, and their value on the black market goes down, the number of thefts will go down with it.

So instead of blaming Apple for not addressing your paranoid-geek concerns (concerns that were addressed a long time ago, and whose solution has not been invalidated), think about what they are really trying to solve.

Apple’s TouchID — A story of security, or convenience?

Everybody today seems to be either panicking or screaming murder at Apple because of the “revelation” by the CCC that TouchID – the new fingerprint-scanning technology in the iPhone 5S – is extremely easy to bypass. I find this both non-news and actually quite boring.

So first of all, what is this about? Well, basically it's possible to lift someone's fingerprint off a glass or some other object, then use it to reproduce a copy of the fingerprint, and use that to unlock the phone. I would argue that it's probably possible to lift the fingerprint off the phone itself, if you really want.

Why am I not excited by this method as if it were a new discovery? Simple: because MythBusters used the same idea back in 2006 to work around a fingerprint-based lock. And even at that time, it turned out that the fingerprint scanner in the lock, which was actual physical security, was less picky than the one in a USB device. Not surprising, as it looks like the lock only had an optical scanner.

Please don't get me wrong: CCC did the right thing. It's just that I don't think it's a new technique, as some people try to paint it.

So, if TouchID is this easy to bypass, is it a completely useless move from Apple? Or, as some paranoids seem to tell it, is it a deliberate move by Apple to make their users less secure, so that governmental agencies can more easily get data out of phones? Well, one thing is for sure: it's not a more secure method than the PIN lock that has been available up to now.

On the other hand, I'm not as quick as many to ascribe all of this to malice. Nor to incompetence. The problem is that the choice is not between PIN and TouchID — the choice is between PIN, TouchID and absolutely nothing, and a lot of people have decided for the latter, because of the hassle of typing in a 4-digit PIN every time you want to use the phone. Yes, I know, and most of you readers know, that an unlocked phone is an absolutely stupid idea, but most people use an iPhone because they want something that does not get in their way, as Android can easily do.

*I don’t use an iPhone, although I do have an iPad, which I use less and less, and an iPod Touch by which I swear. I need the flexibility of Android.*

Security-conscious people are unlikely to move away from the PIN – so their security is not going to be compromised, although I would have liked more than 4 digits – but people who were not using a PIN before, because it was too inconvenient, are more likely to use TouchID now. Which improves their general privacy.

A similar concept comes up if you look into password management: using a password manager/wallet is an option, but you still have to come up with passwords. What most people realistically do is always use the same password, because it's convenient. And extremely insecure.

On the other hand you have solutions like [SuperGenPass](http://supergenpass.com/), which generates passwords out of a master password and the domain name. This is the solution that a colleague of mine suggested to me and that I've been using for a while now. It's still not perfect security: if an attacker gets hold of a site's password hashes and can recover the generated password through rainbow tables, it's still possible to work back to the master password. It's much harder for the attacker in that case, though, since they need multiple rainbow tables. And that's supposing they can identify the SuperGenPass users at all.
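To make the idea concrete, here is a minimal sketch of the concept. To be clear, this is not SuperGenPass's actual algorithm (which, as far as I know, uses repeated MD5 rounds and a modified Base64 alphabet); it just shows the general shape of deriving a per-site password from a master password and a domain:

```python
import base64
import hashlib

def site_password(master, domain, length=10):
    """Derive a deterministic per-site password: the same master
    password and domain always produce the same output, so no
    per-site secret ever needs to be stored anywhere."""
    digest = hashlib.sha256((master + ":" + domain).encode("utf-8")).digest()
    return base64.urlsafe_b64encode(digest)[:length].decode("ascii")

# Same inputs, same password, on any machine:
print(site_password("correct horse battery", "example.com"))
```

The flip side is the one mentioned above: every generated password is a pure function of the master password, which is why recovering one of them is a stepping stone toward all of them.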

Here's what it boils down to: will TouchID make it inconvenient enough for street thieves to grab your iPhone on the go, compared to no PIN lock at all? Yes, most likely. Which basically means that its target was reached. Will it prevent sophisticated thievery, or more targeted attacks? No, but a 4-digit PIN is unlikely to be much better, as you have only so many combinations (ten thousand, to be precise).

Apple’s Biggest Screwup — My favourite contender

Many people have written about Apple's screwups in the past — recently, the iPad Mini seems to be the major focus for just about anybody, and I can see why. While I don't want to claim I know their worst secret, I'm definitely going to show you one contender that could actually be fit to be called Apple's Biggest Screwup: OS X Server.

Okay, those who read my blog daily already know this, because I blogged it last night:

OS X Server combines UNIX’s friendliness, with Windows’s remote management capabilities, Solaris’s hardware support, and AIX software availability.

So let’s try to dissect this.

UNIX friendliness. This can be argued both positively and negatively — we all know that UNIX in general is not very friendly (I'm using the trademarked name because OS X, built in part on FreeBSD, is actually certified UNIX), but it's friendlier to have a UNIX server than a Windows one. So if you want to argue it negatively, you don't have all the Windows-style point-and-click tools for every possible service. If you want to argue it positively, you're still running solid (to a point) software such as Apache, BIND, and so on.

Windows's remote management capabilities. This is an extremely interesting point. While, as I just said, OS X Server provides you with BIND as its DNS server, you're not supposed to edit the configuration files by hand, but to leave them to Apple's answer to the Microsoft Management Console — ServerAdmin. Unfortunately, doing so remotely is hard.

Yes, because even though it's supposed to be usable from a remote host, it requires that you're using the same version on both sides, and that is impractical if your server is running 10.6 and your only client at hand is updated to 10.8. So this option has to be dropped entirely in most cases — you don't want to keep updating your server to the latest OS, but you do so for your client, especially if you're doing development on said client. Whoops.

So can you launch it through an SSH session? Of course not. Despite all the people complaining about X11, the X protocol and SSH X11 forwarding are a godsend for remote management: if you have things like very old versions of libvirt and friends, or some other tool that can only be executed in a graphical environment, you only need another X server and an SSH client and you're done.

Okay, so what can you do? Well, the first option would be to do it locally on the box, but that's not possible, so the second best would be to use one of the many remote desktop techniques — OS X Server comes with Apple's Remote Desktop server by default. While this uses the standard VNC port 5900… it seems it does not work with a standard VNC client such as KRDC. You really need Apple's Remote Desktop Client, which is a paid-for proprietary app. Of course you can set up one of many third-party apps to connect to it, but if you didn't think about that when installing the server, you're basically stuck.

And I’m pretty sure that this does not limit itself to the DNS server, but Apache, and other servers, will probably have the same issues.

Solaris's hardware support. This one should be easy: if you ever tried to run Solaris on real hardware, rather than just virtualized – and even then… – you know that it's extremely picky. Last time I tried it, it wouldn't run on a system with SATA drives, to give you an idea.

What hardware can OS X Server run on? Obviously, only Apple hardware. If you're talking about a server, you have to remove all their laptops from the equation, obviously. If it's a local server you could use an iMac, but the problem I've got is that mine is used not locally but at a co-location. The XServe, which was the original host for OS X Server, is now gone forever, and that leaves us with only two choices: Mac Pro and Mac Mini. Which are the only ones sold with that version of OS X anyway.

The former hasn't been updated in quite a long time. It's quite bulky to put at a co-location, even though I've seen enough messy racks to know that somebody could actually think about bringing one there. The latter actually just recently got an update that makes it sort of interesting, by giving you a two-HDD option…

But you still get a system with 2.5”, 5400 RPM disks at most, with no RAID, that tells you to use external storage if you need anything different. And since this is a server edition, it comes with no mouse or keyboard; just adding those means adding another $120. Tell me again why anybody in their sane mind would use one of those for a server? And no, don't remind me — I could have an answer on the tip of my tongue.

For those who might object that you can fit two Mac Minis in 1U – you really can't: you need a tray, and you end up using 2U most of the time anyway – you can easily use something like SuperMicro's Twin, which fits two completely independent nodes in a single 1U chassis. And the price is not really different.

The model I linked is quoted, googling around, at about eighteen hundred dollars ($1800); add $400 for four 1TB hard disks (WD Caviar Black; that's their going price, and I've ordered eight of them since last April: four for Excelsior, four for work), and you get to $2200. Two Apple Mac Minis? $2234, plus the mouse and keyboard you'd need (the Twin system has IPMI support and remote KVM, so you don't need them there).

AIX's software availability. So yes, you can have MacPorts, or Gentoo Prefix, or Fink, or probably a number of other similar projects. The same is probably true for AIX. How much software is actually tested on OS X Server, though? Probably not much. While Gentoo Prefix and MacPorts cover most of the basic utilities you'd use on your UNIX workstation, I doubt you'll find the comprehensive software coverage that you currently find for Linux, and that's often enough a dealbreaker.

For example, I happen to have these two Apple servers (don't ask!). How do I monitor them? Neither Munin nor NRPE is easy to set up on OS X, so they are still unmonitored, and I'm not sure if I'll ever actually monitor them. I'd honestly replace them just for the sake of not having to deal with OS X Server anymore, but it's not my call.

I think Apple achieved quite a feat: making me think that our crappy HP servers are not the worst out there…

Archiving

One of the requests at the past VDD, when I showed some VLC developers my Ruby-Elf toolsuite, was for it to access archive files directly. This has been on my wishlist as well for a while, so I decided to start working on it. To be precise, I started writing the parser (and actually wrote almost all of it!) on the Eurostar that was bringing me to London from Paris.

Now, you have to understand that while the Wikipedia page is quite a good source of documentation for the format itself, it's not exactly complete (it doesn't really have to be). But it's mostly to the point: there are currently two main variants of ar files, the GNU and the BSD variants. And the only real difference between the two is in the way long filenames are handled. By long filenames I mean filenames at least 16 characters long.

The GNU format handles this with a single index file that provides the names for all the files, giving you an offset instead of the proper name in the header data, whereas the BSD format provides you with a length and prepends the filename to the actual file data. Which of the two options is best is well up for debate.
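To make the difference concrete, here is a minimal parser sketch handling both variants. It is in Python purely for illustration (Ruby-Elf itself is written in Ruby, and this is not its code), and it is deliberately simplified; the NUL-stripping on the BSD name is there exactly because of the padding quirk described below.

```python
def ar_members(f):
    """Yield (name, data) for each member of an ar archive, handling
    both the GNU and the BSD long-filename variants."""
    assert f.read(8) == b"!<arch>\n"        # global magic
    gnu_names = b""
    while True:
        header = f.read(60)                 # fixed-size member header
        if len(header) < 60:
            return
        name = header[0:16].rstrip()        # fields are space-padded ASCII
        size = int(header[48:58])           # decimal, space-padded
        data = f.read(size)
        if size % 2:                        # members are 2-byte aligned
            f.read(1)
        if name == b"/":                    # GNU symbol table, skip it here
            continue
        if name == b"//":                   # GNU: the long-name index itself
            gnu_names = data
            continue
        if name.startswith(b"#1/"):         # BSD: real name prepended to data
            namelen = int(name[3:])
            # Apple's ar pads the name with NULs, hence the rstrip.
            name, data = data[:namelen].rstrip(b"\0"), data[namelen:]
        elif name.startswith(b"/"):         # GNU: "/offset" into the index
            offset = int(name[1:])
            name = gnu_names[offset:gnu_names.index(b"\n", offset)].rstrip(b"/")
        else:
            name = name.rstrip(b"/")        # GNU terminates short names with /
        yield name.decode(), data

with open("libexample.a", "rb") as archive:  # "libexample.a" is a stand-in
    for name, data in ar_members(archive):
        print(name, len(data))
```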

I already knew of the difference, so I coded in support for both variants; but of course, while on the train I only had access to the GNU version of ar, which is present in Binutils, so I only wrote a test for that. Now that I'm back at the office (temporarily in Los Angeles, as it seems like I'll be moving to London soon enough), I have a Mac by my side, and I decided to prepare the files for testing with its ar(1), which is supposedly BSD.

I say supposedly because something strange happened! The long-filename code is hit by two of my testcases: one uses an actual object file which happens to have a name longer than 16 characters, the other an explicitly long (very long) filename — 86 characters! But what happens is that Apple's version of ar writes the filename as 88 characters, padding it with two final null bytes. At first I thought I got something wrong in the format, but if I use bsdtar on Linux, which among other formats supports the BSD ar format, it properly writes 86 bytes without any kind of null termination.

More interestingly, the other archive, where the filename is just 20 characters long, is written exactly the same way by both libarchive and Apple's ar!

Artificial Regions Redux

It's now been over two months since I landed in the US with the idea of doing my job, doing it well, and then considering moving here if the job was right. And two months ago I wrote about some stupid limitations of services based on where you were when you registered.

Now, even though I'm not settled here yet, I'm getting there: I have a bank account and a check card, and I have a billing address that I can use. So I finally got access to, for instance, Amazon's App Store, which is not enabled, even if you're paying for Amazon Prime, unless you set your primary form of payment to a credit card (and address) in the US.

This should be easy, shouldn't it? Not really. As it turns out, once I switched that around, Amazon stopped letting me buy Italian Kindle books… which sounds silly, given that they let me buy them before, and I haven't removed my Italian credit cards, just not set them as default! Furthermore, I'm not stopped from accessing the ones I already had.

The absurdities don't stop there, though. Since I now have a check card in the US, I moved my iTunes Store account over… this actually enabled a few more features, such as “iTunes in the Cloud” and the ability to re-download my purchased music as well as books and apps (the only two categories that can be re-downloaded in Italy), but on the other hand it threw off the previous purchases, showing all my purchased apps as not available. While I was neither expecting nor hoping that my previous music purchases would be available, I was pissed off by the fact that it asked me to purchase the software again, especially things like TeamViewer, which is quite expensive. Luckily, Apple's tech support solved the issue relatively quickly.

So then you move to the Android Market (sorry, Google Play), which actually enabled my access to the US software selection simply by popping in the AT&T SIM card… well, while they did enable access to the US software, they still thought it better to keep me off the Google Play Music store, as I was still registered in Italy. And while at it, when I actually purchased an app there… it ended up being charged in euros instead of dollars — this might sound strange, but it means you pay more for it, simply because the bank is going to ask you extra money for the currency exchange. Technically, the MII should tell them which currency the card is using by default, but instead of relying on that, they rely on your billing address… which they also don't validate against the bank (as Newegg instead does).

Oh well… at least things seem to be more or less sane by now: most of the Italian books I had in my Amazon wishlist are available through the publishers' group webshop, which also provides most of them without DRM. Looks like Amazon is making it much nicer for everybody to buy eBooks now. Not all of them, of course, but it's still a step in the right direction… and at the same time, I'm very happy buying them on the Kindle if I'm on the go, as I'm sure they are not going to kick me in the balls like Kobo did with The Salmon of Doubt (which I'm currently reading, after buying it again).

How you can tell you’re dealing with a bunch of fanboys

In my previous post, where I criticised Linus's choice of bumping the kernel's version to 3 without thinking through the kind of problems we, as distributors, would face with broken build systems that rely on the output of the uname command, I expected mixed reactions, but I thought they would mostly consist of technical arguments.

Turns out that the first comment was actually in support of the breakage for the sake of finding bugs, while another (the last at the time of writing) shows the presence of what, undeniably, is a fanboy. A Linux (or Linus) one at that, but still a fanboy. And yes, there are other kinds of fanboys besides Apple's. Of the two comments, the former is the one I actually respect.

So how do you spot fanboys of all trades? Well, first look for people who stick with one product, or one manufacturer. Be it Apple, Lenovo, Dell, or, in the case of software, Canonical, the Free Software Foundation, KDE or Linus himself, sticking with a single supplier without even opening up to the idea that others have done something good is an obvious sign of being a fanboy.

Now, it is true that I don't like having things from many different vendors, as they tend to work better together when they come from the same one, but that's not to say I can't tell what else is good from another vendor. For instance, after two Apple laptops and an iMac, I didn't have to stay with Apple… I decided to get a Dell, and that's what I'm using right now. Similarly, even though I liked Nokia's phones, my last two phones were a Motorola and, nowadays, an HTC.

Then notice whether they can accept flaws in the product or in decisions. This is indeed one of the most obnoxious behaviours of Apple fanboys, who tend to justify every choice of the company as something done right. Well, here is the catch: not all of them are! Part of this is underscored in the next trait, but it is important to understand that for a fanboy, even what would be a commercial failure, able to bring a company near bankruptcy, is a perfect move that was just misunderstood by the market.

Again, this is not limited to Apple fanboys; it shouldn't be difficult to identify a long list of Nokia fanboys who keep supporting their multi-headed workforce investment strategy of maintaining a number of parallel operating systems and classes of devices, in spite of a negative market response… and I'm talking about those who have nothing to gain directly from said strategy — of course I'm not expecting the people being laid off, or those whose tasks are to be reassigned from their favourite job, to be supportive of it.

But while they are so defensive of their love affair, fanboys also can’t see anything good in what their competitors do. And this is unfortunately way too common in the land of Free Software supporters: for them Sony is always evil, Microsoft never does anything good, Apple is only out to make crappy designs, and so on.

This is probably the most problematic trait: if you can't accept that the other manufacturers (or the other products) have some good sides to them, you will not consider their improvements either. This is why branding anybody who claims Apple did something good a fanboy is counterproductive: let's look at what they do right, even if it's not what we want (they are, after all, making decisions based on their general strategy, which is certainly different from the Free Software general strategy).

And finally, you're either with them or against them. Which is what the comment that spawned this discussion shows. You either accept their exact philosophy, or you're an enemy, just an enemy. In this case, all I did was suggest that Linus's decision was made without thinking of our (distributors') side, and I became an enemy who should go use some other project.

With all this on the table, can you avoid becoming a fanboy yourself? I'm always striving to make sure I avoid that; I'm afraid many people don't seem to accept that I do.

Automator and AppleScript: the Bad and the Ugly

As you can see, I didn't find any Good here. This is going to be a bit of a personal rant about Apple's technologies, so if you don't care about Apple at all, and you subscribe to the idea that you don't have to know what your competitors are up to, then I suggest you skip this post.

Situation — quite a long one, so feel free to skip.

If you follow my blog regularly, you probably remember the print shop for which I had to work around NATs and which pays me about a fifth of what I'm worth — mostly because I still consider them friends. A couple of months ago they moved from their original telco and ISP (FastWeb) to the ex-monopolist telco and ISP (Telecom Italia), with quite a bit of relief on my side, as that meant dropping the whole charade of IPv6 over miredo, and just using a DMZ to connect to the jumphost without caring about IPv6 dynamic hosts (which I actually had to resurrect for another customer, but let's leave it at that).

I had kept warning my customer that they'd lose their fax number if they switched telco; but when the marketeer who sold them on the move explicitly assured them (twice) that both the fax and the phone numbers would be recovered, they decided to ignore the fact that I've been working in their best interest for the past few years. Turns out that I knew better: the fax number was dropped by the original telco and lost in the transition. And faxes are (unfortunately) still a big deal in Italy.

Thankfully, I was ready: a friend of mine operates a virtual telco with VoIP phone and fax lines, and I knew he had a decent fax-to-mail service and fax-by-mail gateway. I referred them to him quickly, and a new fax line was easily established. Or maybe not so easily. While receiving faxes worked out of the box, sending them turned out to be a bit more complicated: only one PDF document at a time could be faxed, which was never an issue for me; but, more worrisome, the PDF documents produced by their (industrial) printer, a Xerox DocuPrint 700, caused the GhostScript used by the server for the conversion to crash.

Given that fixing this on the server side is non-trivial (the crash is reproducible, but it's a very convoluted situation that causes it to happen, and Ubuntu Server has no updates for the ghostscript package), I had to find an alternative route to solve the problem. Given that I'm a developer and scripter, the solution was obvious: I wanted to script the conversion of the files into something easier for GhostScript to digest, and while I was at it, simplify the whole procedure by just asking the secretary for the destination fax number and the files she wanted to send.

Introducing the Bad and the Ugly

Apple provides two main technologies/techniques to script their Mac environments: the “old-fashioned” AppleScript, which is a “usual” scripting language, with the difference of being very well integrated, having a relatively natural-language syntax, and being byte-compilable into applications; and, since Tiger, a shiny GUI-based scripting system called Automator.

I had already had a brief introduction to the former, but I needed something more to write the kind of script I wanted to provide to my customer, so I made good use of the Safari Books Online trial (I'm almost certainly going to subscribe, as it happens) and looked up a couple of titles on the topic: Learn AppleScript by Sanderson and Rosenthal, and Apple Automator with AppleScript by Myer. Then I dug into the topic for the whole afternoon.

Writing the first draft of the script was quite easy, but then I remembered that the only reason I have GhostScript available on my laptop is that I use Fink and other package managers there. The iMac used by the secretary wouldn't have those at all (and I don't intend to install them just for this); plus there is no free prebuilt distribution of GhostScript for the Mac, so I started looking at an alternative approach.

A quick search around tells you that while there isn't a directly scriptable application that can be used to merge and convert PDF documents, there is an Automator action to do so, so I started looking into that technology as well. The idea behind it is extremely cool: instead of writing code by syntax, it allows you to build logical trains of actions with inputs and outputs. The basis of what I needed was actually easily described with a simple train: get the user to choose some files; transform them all into PDF if they aren't already; merge them all into a single document; create a new email message with that file attached; and send it.

Of course it's not as straightforward as it might appear at first glance: the email needs to be fired at a particular address for the fax to be sent, and that address depends on the destination fax number; the merged PDF document needs to be stored in a temporary location and removed after the mail is properly sent; and so on. To make the tool more useful, Apple also provides support for both default and custom variables, all of it designed around drag'n'drop.

Since you can probably see by now that Automator at least appears cool, it is obviously not the Ugly of the story, but rather the Bad. While Automator allows you to combine a series of trains, by saving results in variables and then retrieving them, it does not allow multiple inputs to an action: those strictly need to be passed as variables. But not all fields of an action can be set with a variable! Interestingly enough, you cannot set the name of the merged PDF document through a variable, although you can set the path it is generated in that way. You also cannot choose the destination of an email message through a variable, but you can change the subject. Oh, and while you can choose which files to select, you have no way to restrict the selection to a given type of document. It's ludicrous!

So Automator could possibly be helpful for the few people who have an idea of how applications work (at a high level), but still have no clue about implementing one. I can't think of many people like that, to be honest. On the other hand, AppleScript is a powerful tool, which enabled me to complete the task I had to take care of quite nicely. Apple's design of applets and droplets, with the ability to combine the two, is one of the nicest things I have seen in a very long time. Unfortunately, it is just Ugly.

Don't get me wrong, it's not the syntax itself that is ugly; it actually reminds me enough of Ruby to be enjoyable to write. And the basic commands are easy to grasp even for somebody who's not a hardcore programmer. The problem is that, from a clean design that pre-dated OS X, the current situation is… messy. Files are accessed by default using the so-called HFS path, which looks something like Macintosh HD:Users:flame:something, even though the OS is now almost entirely written with Unix paths in mind; if you wish to call into the Unix toolset (which I had to, to combine the PDFs, as I'll explain), you have to convert this to the Unix-style path.

This by itself wouldn't be that much of a limitation; a bother, yes, but not a limitation as such. What makes it difficult to deal with is that even some of Apple's own applications expect paths to be passed Unix-style! This is the case, for instance, with Apple Mail 3.0, which made me waste two hours guessing why oh why I could ask Preview to open the file, but was still unable to attach it to the outgoing mail message.
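For the curious, the mapping between the two path styles is simple enough. Here is a minimal sketch of it in Python, assuming the file lives on the boot volume (AppleScript's own POSIX path of handles the general case for you):

```python
def hfs_to_posix(hfs_path):
    """Convert an HFS-style path (colon-separated, volume name first)
    to a POSIX-style one, assuming the file is on the boot volume."""
    volume, *parts = hfs_path.split(":")
    # The boot volume's name is dropped entirely; a full conversion
    # would have to map other volumes under /Volumes/<name> instead.
    return "/" + "/".join(parts)

print(hfs_to_posix("Macintosh HD:Users:flame:something"))
# -> /Users/flame/something
```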

The ugliest part concerns merging the PDF files; as I said, using GhostScript would have been easy, but having GhostScript installed isn't, so I avoided that solution. One obvious method would have been to ask Preview to merge the PDF files, given that it can do that. But Apple didn't make Preview scriptable by default, and I wished to avoid tinkering with settings I would most likely forget about when (not if) I deploy the script again somewhere else.

Turns out that the implementation of Automator's merge action boils down to a single Python script, which can easily be launched separately (Apple even provided usage documentation!). So in the end, the merging is done through that, even though it means I have to call into the Unix shell to complete the task, which takes a bit of time.
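For reference, this is roughly what launching it separately looks like, wrapped in Python here for consistency with my other examples. The path below is where the “Combine PDF Pages” action's helper script is commonly reported to live; treat both the path and the option as things to verify on your own system:

```python
import subprocess

# Where the "Combine PDF Pages" Automator action reportedly keeps its
# helper script; verify the path and options on your own OS X release.
JOIN = ("/System/Library/Automator/Combine PDF Pages.action"
        "/Contents/Resources/join.py")

def merge_pdfs(output, inputs):
    # -o names the output file; the input PDFs follow in order.
    subprocess.check_call(["/usr/bin/python", JOIN, "-o", output] + list(inputs))

merge_pdfs("/tmp/fax.pdf", ["invoice-page1.pdf", "invoice-page2.pdf"])
```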

All this together makes AppleScript definitely the Ugly. But at least it's something, I guess.

I remember KDE 3 having some support for this kind of thing through kdialog and DCOP; I wonder what the situation is now. As far as I can tell, even though D-Bus should provide even more complete interfacing to applications, on GNOME 2 it isn't as easy to integrate workflows without being a developer. I wonder how GNOME 3 is dealing with this.

AirPrint in Gentoo

I’m interrupting the series of posts about gold to post about something I was working on as part of a different job. More about gold will resume next week.

For one of my customers, I'm developing a private web application that would probably make most Free Software enthusiasts – not just the advocates – cringe. It is developed in ASP.NET, with a SQL Server backend, and is targeted at iPad users. While in general this looks like a bad combination for a Free Software developer, paid work is paid work, and even working on Windows has its upsides, namely learning about alternative approaches and what is good and bad about them, to make your own software better — to be honest, I'm not sure I dislike ASP.NET more than I disliked Hobo last year.

Putting the iPad to good use is also easy: I'm now reading even those O'Reilly books that were previously cumbersome to read, PDFs like CJKV Information Processing, which I bought last year and never got around to reading because it is not available in ePub format.

Let me say that, as a toy, the iPad is far from bad: Apple's hands-on approach can be criticized, but the results for a naïve, jaded user are near perfection. On the other hand, I don't think I'd trade my Milestone for an iPhone any time soon, even with all its troubles — the most bothersome issue is due to my CyanogenMod installation, where the “hold call” button gets pressed by my cheek while talking; it seems I'm not alone, and I'm now trying the method reported there to see if it helps.

While they really seem to have an app for everything, and their not-really-multitasking approach appears to work better than what I'm used to on Android, it is rough around the edges for the meddlers: I can't install additional certificates (I can on Android); I can't use an email address different from the GMail login (I can on Android); it doesn't sync contacts with Google Contacts on the fly (Android does); and so on, for a few more “minor” things that I just love in my Android.

But there is also one thing the iPad can do that my Android can't: printing. Of course it should be easy to do, especially given that Linux's printing system is the same CUPS used – and actually developed – by Apple, given they bought out the original developers! But that's not really the way they decided to do it. In a very environmentally unfriendly way, Apple decided that you can't just use any printer with their iPad: you need to buy a new one, as they will not provide the AirPrint interface for older models.

You can guess that people have already found out how to work around that. A long time ago I saved a post by Eric Sandeen on the topic, which in turn points to a GitHub project with a script that generates a service file for Avahi, which can then be used to advertise the printer properly.

But even with this, it is far from straightforward to deal with. The first problem I found is that the script needs the cups Python extension; this is not provided by net-print/cups[python] as I first expected – I have since updated the USE flags' description in the metadata.xml file: the interpreter flags are used for CUPS's own CGI scripting support, which is definitely non-obvious – but rather by the dev-python/pycups package.

The next problem is that, because of technical limitations on the size of the TXT records used in mDNS discovery – if you didn't look up the post and project listed above: the AirPrint protocol is actually just the usual IPP as provided by CUPS, wrapped through mDNS with a custom record exposing the printer's features – some of the printer's options are not exposed. One of those that is ignored is duplexing; but since my only real reason to print from the iPad is printing invoices, which I always print duplex – don't get me started on why I should be printing invoices at all – I wanted to expose it. Luckily, the generated Avahi service file is easy to fiddle with.
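For reference, the service file has roughly the following shape. This is a trimmed sketch rather than the generator's verbatim output: the queue name is a stand-in, the TXT records are the usual IPP/AirPrint ones, and the Duplex record is the hand-added tweak described above. Double-check everything against what the script actually emits for your printer.

```xml
<?xml version="1.0" standalone="no"?>
<!DOCTYPE service-group SYSTEM "avahi-service.dtd">
<service-group>
  <!-- %h expands to the host name when Avahi announces the service -->
  <name replace-wildcards="yes">AirPrint DocuPrint @ %h</name>
  <service>
    <type>_ipp._tcp</type>
    <subtype>_universal._sub._ipp._tcp</subtype>
    <port>631</port>
    <txt-record>txtvers=1</txt-record>
    <txt-record>qtotal=1</txt-record>
    <!-- rp must match the CUPS queue path; "DocuPrint" is a stand-in -->
    <txt-record>rp=printers/DocuPrint</txt-record>
    <txt-record>pdl=application/pdf,image/urf</txt-record>
    <!-- the hand-added record exposing duplex support -->
    <txt-record>Duplex=T</txt-record>
  </service>
</service-group>
```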

Next up, actually trying to print. Unfortunately, there is one thing that could be a bit difficult to guess: CUPS validates the Host header it is passed (IPP is based on HTTP/1.1). By default, if you don't fiddle with its configuration, what it looks for is the hostname set on the system; but what gets sent by the iPad is what it found through mDNS/Avahi, which is the base host name followed by the .local domain. In my case that meant CUPS was expecting deepspace9.home.flameeyes.eu and it got deepspace9.local. Since the CUPS configuration is heavily inspired by Apache's, just adding ServerName and ServerAlias should be enough… but it's worth noting that when accessing CUPS through a browser, especially with SSL enabled, it will send the port number as well, so you should alias both hostname and hostname:631.
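In cupsd.conf terms, that works out to something like the following; the hostnames are the ones from my setup above, so adjust them to your own:

```
# /etc/cups/cupsd.conf
ServerName deepspace9.home.flameeyes.eu
# Accept the mDNS name the iPad resolves, with and without the port.
ServerAlias deepspace9.local
ServerAlias deepspace9.local:631
```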

The takeaway of all this? Well, sometimes proprietary software solutions are not any better than Free Software ones — not that we have a good way to handle this either, but AirPrint shows fragile support even when using the same server software as the protocol's creator.

A year with my Reader

Okay, so it's not a full year in any sense – I bought it well over a year ago, and I only found out last April that it could be used with a modern technology like ePub – but if I have to remember one year as such, 2010 has been the year of the Reader for me. And not just for me, as it happens.

First of all, thanks to the Reader I was able to read a whole lot more than in past years; I'm not sure if it's just novelty that will wear off, but I'm pretty sure I wouldn't have brought with me so many books to read during travels as I did, mostly because of the small form factor (and the fact that it fits neatly into my usual bag). Anobii's statistics report that I read 31 books this year, ten thousand pages' worth of content — and that's to say nothing of their sheer variety compared to the past.

While I never limited my reading to a particular genre, the Reader, with its much cheaper ebooks, allowed me to choose among a wider range of books. Also, the convenience of getting a book right away is not something to ignore; I actually read through Cyber War mostly because I could get it right after hearing about it on Real Time with Bill Maher. Besides that particular book, I went on to read some classics like The Picture of Dorian Gray, which I never found the time to look up before, and economics/business books such as Too Big To Fail and Free, which actually interested me greatly.

Surprisingly, what I found most difficult to read on the Reader was the very reason I originally looked back at it: O'Reilly books. Since they are generated with DocBook, they have one big issue that makes them hard to read on these devices: they split too much. Take for instance Being Geek, which I'd like to read next, if I can find a way to do so without irks. In the PDF or print edition, there are page breaks only between “sections”, rather than chapters. Chapters, which are actually often just a couple of paragraphs long, are simply printed one after the other continuously; this is quite okay, because otherwise the padding added at the end of each would waste a lot of paper, and would turn a 200ish-page book into a 500ish-page one. As I said, DocBook ePub generation is imperfect in this regard, as it splits the output HTML files (yes, it uses HTML internally, let's move on) on chapter markers, which means that every three paragraphs I have to wait until the Reader fetches the next file to render it separately, slowing my reading down enough to make it difficult to continue.

Reading the PDF version of books on the Reader is also not the brightest idea; since the screen of the PRS-505 is relatively small, you can't read a PDF file full-size, and while the newest firmware allows you to zoom and reflow the text, this also turns out to be unusable as a way to read O'Reilly books, because the page-number markers are not ignored. It's even worse when complex diagrams are involved, as the Reader is pretty much useless for those — for such technical books, I probably wouldn't mind a tablet with a bigger screen; I've been considering the Archos 101, but I don't currently have the dough to afford one, and by the time I do, they'll probably be sold out already.

Speaking of tablets, once again I think that Apple, even though it can't really be praised for a tremendously good job with the iPad, had a primary role in making 2010 the year of eBooks, not only for me but for the market as a whole, together with Amazon — the latter finally launched the Kindle in Europe (and once again, it's not something I'd buy, considering Amazon's “it's all ours!” approach). With those two companies driving consumer attention (even though rarely consumers themselves) toward eBooks, I was somewhat curious to see the Italian branches of Mediamarkt and Saturn starting to carry eInk devices in-store, especially since I knew there was no real pool of Italian eBooks for the customers buying the devices to draw from.

Turns out that while Amazon entered the Italian market, IBS (which has many times been considered the Italian answer to Amazon) and Mediamarkt itself opened eBook stores, carrying Italian content in ePub format (mostly locked with Adobe Digital Editions DRM). I'm happy to note that while the back catalogue is still not available, at least they carry both the big take-it-all publisher Mondadori and the smaller Chiarelettere with its mostly political books — especially nice, since the books from the latter that I bought before were both pretty expensive and physically quite huge.

At any rate, the bottom line for me is that the Reader now looks like a pretty good buy, more than it ever did at the time. But please, make it possible to skip wireless, 3G, Bluetooth, touchscreen… the two weeks of charge that the PRS-505 both promises and delivers makes all of those look like waste, especially since I only end up loading new stuff onto it once every two weeks, which is also when I end up charging its battery.