
Tidbit: Writing A PWA For Link Tagging

After having given Google Analytics another try and given up on it (in terms of GDPR complexity, cookie law compliance, and the general uptake of ad blockers within my readership), I have more recently started running a “first party analytics” tool called Matomo. Since Matomo does not share the data with anyone but myself, this makes things a lot safer to run with decent compliance, and since I drop any identification of individuals, I can say with firm conviction that I respect my readers’ privacy.

But one of the important things I need out of an analytics tool is to know where people who read my blog come from. If you have written a blog yourself for any amount of time, you can probably imagine why I’m saying this: if you don’t know who reads your blog and how they reach it, you end up just talking to yourself, and while for some that’s okay, I find it frustrating.

“Back in my days,” as we say, this was easy: you check the Referer (sic) header of the requests, and you group by that. Both fortunately and unfortunately, this is no longer really an option, as browsers now default to sending, at most, the origin for cross-origin requests. I say “fortunately” because the way Referer headers were implemented in the past was really not a good privacy take, as they provided the full URL of the page your blog was linked from. When that was visible, I could sometimes tell not only which companies relied on some of my how-to information, but also the names of their internal projects that did. Just a little bit creepy.

Mastodon, in particular, defaults to providing no referrer whatsoever. This makes sense, to a point, as single- or few-user instances would be way too easy to identify, though at the same time it makes it very hard for a lot of professional or commercial actors to justify their investment in the fediverse — whether you find this a positive or a negative is totally up to you. To be clear, it would have been possible for (large) Mastodon instances to use a referrer-cloaking redirect rather than opting out of referrers altogether, but that would be a trade-off that lets the Mastodon provider themselves identify which users clicked on specific links, and the whole thing becomes a privacy nightmare anyway.

Cue campaign tagging — the much-dreaded (if misunderstood) mtm_campaign (or utm_campaign) parameters in URLs. When used appropriately, these URL tagging parameters are not intended to track a single individual, nor a specific source, as much as a whole “investment campaign.” I have used these before, and still do now, to account for how many people reach my blog through the old URLs that predate the current domain.

Once again, at least as far as my blog is concerned, this is not used in any way to connect your traffic with traffic on other websites, since the data never leaves my systems: it’s just a way to know how many people care to click on my blog’s links on Mastodon, versus Bluesky, versus Threads, and so on. And that’s mostly for me to know where to spend my time — would I get more “mindshare” for my ideas if I spent all the time posting on Threads, or is Mastodon the right place to find people who care?
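To make the idea concrete, this is roughly the kind of tagging involved. It is a sketch rather than the actual code of the tool, and the campaign and keyword values are made up, but mtm_campaign and mtm_kwd are, as far as I can tell, the parameters Matomo recognises out of the box:

```js
// Sketch only: append Matomo campaign parameters to a URL so that the same
// network always ends up with exactly the same tag. The values here
// ("blog-share", the network names, the example URL) are made up for illustration.
const NETWORKS = ["mastodon", "bluesky", "threads"];

function tagUrl(rawUrl, network) {
  const url = new URL(rawUrl);
  url.searchParams.set("mtm_campaign", "blog-share");
  url.searchParams.set("mtm_kwd", network);
  return url.toString();
}

for (const network of NETWORKS) {
  console.log(network, tagUrl("https://example.com/2023/some-post/", network));
}
```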

By the way, as should be obvious from another post, there is always some level of tracking of reads on a WordPress installation, since there doesn’t seem to be an option to disable Jetpack Stats (which are actually consolidated by Automattic, even though no identifiable data is provided to the blog’s admins through them) — except that even the “new” (now paid-for) Stats are pretty much useless, in part for not distinguishing Android apps correctly (everything is reported as “WordPress Android App” unless at least two different Android app names are reported) and in part because there is no campaign tracking at all. Which means there’s nothing but the referrer to go by when trying to attribute the source of traffic.

I could spend a lot more time rambling on about Matomo, but I prefer to wait a few more weeks, get some useful data out of it, and then present it as part of a review later on. Instead, I want to spend a moment on something I cobbled together to make my life easier. Since I often post on socials from my phone, it quickly got annoying to append the campaign tagging parameters by hand, particularly since getting them even slightly different makes your statistics useless.

At first, I nearly set out to learn how to build an Android app I could share a link to a post with, which would then append the parameters so I could copy the result, and I was looking at what options existed, including the idea of using .NET again. I was actually missing the days when you could cobble together the worst of UIs with Visual Basic in a couple of hours at school! Thankfully Gerrit Niezen saved me from that destiny by pointing out that you can share links to Progressive Web Apps (PWAs), which are effectively just slightly more involved websites that you can “install” on Android.

While the vast majority of the documentation around PWAs appears to be written and published by Google, it’s my understanding that these work the same across multiple vendors, including Mozilla and Apple (after all, PWAs mostly resemble the web-only applications that were part of the original iPhone’s selling point). And since at the end of the day I do have an Android phone, that sounded like a great idea.

To start with, there’s a document listing the requirements to be installable, which refers to the web manifest documentation (also available on MDN if you want a different opinion). This keeps things pretty easy to understand in my opinion: you provide some information as part of the manifest and then Bob’s your uncle. Except it turns out this document is not quite correct, although in a benign way at least: what it says is required («must include a 192px and a 512px icon») is not — both Edge and Chrome will happily install a PWA that reports icons at 192px, 500px, and vector. Possibly because the latter can scale to any of the required sizes.
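For reference, a manifest that passes the installability checks looks roughly like the sketch below; the icon paths, colours, and sizes are placeholders rather than what Link Tagger actually uses:

```json
{
  "name": "Link Tagger",
  "short_name": "LinkTagger",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#ffffff",
  "icons": [
    { "src": "/icons/icon-192.png", "sizes": "192x192", "type": "image/png" },
    { "src": "/icons/icon-512.png", "sizes": "512x512", "type": "image/png" },
    { "src": "/icons/icon.svg", "sizes": "any", "type": "image/svg+xml" }
  ]
}
```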

It also says «Be served over HTTPS», but that is also not quite correct! It should really read «Be served from a secure context». If you’re developing a PWA locally and serve it over localhost, that’s perfectly fine — and if you’re using Visual Studio Code to edit code running on a separate box or in WSL, and use its port forwarding to localhost, that works as well. On the other hand, this will not work if you’re using VSCode Web and its tunnels, since fetching the manifest will be explicitly forbidden.

The next important bit for me was to be able to use the tool offline — particularly because sometimes I might be editing a post while on the Underground, with a spotty but not completely absent connection. Thankfully, Google tries to have me covered with a Codelab for Offline PWAs! Unfortunately, instead of taking the route of the previous article and explaining what the requirements are and how they work, they decided to make this a “clone this repo, then copy-paste this code that we won’t explain in detail into the empty files” exercise.

To make things more annoying, they also decided to make the sample a full NodeJS web application, making it very hard to distinguish what is a requirement for running an offline PWA from what they “fluffed up” to make it look like a complex application. And I couldn’t get a lot of details even looking around MDN.

The short version is that what you need is to register a Service Worker, which effectively acts as a caching proxy — you give it a list of files to fetch before going offline, and when the browser later tries to fetch those resources, the Service Worker returns them from the offline cache instead! Of course, that’s not really well explained in the codelab, but you can get a lot more details in this other document that tries to sell you on a complex framework to manage said service workers, despite the fact that the document itself describes a handful of possible, meaningful options for you to follow, and not much more!
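As a minimal sketch of that idea (not the codelab’s code, and with a placeholder cache name and file list), the precaching part of a service worker can be as small as this:

```js
// service-worker.js: precache a fixed list of files at install time.
const CACHE_NAME = "link-tagger-v1";
const PRECACHE_URLS = ["/", "/index.html", "/app.js", "/style.css"];

self.addEventListener("install", (event) => {
  // Don't consider the worker installed until everything is in the cache.
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) => cache.addAll(PRECACHE_URLS))
  );
});

self.addEventListener("activate", (event) => {
  // Drop caches left behind by older versions of the worker.
  event.waitUntil(
    caches.keys().then((keys) =>
      Promise.all(keys.filter((key) => key !== CACHE_NAME).map((key) => caches.delete(key)))
    )
  );
});
```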

Worse yet, the Codelab brings you to a cache-only strategy — which means that once the files are fetched, short of the cache expiring altogether, the PWA will not get any new versions of the resources, including any JavaScript that tries to fix bugs in the code. Most likely you would want to take the sample “network first” service worker that they describe instead.
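A network-first fetch handler for the same worker could then look roughly like the following sketch, which is my own reading of the approach rather than what the codelab or Link Tagger actually ship:

```js
// service-worker.js (continued): try the network first, fall back to the cache.
// CACHE_NAME is the constant defined next to the precache list above.
self.addEventListener("fetch", (event) => {
  if (event.request.method !== "GET") {
    return; // only GET responses can be stored in the Cache API
  }
  event.respondWith(
    fetch(event.request)
      .then((response) => {
        // Keep a copy of the fresh response for the next time we are offline.
        const copy = response.clone();
        caches.open(CACHE_NAME).then((cache) => cache.put(event.request, copy));
        return response;
      })
      .catch(() => caches.match(event.request))
  );
});
```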

The other thing the codelab does not make clear at all is that the service worker file needs to be served from the root of your PWA! There is no useful indication to tell you that, but the “scope” of a service worker can only be the same path as, or a path under, that of the JavaScript file that implements it. If you don’t do that, when you take the app offline, you won’t get any cached data whatsoever.
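In practice that just means the page should register a worker that sits at the top of the app’s path, for example:

```js
// In the page's own JavaScript. Registering /js/service-worker.js instead would
// limit the worker's scope to /js/ and leave the rest of the app uncached.
if ("serviceWorker" in navigator) {
  navigator.serviceWorker.register("/service-worker.js");
}
```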

Worse yet, if you do have a PWA with a misconfigured service worker, the only thing that happens is that the application shows a splash screen provided by Chrome telling you that “You’re offline!” — while the JavaScript console complains about “crbug/1173575” and shows an unexpected null exception, both coming from the bowels of the Chrome-supplied page. Great work, folks!

This covered getting the PWA installed and running offline, but what about the ability to share links to it? Well, it turns out there is a Web API for that nowadays: the Web Share Target API (also documented on MDN). This seems very, very simple to use at first: you just need to provide a path to open, and the URL parameters to pass in when sharing to the target: url, text, and title. Since I was interested in sharing URLs, I thought url would be the one I needed — but that was a mistake, and it turns out that Android shares URLs (even coming from Chrome itself) only as part of the text parameter. Annoying.

(I’m only noting in passing that while Google’s own document says that enctype is not required when using the GET method, Chrome complains if this parameter is not set to application/x-www-form-urlencoded explicitly.)
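Putting those observations together, the share target declaration in the manifest ends up looking roughly like this sketch (the /share path is made up, not necessarily what Link Tagger uses), sitting alongside the members shown earlier:

```json
{
  "share_target": {
    "action": "/share",
    "method": "GET",
    "enctype": "application/x-www-form-urlencoded",
    "params": {
      "title": "title",
      "text": "text",
      "url": "url"
    }
  }
}
```

The page served at /share then has to check both the url and the text query parameters (for instance via URLSearchParams), since Android delivers the shared link in text.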

Unfortunately, when I tried to go and fix the JavaScript for Android’s silly behaviour, I ended up spending close to an hour trying to figure out why it didn’t seem to fix anything. At some point I also found a Stack Overflow answer insisting that you needed an assetlinks.json file, which is used to attach a website to an application on the Play Store, and nearly convinced myself that was the case, since every attempt did try to fetch that path!

But the answer was a lot more banal: even after uninstalling the PWA and clearing its data, as well as clearing the site data for the site in Chrome, the files were kept cached by Android. I’m not sure whether it was because of the aggressive Cache-Control header or because of the Service Worker, but the only way I found to make sure it really re-fetched the fixed JavaScript was to clear Chrome’s cached data.

(By the way, have you looked at your site data recently? I somehow ended up with 5GiB of site data from a Hong Kong tourist website that I visited while I was over there in August! It didn’t seem to expire!)

Finally, I needed a quick and dirty way to do what I needed: paste (or share) the URL I wanted to post, and get an easy way to copy the tagged URL for each of the social networks I’m likely to share said link on (Mastodon, Bluesky, Threads, etc.). At first I thought of using React (among other things because it comes from my employer and that felt… easy), but it turns out that even for an entirely client-side rendered project, the instructions get very complicated with NodeJS and various frameworks.

In the end, I opted for Bulma as a CSS framework to build a minimal UI with a bunch of text boxes, and started from their sample application to compile the JavaScript and SASS to static files. It works, although I’m still struggling to find a good answer for how to easily deploy this on my usual VPS. For now I have literally scp’d the files onto the host that runs Caddy.

If you are interested in how the whole thing looks once I jumped through the many hoops of badly connected documents, I have published the repository, and if you want to use the same Matomo tags I’m using for these, you can use or install Link Tagger. It’s literally a handful of static files being served.

Please note that I’m very clearly not a web developer, and this is the first time I have done anything with NodeJS that is more than running a single npm command, and even then I wouldn’t call running two compile commands “using NodeJS.” So if you believe there is a much better way to achieve what I’m doing, I’m all ears.

(On Mastodon I also posted a side rant about pre-commit and its developer. I’m going to skip that here, but as I said before, I’m not their biggest fan.)
