After The Streams — Conclusion From My Pawsome Players Experiment

A few weeks ago I announced my intention to take part in the Cats Protection fundraiser Pawsome Players. I followed through with seven daily streams on Twitch (which you can find archived on YouTube). I thought I would write at least a few words about the experience, drawing some conclusions about what worked, what didn’t, and what to expect in the future.

But before I drop into dissecting the streams, I wanted to thank those who joined me and donated. We reached £290 in donations for Cats Protection, which is no small feat. Thank you, all!

Motivations

There are two separate motivations to look at when talking about this. There’s my motivation for running a fundraiser for Cats Protection, and then the motivation for me doing streams at all, and those need to be separated right away.

As for the choice of charity – both my wife and I love cats and kittens; we’re childfree cat people. The week happened to culminate in my wife’s birthday, so in a way it was part of my present for her. In addition to that, I’m honestly scared for the kittens that were adopted at the beginning of the lockdown and might now be left abandoned as the lockdown eases.

While adopting a kitten is an awesome thing for humans to do, it is also a commitment. I am afraid for those who might not be able to take this commitment fully to heart, and might find themselves abandoning their furry family member once travel resumes and they are no longer stuck at home for months on end.

I also think that Cats Protection, like most (though not all) software non-profit organizations, is a perfectly reasonable charity to receive disposable funds. Not to diminish the importance and effort of fundraisers and donations to bigger, important causes, but it does raise my eyebrow when I see that the NHS needs charitable contributions to be funded — that’s a task I expect the government, which takes my tax money, to be looking after!

Then there’s the motivation for me doing livestreams at all — it’s not like I’m a particularly entertaining host, or that I have ever considered a career in entertainment. But 2020 was weird, particularly when changing employer, and it became significantly more important to be able to communicate, through a microphone, a camera and a screen, the kind of information I would usually have conveyed in a meeting room with a large whiteboard and a few colour markers. So I have started looking at ways to convey information that doesn’t otherwise fit the written form, because it’s either extemporaneous or requires visual feedback.

When I decided to try the first livestream I actually used a real whiteboard, and then I tried the same with Microsoft’s Whiteboard. I also considered the idea of going for a more complex video production by recording a presentation, but I was actually hoping for a more interactive session with Q&A and comments. Unfortunately, it looks like only a few people ever appeared in the chatrooms, and most of the time they were people I am already in contact with outside of the streams.

What I explicitly don’t care for, in these streams, is becoming a “professional” streamer. This might have been different many years ago — after all, this very blog was for a long time my main claim to fame, and I did a lot of work behind the scenes to make sure it would give people a positive impression, which also involved quite a bit of investment, not just in time but in money, too.

There are a number of things that I already know I would be doing differently if I were trying to make FLOSS development streaming a bigger part of my image — starting with either setting up or hiring a multistreaming service that would simulcast the same content onto more than just Twitch. Some of those would definitely be easier to pull off nowadays with a full-time job (cash in hand helps), but they would eat into my family life to a degree I no longer find acceptable.

I will probably do more livestreams in the upcoming months. I think there’s a lot of space for me to grow when it comes to providing information in a live stream. But why would I want to? Well, the reason is similar to the reason why this blog still exists: I have a lot of things to say — not just in the matter of reminding myself how to do things I want to do, but also a trove of experience collected largely by making mistakes and slamming head-first into walls repeatedly – and into rakes, many many rakes – which I enjoy sharing with the wider world.

Finally (and I know I said there were two motivations), there’s a subtlety: when working on something while streaming, I’m focusing on the task at hand. Since people are figuratively looking over my shoulder, I don’t get to check on chats (and Twitter, Facebook, NewsBlur), I don’t get to watch a YouTube video in the background and get distracted by something, and I don’t get to just look at shopping websites. Which means that I can get some open source hacking completed, at least timeboxed for the stream.

Tangible Results

Looking back at what I proposed I’d be doing, and what I really ended up doing, I can’t say I’m particularly happy with the results. It took me significantly longer to do some things that I expected would take no time whatsoever, and I didn’t end up doing any of the things I meant to do with my electronics project. But on the other hand, I did manage some results.

Besides the already noted £290 collected for Cats Protection (again, thank you all, and in particular Luke!), I fully completed the reverse engineering of the GlucoMen areo glucometer that I reviewed last week. I think about an hour of the streams was dedicated to me just poking around trying to figure out what checksum algorithm it used (answer: CRC-8/Maxim, as used in 1-Wire) — together with the other streams and some offline work, I would say it took about six hours to completely reverse engineer that meter into a usable glucometerutils driver, which is not a terrible result after all.
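For reference, CRC-8/Maxim (the Dallas 1-Wire CRC) is small enough to reimplement from memory once you recognize it; here is a minimal bitwise sketch in Python (the function name is mine, not necessarily what glucometerutils uses):

```python
def crc8_maxim(data: bytes) -> int:
    """CRC-8/Maxim (1-Wire): polynomial 0x31 reflected to 0x8C,
    initial value 0x00, no final XOR."""
    crc = 0x00
    for byte in data:
        crc ^= byte
        for _ in range(8):
            # Shift right; XOR in the reflected polynomial on carry-out.
            crc = (crc >> 1) ^ 0x8C if crc & 1 else crc >> 1
    return crc

# The catalogue check value for the "123456789" test string is 0xA1.
assert crc8_maxim(b"123456789") == 0xA1
```

Checking a candidate algorithm against captured packets like this is usually the fastest way to confirm a guess during a reverse engineering session.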

What about unpaper? I faffed around a bit to get the last few bits of Meson working — and then I took on a fight with Travis CI which resulted in me just replacing the whole thing with GitHub Actions (and incidentally correcting the Meson docs). I think this is also a good result, up to a point, but I need to spend more time before I make a new release that uses non-deprecated ffmpeg APIs — or hope that one of my former project-mates takes pity on me and helps.

Tests are there, but they are less than optimal. And I only scratched the surface of what could be integrated into Meson. I think that if I sat down in a chat with the folks who know the internals, I might be able to draw out some ideas that could help not just me but others… but it turns out that involves me spending time in chat rooms, and that’s not something that can be focused into a specific time slot each week. I guess that is one use case where mailing lists are still a good approach, although they are no longer that common after all. GitHub issues, pull requests and projects might be a better approach, but the signal-to-noise ratio is too low in many cases, particularly when half the comments are either pile-ons or “Hey, can you get to work on this?”. I don’t have a good answer for this.

The Home Assistant stream ended up being a total mess. Okay, in one half of it I managed to sync (and subsequently get merged) the pull requests to support bound CGG1 sensors in ESPHome. But when I tried to set up the custom component to be released, I realized two things: first, I have no idea how to make a Home Assistant custom component repository – there are a few guidelines if you plan to get your component into HACS (but I wasn’t planning to), and the rest of the docs suggest you may want to submit it for inclusion (which I cannot do, because it’s a web scraper!) – and second, the REUSE tool is broken on Windows, despite my best efforts last year to spread its usage.

The funny thing is that it appears to be broken because it started depending on python-debian, which quite reasonably didn’t expect to have to support non-Unix systems, and thus imported the pwd module unconditionally. The problem is already fixed in their upstream repository, but there hasn’t been a release of the package in four months, so the problem is still there.
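The fix is the usual portability pattern: guard the Unix-only import so the module still loads everywhere, and degrade gracefully where the lookup isn’t available. This is a hedged sketch of the pattern, not the actual python-debian patch (the function name is mine):

```python
try:
    import pwd  # Unix-only: access to the user database
except ImportError:  # e.g. on Windows, where pwd does not exist
    pwd = None


def owner_name(uid: int, fallback: str = "unknown") -> str:
    """Resolve a numeric UID to a login name, falling back cleanly
    on platforms where the pwd module is unavailable."""
    if pwd is None:
        return fallback
    try:
        return pwd.getpwuid(uid).pw_name
    except KeyError:  # UID not present in the user database
        return fallback
```

The key point is that the `import` failure is handled at module load time, so merely importing the package no longer crashes on Windows.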

So I guess the only thing that worked well enough throughout this is that I can reverse engineer devices in public. And I’m not particularly good at explaining that, but I guess it’s something I can keep doing. Unfortunately it’s getting harder to find devices that are not either already well covered, or otherwise resistant to the type of passive reverse engineering I’m an expert in. If you happen to have some that you think might be a worthy puzzle, I’m all ears.

Production and Support

I have not paid too much attention to production. Except for one thing: I got myself a decent microphone, because I heard my voice in one of the previous streams and I cringed. Having worked too many years in real-time audio and video streaming, I’m particular about things like that.

Prices of decent microphones, often referred to as “podcasting” microphones when you look around, skyrocketed during the first lockdown and don’t appear to have quite come down yet. You can find what I call “AliExpress special” USB microphones that look like fancy studio mics on Amazon at affordable prices, but they pretty much only look the part, without comparable specs — they might be just as tinny as your average webcam mic.

If you look at “good” known brands, you usually find them in two configurations: “ready to use” USB microphones, and XLR microphones — the latter being the choice of more “professional” environments, but not (usually) directly connectable to a computer… but there’s definitely a wide market of USB capture cards, and they are not that much more expensive when you add it all together. The best thing about the “discrete” setup (an XLR microphone plus a USB capture card/soundcard) is that you can replace the components separately, or even combine more of them at a lower cost.

In my case, I already owned a Subzero SZ-MIX06USB mixer with a USB connection. I bought it last year to be able to bring the sound from the ~two~ three computers in my office (Gamestation, dayjob workstation, and NUC) into the same set of speakers, and it comes with two XLR inputs. So, yes, it turned out that XLR was the better choice for me. The other nice thing about using a mixer here is that I can control some of the levels on the analog side — I have a personal dislike of overly low frequencies, so I have done a bit of tweaking of the capture to suit my own taste. I told you I’m weird when it comes to streaming.

Also, let me be clear: unless you’re doing it (semi-)professionally, I would say that investing more than £60 would be a terrible idea. I got the microphone not only to use for the livestreams, but also to take a few of my meetings (those that don’t go through the Portal), and I already had the mixer/capture card. And even then I was a bit annoyed by the general price situation.

It also would have helped immensely if I didn’t have an extremely squeaky chair. To be honest, now that I know it’s there, I find it unnerving. Unfortunately, just adding WD40 from below didn’t help — most of the videos and suggestions I found on how to handle the squeaks of this model (it’s an Ikea Markus chair — it’s very common) require unscrewing most of the body to get to the “gearbox” under the seat. I guess that’s going to be one of the tasks I need to handle soon — and it’s probably worth it given that this chair has already been through two moves!

So, hardware aside, how’s the situation with the software? Unfortunately, feng is no longer useful for this. As I was going through options last year, I ended up choosing Streamlabs OBS as the “it mostly works out of the box” option. Honestly, I should probably replace it with OBS Studio, since I’m not using any of Streamlabs’ features, and I might as well stick to the original source.

As I said above, I’m not planning to take on streaming as a professional image — if I were, I probably would have also invested in licensing some background music or an “opening theme”. And I probably would have set up the stream backgrounds differently — right now I’m just rotating background pictures I shot myself.

Conclusions

It was a neat experiment — but I don’t think I’ll do this again, at least not in this form.

Among other things, I think that doing a one-hour stream is sub-optimal — it takes so long to set up and remind people about the chat and donations that by the time I finished providing context, I was already a quarter of an hour in. I think two to three hours is a better length — I would probably go for three hours with breaks (which would have been easier during the Pawsome Players events, since I could have used the provided videos to take breaks).

Overall, I think that for this to work it needs a bigger, wider audience. If I were in the same professional space I was in ten years ago, with today’s situation, I would probably have all kinds of Patreon subscriptions, with the blog being syndicated on Planet Gentoo, and me actually involved in a project… then I think it would have made perfect sense. But given it’s “2021 me” moving in a “2021 world”… I doubt there are enough people out there who care about what goes through my mind.

Glucometer Review: GlucoMen Areo

Two weeks ago I reviewed a GlucoRx Q meter, and while I was doing that I ended up down a rabbithole that I did not expect. The GlucoRx Nexus and Q are both made by the same company, TaiDoc Technology – a Taiwanese company that manufactures and sells both personal medical devices, such as glucometers, and hospital equipment. They clearly manufacture “white label” meters, given that the Nexus (TD-4277) is available under a number of different brands and names — and in particular, while looking into that, I found it sold as the “GlucoMen Nexus”.

The reason that caught my eye is that it’s marketed by the Italian pharmaceutical company Menarini — and, being Italian, I knew the name. So when I added the information on the Q, I thought I would go and look at whether they also sold that under the GlucoMen brand — they didn’t, but I found another rathole to look into.

It turns out that the GlucoMen brand in the UK is managed by a subsidiary of Menarini called A. Menarini Diagnostics, and they sell not just your usual blood-based glucometers, but also a CGM system (though from what I can see it’s similar to the Dexcom I didn’t like). They also allowed me to order a free meter (the GlucoMen Areo that I’m going to review here), together with the free USB cable to use with it to download the data to a computer.

The fact that this meter required a separate cable hinted to me that it’s not just another TaiDoc rebrand — as I said in the previous post, TaiDoc is the first manufacturer I have found re-using their exact protocol across different devices, which suggested to me that any other modern model from them would also use the same; and since they are using a Silicon Labs 8051 platform with native USB, it sounded unlikely they would need a separate cable.

Indeed, when the meter arrived it was clear that it’s not a TaiDoc device — it’s very different and all the markings on it suggest that Menarini is manufacturing it themselves (though, also in Taiwan). And it’s… interesting.

The first thing I noted is that the carrying pouch was of significantly higher quality than I’m used to. No netting to hold stuff in, but instead a Velcro-held pouch, and enough space to hold their pricking pen as well. And on the inside, in addition to the company’s logo, space to write (or attach a label with) name, phone number, address and email. This is the first time in all my years with diabetes that I have seen such a “posh” pouch. Which turned out not to be entirely surprising once I noticed that the pouch carries the logo of the Italian accessories designer/manufacturer Tucano.

Looking at the meter itself, what quickly came to my attention is that this is the first non-Chinese meter I have found that (allegedly) has “touch free” ejection of the testing strips. The shape and construction of the device roughly remind me of the basic Tamagotchi from my early teens — to the point that when I push the button to eject the strip, I’m afraid I’m going to destroy the LCD panel on the front. Note that unlike the Chinese devices, which have a lever that physically pushes the strip out of the holder, on this meter the “Push” button only appears to “let go” of the strip, which you can then tip into the trash, but does not physically dislodge it at all.

The cable sent to me is another of the common 2.5mm TRS serial adapters — this one using a Silicon Labs CP2104-compatible chip on board (probably moulded into the USB plug). It plugs into the bottom of the device, in a fashion pretty much identical to the OneTouch Ultra2 devices. Not surprising, given that I think those were the most common in Italy back in the day.

In addition to the USB/serial connectivity, the device is meant to speak NFC to a phone. I have not figured out how that is supposed to work, to be honest. It seems to be meant mostly to integrate with their CGM solution, and since I don’t have one (and I’m not interested in testing it right now), I don’t expect I’ll figure that out any time soon. Also NFC snooping is not my cup of tea and I’ll gladly leave that to someone else who actually knows how to do that.

Aesthetics continue with the design of the testing strips, which are significantly larger than those of most other meters I have at hand right now (this is not always a bad thing — particularly for older people, larger strips are easier to use), with a very thin drop placement at the top and a white front. I’m not trying to play into the stereotype of the “Italian company that cares about style more than substance”, but I have seen enough glucometers by now to say that Menarini definitely had a designer go through this with an eye to fitting everything together.

In terms of usability, the device is pretty straightforward — the display is a standard LCD (so standard I’m sure I have seen the same exact segments before), very bright and easily readable. The amount of blood needed on the strip is actually much smaller than you would expect, though not as little as some other meters I have used in the past. This became very apparent during the last of my reverse engineering streams, when I lost three (or four) strips to “Err3” (too little blood), and it reminded me of how many strips I used to lose to not having drawn enough blood when I first started using a meter.

In terms of documentation and usability of the marker function, there’s something to say there, too. The device supports one (or none) of four markers: before meal, after meal, exercise and “check mark” — the check mark is documented in the manual as being pretty much a freeform option. The way you mark these is by pressing (not holding) the power button while you’re looking at the strip result — the manual says to hold the button until the marker flashes, but if you hold it for more than a split second it actually turns off the device, which is not what I expected.

In a strange turn of events, this is also the first meter I have seen using a fish (and fish bones) to represent before (and after) meal. Nearly everything else I have at hand uses an apple (and an apple core), since that’s fruit, and thus sugars. I don’t have an issue with the choice per se, but I can imagine it does confuse people at times.

The software is… painfully complex. It seems designed more for medical professionals than home use, which probably explains why the cable is not included by default. It also supports most GlucoMen devices, though it appears to install a long list of drivers for USB-to-serial adapters, which suggests each cable comes with a different serial adapter.

I have actually fully reverse engineered the protocol, live on stream, during my Cats Protection Pawsome Players week — you can see the live streams archived on YouTube, but I’ll also post (probably later on) a summary of my discoveries. It’s fully supported now in glucometerutils, though. The interesting part is that I found that the original software has a bug: it can only set the time some of the times, literally, because of the way it provides the checksum to the device. My implementation doesn’t suffer from that bug.

Service Announcement: Pawsome Players Streaming Week

You may remember I have been irregularly streaming some of my FLOSS work on Twitch, focusing almost exclusively on unpaper over the past few months. Now, unrelated to this, Cats Protection, the British non-profit whose primary goal is to rehome cats in need, launched their Pawsome Players initiative, aiming to raise funds through streaming — primarily video games, but not only.

With this in mind, I decided to join the fun, and will be streaming for the whole week at least an hour every day, and work on more FLOSS projects. You can find the full schedule (as well as donate to the campaign) on Tiltify, and if you want to get reminded of a particular night, you can join the events on the blog’s Facebook page.

In addition to wrapping up the Meson conversion of Unpaper, I’m planning to do a little bit of work on quite a few other projects:

  • I have a new glucometer I want to reverse engineer, and with that comes an opportunity to show my way of working through this type of task; I’m not as entertaining or as deep as Hector, but if you have never looked over the shoulder of a “black box” reverse engineer, I think it might be interesting. The code I’ll be working on is likely usbmon-tools rather than glucometerutils, but there’s a chance I’ll get so far ahead that I’ll actually implement the driver code.
  • Speaking of reverse engineering, I have a few adapters I designed (and got printed) for my Saleae Logic Pro 16. I have not released the schematics for those yet, but I now have the work approvals to do so. I should make a repository for them and release them — I’ll do that on stream!
  • I want to make some design changes to my Birch Books, which I’ll discuss on stream. It’s going to be a bit more PCB “designing” (I use quotes here because I’m far from a designer, and more of a “cobbler-together”), which is likely going to be scary for those who do know what they are doing.

I’m also open to the idea of doing some live code-reviews — I did lots of those when working at Google, and while for those I had a lot of specific standards to appeal to, a number of suggestions are nearly universal, and I have done this before, where I was pointed at some Python project and gave some general advice based on what I saw. I’d be willing to take a random project and see what I notice, if the author is willing!

Also, bonus points for anyone who guesses where the name of the fundraising page comes from.

So I hope I’ll hear from you in the stream chat (over on Twitch), and that you’ll join me in supporting Cats Protection to help find every kitty a forever home (my wife and I would love to be one of those homes, but it’s not easy when renting in London), and reach the £1985 target.

Video: unpaper with Meson — From DocBook to ReStructured Text

I’m breaking the post-on-Tuesday routine to share the YouTube-uploaded copy of the stream I had yesterday on Twitch. It’s the second part of the Unpaper conversion to Meson, which is basically me spending two hours faffing around with Meson and Sphinx to update how the manual page for Unpaper is generated.

I’m considering trying to keep up a bit of streaming every weekend, just to make sure I set aside some time to work on Free Software. If you think this is interesting, do let me know — it definitely helps with motivation to know I’m not just spending time that would otherwise be spent playing Fallout 76.

Service Announcement: Live Whiteboarding this Thursday

I’m breaking the usual post schedule to point out that this Thursday (2020-07-30) I’ve decided to attempt (again) an online whiteboarding session, this time with virtual whiteboard software instead of a physical one.

The plan is to have one hour of me rambling and ranting about some of my projects that are not formed enough to be blog posts (such as my electronics projects), but that I would love to share with the wider community sooner rather than later.

I plan on streaming this on Twitch, for the first time ever, since their Studio software appears to be the most straightforward way to stream a single window on the screen, and it should have a decent chat system as well. So if you’re interested in hearing some of my thought process, or you’re an ex colleague who misses my office rants, you’re welcome to join us there.

The set time is 8pm London time, and I plan to rant for around an hour. If there’s enough interest in this, I’ll try to make it a more regular thing.