I’m moving!

Okay, so last time I wrote about my personal status I noted that I had something in the balance: a new job. Now that I’ve signed the contract, I can say that I do have a new job.

This means, among other things, that I’ll finally be leaving Italy. My new home is going to be Dublin, Ireland. At the time of writing I’m still fretting about stuff I need to finish in Italy: digitizing as many documents as possible so that my mother can search through them easily and I can reach them if needed, contacting my doctor for a full blood panel, and getting the accountant to straighten out all my taxes.

What does this mean for my Gentoo involvement? Probably quite a bit. My new job does not involve Gentoo, which means I won’t be maintaining it on paid time any longer like I used to. You can also probably guess that, with the stress of actually having a house to take care of, I’ll end up with much less time than I have now, which means I’ll have to scale down my involvement considerably. My GSoC project might very well be the height of my involvement from now until the end of the year.

On the personal side of things, while I’m elated to leave Italy, especially given the current political climate, I’m also quite a bit scared. I know next to nobody (Enrico excluded) in Dublin, and I know very little of Irish traditions as well. I’ve spent the past week or so reading the Irish Times just to catch a glimpse of what is being discussed up there, but I’m pretty sure that’s not going to be enough.

I’m also scared because this will be the first time I actually live alone and have to take care of everything by myself, even though, given the situation, it feels like I might be quite a bit luckier than most of my peers here in Clownland Italy. I have no idea what will actually sap away my time, although I’m pretty sure that if it turns out to be cleaning, I’ll just pay somebody to do it for me.

We’ll see what the future brings, I suppose!

And you write a streaming server?

One of the things I have to curse my current job for is having to deal with Adobe Flash, and in particular with the RTMP protocol. The RTMP implementation we’re using on our server is provided by the Red5 project — and they are the ones I’m going to write about now.

Last July I spent days and days looking up documentation about Red5 itself, as we couldn’t reach our resident expert; at the time, their whole website was unavailable, just timing out. Yesterday they told me this was caused by some kind of DDoS, but even if that’s the case, something doesn’t feel right. Especially because, when I came back from VDD12 at the beginning of September, the website was actually reachable, but with the default configuration of a CentOS 5 system, which makes me think of a hardware failure rather than a DDoS.

Right now the website is available, but the Trac that should host the documentation is unreachable; a different website (Update (2016-07-29): that website is gone, sigh!) still has some documentation, but it hasn’t been updated in over two years, for the most part. There is also a company behind the project which, on its team page, lists their dogs, among others. Much as I appreciate companies with a funny side, it’s not funny when the project looks almost entirely dead.

But why am I complaining here? Well, what I gathered from the #red5 channel is that they blame the situation on a DDoS against their website, and on the fact that every time they try to put the wiki back online it goes down again. Uhm, okay…

Now, there are simple ways to handle a DDoS fairly decently that don’t require spending two months changing your setup… and in general it seems very flimsy that this kind of DDoS keeps going after two months and you still can’t get your documentation up. Besides, all your user and admin documentation (i.e. anything that is not developer-oriented) is only available on said wiki? Really?

So here I am, trying to figure out what to do with this hot potato of an install, with server software that is, simply put, completely unreliable (software is as reliable and trustworthy as the people who write it; that’s why you often see what look like “ad hominem” attacks against particular authors’ software — it’s not a fallacy, because you have to trust the author if you run the software). I’m honestly not amused.

Rubygems… UNHACKED!

I wrote yesterday about the difficulty of removing the Rubygems hacks we’ve been using — well, today I have good news: I was able to mostly remove them. I say mostly because there are still a couple of things that need to be fixed with the help of upstream (all my changes are available on my Rubygems fork, in the gentoo branch):

  • there is no way in the original sources to change the location of the system configuration file; on Windows it’s found in the registry, for everyone else it’s /etc; I had to change the sources to allow overriding that with the operating system’s defaults;
  • there is one test that only works if no alternate defaults are installed, as it checks that the gems’ binary directory is the same as Ruby’s; that is no longer the case for us;
  • JRuby … pretty much fails a whole bunch of tests; some failures are not really its fault — for instance, it lacks mkmf/extconf since there are no C-compiled extensions; others are caused by different problems, such as test ordering or the huge difference in threading behaviour between the original implementation and JRuby;
  • I had to patch the Rakefile a bit so that it can be used without RubyForge support, which was a requirement for Ruby Enterprise (well, at least for my system; the problem is that it still does not build with OpenSSL 1.0, so I have a non-OpenSSL Ruby Enterprise install… which means RubyForge can’t load, and neither can the RubyForge plugin for Hoe);
  • documentation fails to build when using the rake docs command; I’m not sure why or how, but it does; I’ll have to check out why and see if I can get it fixed.

But now back to the important good news: you can now safely use the gem command as root! Gems installed with sudo gem install foo will go to /usr/local rather than directly to /usr, no longer clashing with the Portage-installed packages (which are the officially supported ones). Obviously, both locations are searched, but the local variant takes precedence, and the user’s home gems get even higher priority in the search. This also means that if you don’t care about Gentoo support for Ruby extensions, you can simply use gem and be done with it.
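The precedence described above can be sketched in a few lines of Ruby — a hypothetical illustration of the lookup order, not the actual Rubygems code, with made-up directory names:

```ruby
require "tmpdir"
require "fileutils"

# First directory in the list that contains the gem wins; the real search
# order is: user home gems, then /usr/local, then the Portage-managed /usr.
def resolve(gem_name, dirs)
  dirs.find { |d| File.directory?(File.join(d, "gems", gem_name)) }
end

Dir.mktmpdir do |root|
  dirs = %w[home local system].map { |n| File.join(root, n) }
  # pretend "foo" was installed both with sudo gem (local) and Portage (system)
  FileUtils.mkdir_p(File.join(dirs[1], "gems", "foo"))
  FileUtils.mkdir_p(File.join(dirs[2], "gems", "foo"))
  puts resolve("foo", dirs) == dirs[1]  # => true: the local copy shadows the system one
end
```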

Now, moving a bit back to not-too-good news and next steps. The not-too-good news is that, since this dropped all the previously-present hacks, including a few needed to run gem from within Portage, this new version is not compatible with the old gems.eclass. Well, it’s only relatively bad news, and partially good news: this is one of the final nails in that eclass’s coffin; we now have a very good reason to get rid of all the remaining packages, soon.

*Note: I have said before that we still lack a way to properly build bindings that are part of bigger packages; luckily none of those require gems.eclass — they use the old ruby.eclass instead, which does not require Rubygems at all. So it’s fine to deprecate that and get rid of its uses right now, even though we might have more work to do before ruby-ng takes over everything.*

My next step for what concerns the rubygems package itself, if it were up to me, would be to drop the four wrappers gem18, gem19, gemee18 and jgem: all the ruby-fakegem.eclass binary wrappers right now install a single script, which by default uses the currently-configured Ruby interpreter, and you simply use ruby19 -S command to start it with a different interpreter. But gem itself is not treated this way; we rather have four copies of it with different names and different shebangs, which is a waste. To change this, among other things, we need to change the eselect-ruby module (which I would sincerely avoid touching if possible).

Further step: supporting multiple Ruby implementations with the current system is mostly easy… but we have no way to change the current interpreter on a per-user or per-session basis. This is something that, as far as I can tell, we could take out of the Python book (or maybe the Java book — both have something like that), but the Python failures teach us one very important thing: we cannot use a script, we have to use a binary to do that job, even a very stupid one, as older Linux and other non-Linux systems will fail if you chain interpreters. Again, an eselect-ruby task… I would probably wait for Alex, unless there is enough interest for me to work on it.

Then let’s see what’s left:

  • fully migrate out of the old-style virtual/ruby dependencies into =dev-lang/ruby-1.8*, so that we can implement a new-style virtual that proxies the ssl USE flag, and then start using that;
  • implement a general way to handle one-out-of-multiple target handling for packages that use Ruby as an embedded interpreter rather than building libraries for it;
  • find a way to build bindings within larger packages (or move some of them to build the Ruby extensions by themselves — I remember obexftp could use something like that);
  • fix the fake-specification generation so that it works with Bundler;
  • deal with some (corner?) cases where Ruby 1.9 complains about missing fields in our generated files.

So as you can see there is still a long way to go, but we’re getting there, step by step.

And before I leave, I’d like to thank Zeno Davatz for funding the un-hacking of Rubygems — if that weren’t the case, I’d probably have had to post this next week, or next month. And if there are other people wanting Ruby 1.9 available faster (or improved 1.8 support), I’m still available for hire; you can contact me with an idea of what you’re looking for. I’m available to implement particular features, to port the web of dependencies of given gems, or even just on a retainer basis, so that you can have “priority” support for what concerns general extension packaging in Gentoo.

A personal experience on why the FLOSS movement cripples itself

I have written recently about my standing on anti-corporate feelings, and a longer time ago against ‘pirate’ software, but today I feel like ranting a bit about the way the FLOSS people who still call proprietary software “illegitimate” are hurting the whole cause.

It so happens that a situation like the one I’m going to describe has happened to me with more than a couple of prospective clients. With one variation or another, the basic situation is more or less the same.

I get called up by the prospective customer, who is looking for some kind of software solution, or a mixed software-hardware solution. They present me their needs, and after thinking about it a bit, I find there are two options: use Free Software, which usually requires fiddling with set-up, tweaking, and maintenance; or use a proprietary solution, with a high license cost but smaller set-up and maintenance requirements.

I usually present the pricing together with a pros/cons fact sheet, pointing out that any proprietary solution will rely solely and exclusively on the original vendor, and thus the first time that vendor does something that goes against your wishes or necessities, you’re left paying for something you can’t make good use of. While this point is usually not easily forgotten, they are scared off by the price.

I do my best to provide options that are cheaper than the license of the proprietary software, so that there is a better chance for the Free alternative to be picked. It’s not difficult, given that most of the problems I’ve been shown are solved by proprietary software that is very expensive. Also, it is in my personal interest to have them choose the Free Software solution: I get the money, and I can usually release at least the fixes (or even better, the customisation) as Free Software, thanks to licenses such as the GPL.

But here, most of the hopes get shattered: “You call it Free but we have to pay quite a bit of money for it… we can get the other one cheaper, just use eMule”. At least here in Italy, honesty is a rare virtue — too rare a virtue for Free Software to “exploit”. But why do I say that it’s a mistake of FLOSS developers if this is the case? Isn’t it just the doing of a business owner who cares nothing about legality and uses unauthorized copies? Well, yes, of course.

On the other hand, talking about the illegitimacy and immorality of proprietary software, and defending “piracy” (or unauthorized copies, if you wish to call them that), does not really help the cause; it actually hands them arguments such as “well, even the guys developing that stuff defend using cracked copies of software, so why should I pay you to create something anew when the program already exists?”.

As I said before, make sure the people around you understand why they should use Free Software — and that is not done by telling them how bad copy-protection and DRM are, or about the “sins” of Windows. It’s done by showing them that they pay a price to use that software, both in direct monetary terms and in flexibility. And maybe more money would then flow into the pockets of the Free Software developers who can make it not suck in the areas where it currently sucks.

Making money with Free Software development

This post will talk about financial, real-life problems. If you don’t feel like knowing about my personal financial and work situation, feel free to skip it. I say this because some people have complained before that I make it known too often that I don’t get paid for most of my Gentoo-related work. I’m not sure — maybe they feel uneasy because of that. At any rate, it does not condition my contributions, most of the time.

While I like working on Free Software and think it’s great, if I were to do it just as a pastime like I used to five or six years ago, I wouldn’t get very far. Time passed, I grew older, and life got complicated. I don’t have rent to pay (for now — fortunately), but I’ve been living alone with my (unemployed) mother since last autumn. Electricity, connectivity, water, LPG, taxes… at the end of the day, I no longer have the liberty of turning down many offers. (On the other hand, I had to turn down a few because I was already quite fully booked.)

But at least I try my best to develop Free Software while I’m working: thanks to the fact that there is sometimes room to choose licenses, I can even write new Free Software while doing my job. Although this often requires some trickery that is not at all obvious when you start the kind of freelancing/consulting work I’ve been doing for the past two or three years.

The first obvious way to make money during development is improving the Free Software that is used as infrastructure by the software you’re writing. This might become apparent in spite of various NDAs you might have signed, so it has to be vetted carefully, but in general it’s the kind of information that you know you’ll end up showing the world anyway. Anybody following my commits to the tree lately can probably tell something’s up, with me picking up two tied packages in areas that I don’t usually follow that much. And from that you can see that there’s a better pam_pgsql out there, which was needed for my task to go on.

Similarly, whenever I used Gentoo for something work-related, I ended up trying to get all the needed modifications and improvements into the tree; gdbserver was one of the packages I needed to complete a past project, and it’s now in the tree for everyone to use. It’s not a bad thing. In the same vein, you can count on the presence of Rudy in the tree, and thus the fixes for its dependencies that I keep sending upstream.

But there is a subtler way to work on Free Software while doing paid work, and that is to divide the problem you’re working on into separate pieces; everything that concerns business logic will most likely have to be kept proprietary, at least with most customers, but the smaller, general problems that don’t include any business logic may be left out of the secrecy. I actually find this a generally good approach for many reasons: you may get code improvements if someone else needs to solve similar problems and decides to use your codebase; or you might get better security reviews (let’s be honest: security issues will appear in your software whether it’s open- or closed-source, but if it’s open, they are more likely to surface sooner; and sooner might mean your deployment is still limited enough that you can cover your bases with upgrades without spending a fortune, or bothering your users).

Of course, to be able to split your problem into open-source subprojects, you also have to make it very clear that your customer’s business logic is not at risk. This means you have to carefully choose the license to use (and in these cases, the GPL is rarely an option, unless you work at arm’s length), and at the same time you have to make sure the terms are clear to the customer. Telling them “I used this, so you should follow these general terms” is rarely going to help you convince them to use more Free Software when possible, and to contribute back to it; it’s actually going to make them wish not to respect the license and just not do anything. Clear statements about what they have to publish, for the GPL and similar licenses, are the best way to accomplish that.

And there is another thing you have to learn, and especially teach your customers: the advantage of upstreaming changes. This is something generally well understood by distribution developers, packagers, and all those developers using others’ code without developing it directly, but most of the time corporate, proprietary-oriented customers won’t understand its importance. If you have to customise some software to better suit your needs, whether or not you’re obligated to publish your changes should really have little bearing on whether you send them upstream. Even if the license is very permissive and doesn’t require you to publish changes, you really should send them upstream. Why? Because sooner or later you’re going to have to update your software.

Even though a software update is not that welcome a task — obviously, the fewer changes, the less risk of bugs — there are situations where upgrades are inevitable. New architectures, new hardware revisions, new required functionality, security issues: all of these might require you to update your software. And when you update software that you customised, the trouble starts. If your customisations are simply hacks, piled up over and over again, you’re going to have to make some sense out of them and apply them again over the new codebase, praying that they’ll apply cleanly, or work at all. If you sent your changes upstream, or at least filed a request for the feature you pushed in, there’s a chance that either the code is already there (and might require just a little tweaking) or at least that someone else kept the patches up to date.

Obviously, this is very low-scale; there are entire projects with consultancy companies behind them offering support, customisation and all the frills needed; on the other hand, this is what I do for a living myself, most of the time. So please, don’t make the mistake of expecting that all Free Software work is done by volunteers, or that it’s never paid for. And don’t try to defend messy code with “nobody was paid to do that”… when you pay people to write code, most of the time they will not make it nice; they’ll most likely try to cut as many corners as possible. Unless, of course, readability and correctness are requirements (which they rarely are).

Health, accounting and backups

For those who said that I have anger management issues regarding last week’s post, I’d like to point out that what I actually had was a nervous breakdown, not strictly (but partly) related to Gentoo.

Since work, personal life, Gentoo and (the last straw) taxes all merged this week, I ended up having to take a break from a lot of stuff; this included putting all kinds of work on hold for the week, and actually spending most of my time making sure I have proper accounting, both for my freelance activity and for home expenses (this is getting particularly important because I’m almost living alone – even if I technically am not – and thus I have to make sure everything fits into the budget). Thankfully, GnuCash provides almost all the features I need. I ended up entering all the accounting information I had available, dating back to January 1st, 2009 (my credit card company’s customer service site hasn’t worked in the past two weeks — since it’s a subsidiary of my own bank, I was able to get the most recent statements through them, but not the full archive of statements since the cards were issued, which is a problem for me), and trying to get some data out of it.

Unfortunately, it seems that while GnuCash already provides a number of reports, it does not have the kind of reports I need, such as “How much money did the invoices from 2009 amount to?” (which is important for me to make sure I don’t go over the limit I’m given), or “How much money did I waste on credit card interest?”… I’ll have to check the documentation and learn whether I can build some customised reports that produce the kind of data I need. And maybe there’s a way to set the payment terms I have with a client of mine (30 days from the end of the month the invoice was issued in… which means that if I issue the invoice tomorrow, I’ll be paid on May 1st).
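For the record, that payment term is easy enough to compute by hand. A minimal Ruby sketch of my reading of it — this is not a GnuCash feature, and due_date is just a name I made up:

```ruby
require "date"

# "30 days from the end of the month the invoice was issued in":
# take the last day of the issue month, then add 30 days.
def due_date(issued)
  end_of_month = Date.new(issued.year, issued.month, -1)
  end_of_month + 30
end

puts due_date(Date.new(2010, 3, 15))  # => 2010-04-30
```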

On a different note, picking up from Klausman’s post, I decided to also fix up my backup system. It was previously based on one-off snapshots of the system on external disks and USB sticks; I moved to a single rsnapshot setup that backs everything up to one external disk: the local system, the router, the iMac, the two remote servers, and so on. This worked out fine when I tried the previous eSATA controller again, but unfortunately it failed once more (d’oh!), so I fell back to FireWire 400 — but that’s way too slow for rsnapshot to do a full backup hourly. I’m thus trying to find a new setup for the external disk. I’m unsure whether to look for a FireWire 800 card or a new eSATA controller. I’m not sure about Linux’s support for the former, though; I know FireWire used to be not too well maintained, so I’m afraid it might just fall back to FireWire 400 speeds, which would be pointless. I’m not sure about eSATA because I’m afraid the problem might not be the controller’s fault but rather a problem with the (three different kinds of) disks, or with the cables; and if the problem is in the controller, I’m worried about the chip on it: the one I have here is a JMicron-based controller, but with a memory chip that is not flashable with the JMicron-provided ROM (and I think there might be a fix in there for my problem), nor with flashrom as it is now.
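For reference, the kind of setup I mean boils down to an rsnapshot.conf along these lines — a sketch with made-up host names, and remember that rsnapshot wants actual tab characters between fields:

```
# /etc/rsnapshot.conf sketch — fields must be separated by TABs, not spaces
snapshot_root	/mnt/backup/

# how many snapshots to keep per interval
interval	hourly	6
interval	daily	7
interval	weekly	4

# local system
backup	/home/	localhost/
backup	/etc/	localhost/

# remote boxes, pulled over SSH
backup	root@router.example:/etc/	router/
backup	flame@imac.example:/Users/	imac/
```

With this in place, cron runs `rsnapshot hourly`, `rsnapshot daily` and so on, and the hard-linked snapshots keep the disk usage down.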

So if you have an idea to suggest about this, I’d be happy to hear it; right now I’ve only found one possibly interesting (price/features) card from Alternate (business-to-business), the “HighPoint RocketRAID 1742”, which is PCI-based (I have a free PCI slot right now, and I could move it to a different box that has no PCI-E if needed) and costs around €100. I’m not sure about driver support for it, though, so if somebody has experience with it, please let me know. Interestingly enough, my two main suppliers in Italy seem not to have any eSATA card at all, and of course high-grade, dependable controllers aren’t found at the nearest Saturn or Mediamarkt (actually, Mediaworld here, but it’s the very same thing).

Anyway, after this post I’m finally back to work on my job.

I fell in love with London

As I said earlier, I wanted to write a bit about my vacation in London, so here it comes — if I can write this and make sense before I fall asleep.

I really, really enjoyed my vacation; maybe because it was my first vacation ever, maybe because it was my first time out of Italy, or maybe because I overcame my fear of planes once and, hopefully, for all. I also fell in love with London, and with the way stuff there seems to work, in a way that, here in Italy, is hard even to imagine. Unfortunately, one week wasn’t really enough to see half the stuff I wanted to see, starting with Broadcasting House, which is why I hope to come back to London before the end of the year.

Now that I can finally fly, I guess I’ll finally be dropping by conferences; I just need to start looking at the calendars so I can set up my schedule. If somebody has a quick way to do that, or some links to conference schedules to check from now on, I’d be happy for the help.

To be honest, I also fell in love with the weather up there, and when I came back I was hit tremendously by the heat here. Given that I don’t have A/C at home, and I have always been against the idea of wasting power on stuff like that, I’m starting to entertain a yet undecided, uncertain idea of spending next summer in London (or thereabouts). Really, for now it’s just a random thought, some vague idea. But at least it would be a cool place, and I wouldn’t (mostly) have problems with the language.

Indeed, the week I spent there didn’t feel extremely out of place for me, since I was a bit in the loop either way, thanks to following general news and the BBC. And I really would love to spend some more time there, with enough time and space to go around visiting Broadcasting House, and maybe some other areas of England.

After all, thankfully, my job does not tie me to any particular location: I rarely have to go to my customers’ workplaces, and especially during the summer this is pretty limited. Just having a laptop, like the one I’m writing on right now, and access to my main box at home (Yamato), allows me to do most of my work without any limitation. This means it wouldn’t be three months of vacation, but rather half out-of-office time.

Also, before I forget: after checking with the local customer service of 3 Italy that this was supposed to be the case, I dared to use the 3G connection with my phone while I was in London, under 3 UK coverage. It was, indeed, included in my monthly 10GB limit, and I didn’t spend any money to use it. In the end, the only euros I “wasted” were €2.50 in SMS (because the hotel I was staying at wasn’t covered by 3 UK, and I used SMS to sync with my friends on what time to wake up) and €2 for a connection on the Orange network, when I didn’t notice I was out of network and thus didn’t kill it soon enough. Not bad, considering a lot of people end up paying a lot for using their phones outside of Italy (and I did use it quite a bit, both to call Italy and the UK itself — some friends already had a UK cellphone).

So I went on vacation and…

And now I get to work 18 hours a day. I wanted to write a bit about my week in London, but my time since the start of the week has been more than entirely spent on two main work projects. I even had to give up one project entirely, because I didn’t have much time left after an urgent consultancy started this week.

I also have very limited time for Gentoo, as you might have noticed; I’m trying to do some of the standard maintenance operations on my packages and on Gentoo as a whole, but I really don’t move at the speed I used to. So please bear with me if the next few weeks are very slow on my Gentoo side.

On a brighter note, I’ve been able to update PulseAudio to the latest test version, so you might want to try that out. And I’ll be working on a few more notes here and there, including Autotools Mythbuster.

What’s going on with me

If you haven’t noticed, I’ve been less involved in Gentoo in the past weeks, and I’m still not sure how long it will continue this way. The main reason is that I’m currently handling multiple work projects at once, and that requires most of my time, stealing it away from Gentoo, which I’m not paid for and which is not involved in my current work projects.

I’m currently fighting the jobs on various fronts; most of what I’m working on is closed-source, some is web-based and some totally isn’t — it’s a real mix and match of problems. At the same time I’m also expanding into other kinds of computer-related services, including standard home/SoHo PC assistance (pushing for Linux whenever it fits), and into reselling/hosting of domains. Probably stuff that most of my colleagues and most Gentoo power users have done or are still doing. I really don’t like this, but it pays the bills.

On that note, I’ve also been trying to expand my static website generator to support multi-language websites; this comes from my own need for a website where I can write about my services for Italian customers, who prefer an Italian-language site, while at the same time serving my usual English-language content. It’s going to be tricky, and I already foresee it requiring me to rewrite most of the code to handle the XML language attributes, but sooner or later I’ll get it to work.

Of course, once I’m done with the website generator side, I’ve got to set up Apache to serve the correct language while still allowing an override, and that’s where the fun is going to start, since I really want the most automation and the most flexibility in a single Apache configuration, with dynamically-served static files!
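What I have in mind is roughly this kind of sketch — untested, with mod_negotiation doing the automatic part and a mod_rewrite cookie check providing the override; the paths and the cookie name are made up:

```
# serve index.html.en / index.html.it automatically based on Accept-Language
<Directory "/var/www/site">
    Options +MultiViews
    AddLanguage en .en
    AddLanguage it .it
    LanguagePriority en it
    ForceLanguagePriority Prefer Fallback
</Directory>

# explicit override: a "lang=it" cookie set by the site wins over negotiation
RewriteEngine On
RewriteCond %{HTTP_COOKIE} lang=it
RewriteRule ^(.+)\.html$ $1.html.it [L]
```

The MultiViews part is standard content negotiation over static files; the rewrite on top is the “allow override” bit, and the cookie would be set by a small language-switcher link on the site itself.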

On other notes, my desktop views are currently full of terminals and other cruft on both computers, Yamato (12 desktops) and Merrimac (9 desktops): terminals with the tinderbox, terminals and Emacs with the feng sources, two MonoDevelop instances (yes, I know it’s crazy), and a number of terminals and Emacs windows split between Yamato and Vanguard (this server), since I spent most of the day helping to track down a caching bug in Typo (now solved, luckily, since without caching the blog tends to take quite a hit).

And finally, if you’re interested in anything at all in particular from me, since I really don’t have much free time myself, you should either wait, ask me, bribe me, or hire me (especially if you’re in Europe, since I can actually invoice you there!). Yes, I should try to sell myself better on my site rather than doing these shameless plugs.

The hardware curse


Those reading my blog or following me on identi.ca might remember I had some problems with my APC SmartUPS, with a used-up battery that didn’t work properly. After the replacement I still had some troubles (which, to be honest, I’m not yet sure are worked out); in the end I settled for getting a new one, to replace the old unit or work side-by-side with it if it turns out to behave properly. This is why in the photo above you can see two UPSes and one box (Yamato), even though there are actually three here (with just one other turned on right now, though).

I’m not sure what caused it, but since a little before I started my activity as a self-employed developer and consultant, I’ve had huge hardware failures: two laptops in the very same week (my mother’s iBook and my MacBook Pro), the drum of the laser printer, the external Iomega HD enclosure (which was already a replacement for the same model failing last November), and lastly the (software) failure of the PCI sound card.

Around the same time I also ended up needing replacements for hardware that was now sub-optimal for my use: the Fast Ethernet switch was replaced by a Gigabit one, because Merrimac – my iMac – is now always turned on and makes heavy use of networking (especially with iSCSI); the hard disks, themselves replaced just last March and helped out by one of the two disks in the external box, started being too small (which is why I got an extra one, a FreeAgent Xtreme running on eSATA, for one more terabyte of storage); my cordless phone required another handset so that my mom’s calls wouldn’t bother me; my cellphone is (just recently) being phased out for a Nokia E75 so I could get a business account (it was a Nokia E71 before); I got an HP all-in-one so that I’d have an ADF scanner to pair with the negative film scanner for archival purposes; and some more smaller things to go with that. I should also update, again, the router: after three years of good service, my 3Com is starting to show its age, and to hit limitations, including the very artificial limit of 32 devices in the WLAN MAC-address list (and the fact that it doesn’t support IPv6).

Then there have been costs much more tied to work (not that the stuff I mentioned isn’t part of my job anyway), like proprietary software licenses (Parallels Desktop, Visual Studio 2008, and soon Microsoft Office, Avira and Mac OS X Snow Leopard) and the smartcard reader. And of course rent for the new vserver (vanguard) and phone bills to pay. Given the amount of work my boxes do during the day, I’ll soon switch the power contract over to me rather than my family and pay for that too, unless I decide to move the office to a real office (possibly one I can stay at at any hour), and just keep one terminal at home or something like that (but then, what would I keep?). Oh, and obviously there are a few more things like business cards and similar.

Now, all these are work expenses, so they are important only up to a point; I actually get paid well enough to cover them at the moment, even though I now have a quite funky wishlist which includes both leisure-related and work-related items (you’re free to help me with both, j/k). The problem is that I would have been much better off without all this mess. Especially considering that, as I have said before, I really wish I could move out of home soon.

But anyway, this is still work time!