I’m a Geek!

The title, by itself, is pretty obvious; I’m a geek, otherwise I wouldn’t be doing what I’m doing, even with the bile refluxes I end up having, just for the gratitude of a few dozen users (whom I thank once again from my heart; you make at least part of the insults I receive bearable). But it’s more of a reminder for those who have been following me for a long time, and who might remember that I started this blog over four years ago, using Rails 1.1 and a Gentoo/FreeBSD install.

Well, at the time my domain wasn’t “flameeyes.eu”, which I only bought two years ago, but rather the more tongue-in-cheek Farragut.Flameeyes.Is-A-Geek.org, where Farragut was the name of the box (contributed by Christoph Brill, it served as the main Gentoo/FreeBSD testing and stagebuilding box until its PSU and motherboard gave up).

At any rate, I’ve been keeping the hostname, hoping one day to be able to phase it out entirely and get rid of it; this because, while at the start it was easy to keep it updated, DynDNS has been pressing more and more for free users to move to the “pro” version. Up to now I’ve just been refreshing it whenever it was on the verge of expiring, but since the latest changes would not allow me to properly re-register the same hostname if I let it lapse, and a lot of websites still link to my old addresses, I decided to avoid problems and scams, and registered the hostname with DynDNS Pro for two years, which means it won’t risk expiration.

Given that situation, I decided to change the Apache (and AWStats) configuration so that the old URLs for the blog and the site no longer redirect straight to the new ones, but rather accept the request and serve the page normally. Obviously, I’d still prefer the new canonical name to be used. Hopefully, at some point, browsers and other software will support the extended metadata provided by OpenGraph, which not only breaks the title down into site and page titles (rather than the current mess of different separators between the two in the <title> element!), but also provides a “canonical URL” value that can solve the multiple-hostname problem as well (yes, that means that if you link one of my posts on Facebook with the old URLs, it will automatically be translated to the new, canonical URL).
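For reference, the OpenGraph side of this boils down to a handful of meta elements in the page head; something along these lines (the property names are OpenGraph’s own, the values, hostname included, are just made up for the example):

<!-- property names are OpenGraph's; the values here are made up -->
<meta property="og:site_name" content="Flameeyes's Weblog" />
<meta property="og:title"     content="I'm a Geek!" />
<meta property="og:url"       content="http://blog.flameeyes.eu/some-post" />

The og:url value is what Facebook and similar consumers treat as the canonical address of the page, whichever hostname it was actually fetched from.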

But it doesn’t stop here; in the spirit of old times, I also ended up looking at some of the articles I wrote around that time, or actually before that time, for NewsForge/Linux.com (as I noted in my previous post). At the time I wasn’t even paid for them; the only requirement was a one-year exclusive, and the last one was published in December 2005, so the exclusivity definitely expired a long time ago. So, since their website (now just Linux.com, and under a different owner as well) is degrading (broken links, comments rendered with different text formatting, spam, …), I decided to re-publish them on my own website in the newly refurbished articles section and, wait for it…

I decided to re-license all three of the articles I wrote in 2005 under the Creative Commons Attribution-ShareAlike license.

Update (2017-04-21): as it happens my articles section is now gone and instead the articles are available as part of the blog itself.

Okay, nothing exceptional I guess, but given there were some doubts about my choice of licenses, this actually makes a good chunk of my work available under a totally free license. I could probably ask Jonathan whether I could do something similar with the articles I wrote for LWN, but since that site is still well maintained I see no urgency.

I should also convert the rest of the articles on that index page from PDF/TeX to HTML, but that is not high on my priority list either.

Finally, I’m still waiting on the FSF to give me an answer regarding the FSWS licensing: Matija helped me adapt the Autoconf exception into something usable for the web, but unfortunately the license of the exception itself is quite strict, so I had to ask the FSF for permission to use it. The request has been logged in their RT; I hope it will get me an answer soon. Who knows, FSWS might be my last “gift” before I throw in the towel.

Yes, again more static websites

You might remember I like static websites and that I’ve been working on a static website framework based on XML and XSLT.

Upon necessity, I’ve added support for multi-language websites to that framework; this is both because people asked for my website to be translated into Italian (since my assistance customers don’t usually know English, at least not that well), and because I’ll soon be working on the website for a metal band that has to be available in both languages too.

Now, making this work in the framework wasn’t an easy job: as it stands, there is a single actual XML document that the stylesheet, with all its helper templates, gets applied to. It already used a two-pass translation, so that custom elements (like the ones I use for the projects page of my site – yes, I know it gets stuck while loading) are processed properly and translated into proper fsws elements.

To make this work I applied a similar method (although now I’m starting to feel like I did it in the wrong order): once for each language the website is written in, I create a temporary document that keeps only the elements that either have no xml:lang attribute or carry the proper language in it. Then I apply the rest of the processing to that data.
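In XSLT terms, the filtering pass boils down to something like the following sketch; this is not the actual fsws code, and the lang parameter name is just mine, but it shows the idea of an identity copy that drops the elements tagged for another language:

<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:param name="lang" select="'en'"/>

  <!-- Identity copy: everything goes through untouched... -->
  <xsl:template match="@*|node()">
    <xsl:copy>
      <xsl:apply-templates select="@*|node()"/>
    </xsl:copy>
  </xsl:template>

  <!-- ...except elements carrying xml:lang, which are kept only when
       they match the language being generated. -->
  <xsl:template match="*[@xml:lang]">
    <xsl:if test="@xml:lang = $lang">
      <xsl:copy>
        <xsl:apply-templates select="@*|node()"/>
      </xsl:copy>
    </xsl:if>
  </xsl:template>
</xsl:stylesheet>

Run once per language (with libxslt that would be something like xsltproc --stringparam lang it), it produces the temporary per-language document that the rest of the processing then works on.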

Since all the actual XHTML translation happens in the final pass, this pass becomes almost transparent to the rest of the processing, and at the same time pages like the articles index can share the whole list of articles between the two versions, since I just change the section element of the intro instead of creating two separate page descriptions.
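To give an idea of what that looks like in a page source, here is a made-up sketch (the element names are illustrative, not the actual fsws markup): the filter above keeps only the matching intro and leaves the shared article list alone in both outputs.

<page>
  <intro xml:lang="en">
    <p>The articles I have written over the years.</p>
  </intro>
  <intro xml:lang="it">
    <p>Gli articoli che ho scritto negli anni.</p>
  </intro>
  <articles>
    <!-- shared between the two language versions -->
    <article href="some-article"/>
    <article href="another-article"/>
  </articles>
</page>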

Now, I’ll be opening up fsws one day, once this is all sorted out, documented and so on; for now I’m afraid it’s still too much in flux to be useful (I haven’t written a schema of any kind just yet, and I want to do that soon so I can at least validate my own websites). I can, though, share the configuration I’m currently using to handle the translation of the site. As usual, I don’t rely on any kind of dynamic web application to serve the content (which the framework generates in static form), but rather on Apache’s mod_negotiation and mod_rewrite (which ship with the standard distribution).

This is the actual configuration that vanguard is using to serve the site:

AddLanguage en .en
AddLanguage it .it

DefaultLanguage en
LanguagePriority en it
ForceLanguagePriority Fallback

RewriteEngine On

# Redirect the root (with an optional language prefix) to the home page.
RewriteRule ^(/[a-z]{2})?/$     $1/home [R=permanent]

# Turn the /xx/page prefix into a page.xx suffix, unless the request
# already names an existing file, so that mod_negotiation can pick it up.
RewriteCond %{REQUEST_FILENAME} !-F
RewriteRule ^/([a-z]{2})/(.+)$ /$2.$1

(I actually have a few more rules in that configuration file but that’s beside the point now).

Of course this also requires the MultiViews option to be enabled, since that’s what makes Apache pick up the correct file without keeping type-map files around. Since the files are all named like home.en.xhtml and home.it.xhtml, requesting the language as an explicit suffix lets Apache pick the correct file without any extra per-file configuration.

Right now there are a few more things that I have to work on; for instance, the language selection at the top should really bring you to the other language’s version of the same page, rather than to the homepage. Also, a single-language site currently only works if you never use xml:lang at all; I should special-case that. For this to work I have to add a little more code to the framework, but it should be feasible in the next few weeks. Then there are some extra features I haven’t even started implementing but have just planned: an overlay-based photo gallery, and some calendar management for ICS and other things like that.

Okay, this should be it for the teasing about fsws; I really have to find the time to set up a repository for, and release, my antispam rules, but that will have to wait for next week, I guess.

The xine website: design choices

After spending a day working on the new website, I’ve been able to identify some of the problems and design some solutions that should produce a good enough result.

The first problem is that the original site did not only use PHP and a database, it also misused them a lot. The usual way to use PHP to avoid duplicating the style is to have a generic skin template, and then one or more scripts per page that get included in the main one depending on parameters. This results in a mostly-working site that, while doing lots of work for nothing, still does not bog down the server with unneeded work.

In the case of xine’s site, the whole thing loaded either a static HTML page that got included, or a piece of PHP code that would define variables which, care of the main script, would then be substituted into a generic skin template. The menu would not be written once either, but generated on the fly for each page request. And almost all the internal links in the pages would be generated by a function call. Adding the padding to the left side of the menu entries for sub-pages was done by creating (with PHP functions) a small table before the image and text that formed the menu link. On top of all this, the SourceForge logo was cycling on a per-second basis, which meant that a user browsing the site would load about six different SourceForge images into the cache, and that no two requests would ever get the same page.

The download, release, snapshots and security pages loaded their data on the fly from a series of flat files containing some metadata, and from that produced the output you’d have seen. And to add client-side time-wasting to what was already a waste on the server side, the changes in shade of the left-hand menu were done using JavaScript rather than the standard CSS2 :hover pseudo-class.

Probably because of the bad way the PHP code was written, the site had all crawlers blocked by robots.txt, which is a huge setback for a site aiming to be public. Indeed, you cannot find it in Google’s cache because of that, which meant that last night I had to work with the Wayback Machine to see how the site appeared earlier; and the snapshot was from one year ago, not what we had a few weeks ago. (This has since stopped being a problem, as Darren gave me a static snapshot of the thing as seen on his system.)

To solve these problems I decided a few things for the new design. First of all, as I’ve already said, it has to be entirely static after each modification, so that the files served are just the same for each request. This includes removing visit counters (who cares nowadays, really) and the changing SourceForge logo. It ensures that crawlers and users alike will see the exact same content over time if it doesn’t change, keeping caches happy.

Also, all the pages will have to hide their extensions, which means that I don’t have to care whether the page is .htm, .html or .xhtml. Just like on my site, all the extensions will be hidden, so even a switch to a different technology will not invalidate the links. Again, this is for search engines and users alike.

The whole generation is done with standard XSLT, without implementation-dependent features, which means it will work with libxslt just as well as with Saxon or anything else; I’m going to use libxslt for now, since that’s what I use for my site as well. By sticking to standard technologies it’s possible to reuse the whole thing in the future without depending on particular library versions and the like. And thanks to the way XSLT is designed, it’s very easy to decouple the content from the style, which is exactly what a good site should do to be maintainable in the long run.
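To give an idea, the skin is little more than a stylesheet that matches the root of a page description and wraps its content into the shared layout; here is a minimal sketch (the page element, its title attribute and the CSS path are made up for the example, not the actual xine sources):

<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                xmlns="http://www.w3.org/1999/xhtml">
  <xsl:output method="xml" indent="yes"/>

  <!-- The skin: one shared layout wrapped around whatever content a
       page provides. -->
  <xsl:template match="/page">
    <html>
      <head>
        <title>xine: <xsl:value-of select="@title"/></title>
        <link rel="stylesheet" type="text/css" href="/default.css"/>
      </head>
      <body>
        <div id="menu">
          <!-- the shared menu is generated here, once, for every page -->
        </div>
        <div id="content">
          <xsl:apply-templates/>
        </div>
      </body>
    </html>
  </xsl:template>

  <!-- Plain XHTML in the page sources is copied through untouched. -->
  <xsl:template match="*">
    <xsl:copy>
      <xsl:copy-of select="@*"/>
      <xsl:apply-templates/>
    </xsl:copy>
  </xsl:template>
</xsl:stylesheet>

The content files then contain little more than plain XHTML, and changing the look of the whole site means touching only this one stylesheet and the CSS.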

Since I dislike custom solutions, I’ve been trying very hard to avoid using custom elements and custom templates outside the main skin; the idea is that XHTML usually works by itself, and a proper CSS takes care of most of the remaining stuff. This isn’t too difficult once you get around the fact that the original design was entirely based upon tables rather than proper div elements; the whole thing has been manageable.

Besides, with this method, adding a dynamically-generated (but statically-served) sitemap is also quite trivial, since it’s just a different stylesheet applied over the same general data as the rest of the site.
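Just as a sketch: assuming a hypothetical site index document listing every page (again, made-up element names, not the actual xine sources), a search-engine sitemap is nothing more than another small stylesheet over it:

<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="xml" indent="yes"/>

  <!-- Emit a sitemaps.org urlset from the same page list the menu is
       built from; the base URL is just an example. -->
  <xsl:template match="/site">
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <xsl:for-each select="page">
        <url>
          <loc>http://xine-project.org/<xsl:value-of select="@name"/></loc>
        </url>
      </xsl:for-each>
    </urlset>
  </xsl:template>
</xsl:stylesheet>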

Right now I’m still working on fixing up the security page, but the temporary, not-yet-totally-live site is available for testing, and the repository is also accessible if you wish to see how it’s actually implemented. I’ve actually added a few refinements to the xine site that I didn’t use for my own, but those will come with time.

The site does not yet validate properly, but the idea is that it will once it’s up; I “just” need to get rid of the remaining usage of tables.

The xine website: intro

As it turns out, the usual xine website went offline a few days ago. Since then, Darren has set up a temporary page on the SourceForge.net servers, and I’ve changed the redirect of xine-project.org, which is now sort-of live with the same page that was on SourceForge.net, plus the xine-ui skins ready to be downloaded.

Since this situation cannot be left as it is for much longer, I’ve decided to take up the task of rebuilding the site on the new domain I acquired to run the Bugzilla installation. Unfortunately the original site (which is downloadable from the SourceForge repositories) is written in PHP, with MySQL used for user polling and news posting, and the whole thing looks like a contraption I don’t really want to run myself. In particular, the site itself is pretty static; the only real use of PHP in it is to avoid writing boilerplate HTML for each release, writing a file describing it instead, which is something I used to do myself for my site.

Since having a dynamic website for static content is far from my usual work practices, I’m going to do just what I did for my own website: rewrite it in XML and use XSLT to generate the static pages to be served by the webserver. This sounds complex but it really isn’t, once you know the basic XML and XSLT tricks, which I’ve learnt, unfortunately for me, over time. On an interesting note, when I worked on my most complex PHP project, a custom CMS (back when CMSes weren’t this widespread!) for an Italian gaming website, now dead, I already looked into using XSLT for the themes, but at the time support for it in PHP was almost never enabled.

I’m still working on it and I don’t count on being able to publish it this week, but hopefully once the site is up again it will be entirely static content. And while I want to keep all the previously-available content, and keep the general design, I’m going to overhaul the markup. The old site is written mostly using tables, with very confused CSS and superfluous spacer elements. It’s not an easy task, but I think it’s worth doing, especially since the result should be much more usable for mobile users, of which I’m one from time to time.

If I find some interesting technicalities while preparing the new website, I’m going to write about them here, so keep reading if you’re interested.