
Blocking old user agents

I’ve been looking through awstats and the logs of my blog today, after talking about that with Petteri the other day, and I noticed quite a few interesting things in the list of browser versions.

Besides hits reporting “Firefox/8.10” as the version (most likely because of broken packaging in Ubuntu that reports the user agent as “Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.3) Gecko/2008101315 Firefox/8.10 (intrepid) Firefox/3.0.3”), I got a fair number of pre-2 versions of Firefox, as well as pre-5 versions of Internet Explorer and Netscape, and Firebird/Phoenix-branded browsers.

A rapid check shows that stuff like “Firefox/0.10.0” is just spammers. So this gives me an idea: what if I modify the blog so that comments are disabled if the user agent is too old? Or a known spammer one, or an RSS reader (which cannot leave comments anyway)? Optionally it could reject requests without a User-Agent header too.
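As a rough sketch of the check I have in mind (Python; the function name, version floors, and brand list are all illustrative, not code from any blog engine):

```python
import re
from typing import Optional

# Illustrative version floors, matching the versions noted above;
# a real deployment would tune these against actual logs.
MIN_VERSIONS = {"Firefox": 2, "MSIE": 5, "Netscape": 5}

# Pre-1.0 Firefox brand names, which in my logs only spammers still report.
OLD_BRANDS = re.compile(r"\b(Firebird|Phoenix)/")

def comments_allowed(user_agent: Optional[str]) -> bool:
    """Return False for user agents that are too old, carry a dead
    brand name, or are missing entirely."""
    if not user_agent:
        # Optionally reject requests with no User-Agent header at all.
        return False
    if OLD_BRANDS.search(user_agent):
        return False
    for product, floor in MIN_VERSIONS.items():
        match = re.search(rf"{re.escape(product)}[/ ](\d+)", user_agent)
        if match and int(match.group(1)) < floor:
            return False
    # A real check would also match known RSS reader agents here,
    # since feed readers never legitimately POST comments.
    return True
```

Note that the broken Ubuntu string above would still pass: its bogus “Firefox/8.10” token parses as a version well above 2, which is probably the right outcome, since those hits come from real browsers.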

Now, I know this is not going to be free of false positives, since there are people out there who think that the whole User-Agent header is ruining their privacy and thus intentionally remove it or make it invalid. I sincerely don’t give a crap. I don’t see how User-Agent is a privacy invasion when it’s needed for proper technical reasons.

So anyway, does anybody know if anything like this already exists, or should I be starting from scratch?

Comments 9
  1. Problem is I don’t use Apache :) This site is hosted on lighttpd, and I’d prefer not to replace it anytime soon, although I guess I’ve had some beefs with lighttpd that might warrant that one day or another. Also, I’d rather not exclude access as a whole or just answer 403 on the POST; I was thinking more along the lines of a softfail note on the posts’ pages, “Comments are disabled for your user agent version, please don’t mask your user agent or upgrade to a newer version”, to avoid baffling users of old systems, if there are any (there’s a sketch of this after the comments). Or maybe providing an obnoxious captcha just for those cases.

  2. I’m currently trying out Apache+Passenger to replace lighttpd; I just need to be able to get a permanent redirect map and then I should be sold.

  3. A captcha would indeed be the better solution. Some users are stuck with old browsers.

  4. Interesting idea. I’m not sure how successful it will be, but if you want some basic info on how old each browser is, there is a listing here that gives some live stats: http://webbugtrack.blogspot… Anything not green is old.

  5. Needed for what “proper technical reasons”? The RFC for HTTP 1.1 says including a User-Agent header is a SHOULD, not a MUST. And it says the header is for collecting stats, tracking buggy agents, and working around specific agent limitations — not for denying access.

  6. I don’t think that when HTTP/1.1 was designed there was as much spam as there is now. And sure thing, working around specific agent limitations seems just the right definition: I’d be working around the specific limitation of some user agents being just pure and simple spammers. And I still find it nicer to deny access to comments based on user agent than having to type stupid captchas every time.

  7. Just to spite you, I’ve been browsing your site using my NCSA Mosaic 2.7b6. You wouldn’t block the first real browser, now would you 😉

  8. There is Bad Behavior which is more or less the same idea, only more advanced… but it’s in PHP.
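
On the softfail idea from the first comment: a minimal sketch of how the comment section could degrade into a notice rather than a hard 403 (hypothetical names throughout; comments_allowed() is the check sketched in the post, and render_form stands in for whatever the blog engine uses to emit the real form):

```python
DISABLED_NOTICE = (
    '<p class="comments-disabled">Comments are disabled for your user agent '
    "version; please don't mask your user agent, or upgrade to a newer "
    'browser.</p>'
)

def comment_section_html(user_agent, render_form):
    """Return the real comment form for acceptable user agents, and a
    softfail notice for everyone else, instead of a bare 403 on POST."""
    if comments_allowed(user_agent):
        return render_form()
    return DISABLED_NOTICE
```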
