… you can make your own, or at least try.
I maintain in Portage a little ebuild for uif2iso; as you probably already know, the foo2iso tools are used to convert various types of proprietary disk images produced by Windows software into ISO9660 images that can be used under Linux. Quite obviously, unit testing such a tool is pointless, but regression testing at the tool level might actually work. Unfortunately, for obvious reasons, upstream does not ship testing data.
Not exactly happy with this, I started considering what solutions I had, and came to my decision: if upstream does not ship a testsuite, I'll make one myself. The good thing with ebuilds is that you can write whatever test you want in src_test. I finally decided to build a UIF image using MagicISO on my Windows XP vbox, download it together with the MD5 digests of the files I'd put in it (conditionally on the test USE flag), and during the test phase convert it to ISO, extract the files, and check that the MD5 digests match.
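The digest side of that plan is just standard md5sum usage; here is a minimal sketch, with made-up file names (track01.ogg stands in for whatever content goes into the image):

```shell
# Before building the image: record the digests of the content files.
# track01.ogg is a hypothetical name for the test track.
echo "test content" > track01.ogg
md5sum track01.ogg > contents.md5

# In the test phase, after converting the UIF back to ISO and
# extracting it, verify the extracted files against the digests;
# md5sum -c exits non-zero if anything mismatches.
md5sum -c contents.md5
```

The contents.md5 file travels alongside the UIF images, so the test phase only needs md5sum itself.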
Easier said than done.
To start with, I had some problems deciding what to put in the image; of course I could have used some random data, but I thought that at that point I could at least make it fun for people who downloaded the test data and wanted to look at it. My choice fell on finding some Creative Commons-licensed music and using a track from it. After some looking around, I settled on the first track of Break Rise Blowing by Countdown on Jamendo.
Now, the first track is not too big, so it's not a significant overhead to download the test data, but there is another point here: MagicISO offers three algorithms: default, best compression, and best speed. Most likely they are three compression levels of LZMA or something along those lines, but just to be safe I put all three of them to the test. The resulting file with the three UIF images and the MD5 checksums came to less than 9MB, an acceptable size.
At that point, I started writing the small testsuite, and the problems started: uif2iso always returns 1 at exit, which means you can't use || die or it would always die. Okay, fair enough: just check that the file was created. Then you have to take the files out, and nothing could be easier when you have libarchive, which can extract ISO files as if they were tarballs; just add it as a dependency when the test USE flag is enabled. A bit of overhead, but at least I can easily extract the data to test.
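The workaround for the exit-status quirk can be sketched in plain shell; fake_uif2iso below is a stand-in I made up to mimic uif2iso's behaviour of exiting 1 even on success:

```shell
#!/bin/sh
# Stand-in for uif2iso: it does its job but still exits with status 1,
# so "fake_uif2iso ... || die" would abort even when everything worked.
fake_uif2iso() {
    echo "iso data" > "$2"
    return 1
}

fake_uif2iso input.uif output.iso   # exit status deliberately ignored

# Judge success by the presence of the output file instead:
if [ -f output.iso ]; then
    echo "conversion OK"
else
    echo "conversion failed" >&2
    exit 1
fi
```

In the real src_test the same pattern applies: run uif2iso, ignore its status, die only if the ISO is missing, then hand the image to libarchive's bsdtar for extraction.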
It seems instead that the ISO file produced by uif2iso is going to be a test for libarchive, since the latest release fails to extract it. I mailed Tim and I hope he can fix it up for the next release (Tim is fantastic with this: when 2.5.902a was released, I found a crasher on a Portage-generated binpkg; I just had to mail it to him, and in the next release it was fixed!). The ISO file itself seems fine, since loop-mounting it works just fine. The problem is that I know of no other tool that can extract ISO images quickly, without being commanded file by file (iso-read from libcdio can do it, it's just too tedious); if somebody has suggestions, I'm open to them.
This is the fun that comes out of writing your own test cases, I guess. On the other hand, I think it's probably a good idea to keep the problematic archives around, as long as there are no license problems (Gentoo binpkgs might have some, since they are built from sources and you'd have to distribute the sources with the binaries, which is why I wanted Creative Commons-licensed content for the images): that lets you test things that broke before, to ensure they never break again. Which is probably the best part of unit, integration and system testing: you take a bug that was introduced in the past, fix it, and write a test so that, if it is ever reintroduced, it is caught by the tests rather than by the users again.
Has anybody said FATE?
You say: "Quite obviously, making unit testing out of such a tool is pointless." I'm not sure I agree. Unit tests are different from integration and regression tests. I think of unit tests as white-box testing whereas integration tests are black-box testing. Regression tests are integration tests created from actual problems you've encountered over time.

There's a school of programming that advocates writing short, modular functions/methods/modules/programs. The unit tests assist in finding bugs quickly during development and assist in developing these little pieces. So if your program has these little modular routines, you'd stub out or create a "mock object" for the libcdio part, for example. That way you could test the part that does whatever it needs to, independent of whether other code like libcdio is correct.

There are two other advantages of unit tests. First, they document the code, e.g. how to use it (since that's what they do :-). And this "documentation" is pretty much guaranteed to be kept accurate, because when it breaks, everyone knows. Second, the tests tend to be pretty fast, so if you've introduced an error you can find out quickly.

That said, there has been a tendency to think of unit tests as the panacea of testing. They're not. Of course what people care about is that the program gives the right output for some given input, however it manages to do it. Whether little pieces of it work in isolation is not crucial (although in practice you'd never get there if the pieces were buggy). If you refactor code or change any of the public or internal APIs, the unit tests which use them will likely have to be changed.
However, it's possible and even likely that the end-to-end result is the same after a major refactoring, so integration tests may be less subject to change as the code gets rewritten.

Yehuda Katz in his 2008 Ruby Conference talk kind of bashes unit tests, and you probably share similar views: http://rubyconf2008.confrea… However, I don't think he addresses the full story. If you are writing code where you already know, or have a good idea of, what the outcome will be, it makes sense to rigidly adhere to the interfaces you've designed. But sometimes, when you are doing something totally new, you don't know what the interface will be like, and even if you think you do, it is likely to change a great deal. Then casting things in concrete requires a lot of overhead for making changes, reducing "agility", much in the same way that redoing an API requires changing all the unit tests that use it. Pick your poison.
When I started reading this blog I thought the obvious way to test the output was an MD5 checksum. I'm not sure whether this would stay stable across releases, but at least for one version the output should always be the same?
Oh, I'm not advocating against unit tests at all. If you look at Ruby Elf's repository, the big chunk of it is unit testing, although now I'm trying to add some more complete testing of the tools themselves, since unit tests alone are not enough. But for things like uif2iso I don't really see much sense in writing full-blown unit testing. On the other hand, I'm not a developer of it, I just package it up.

Christian, yes, the MD5 of the ISO file would change between versions, and my reason for writing regression tests is to make sure that the _conversion_ happens properly across releases. If a future release makes the file much smaller by removing useless data, or something along those lines, I want it to stay correct anyway. That's why I care about the integrity of the data first; after that I can work on checking the ISO metadata too, but I want to ensure the content is what I expect.
http://www.gnu.org/music/fr…

* running away