David Aha has some information online, which includes links to some of his papers on IBL, answers to a couple of questions about running IBL, and the source code.
You will find the executable for IBL in /u2/admin/cs572ta/bin. If you want, you may download the C source code or view David Aha's dissertation, ML journal article, IJMMS article, IJCAI-89 article, and IMLW-89 article, all of which are on IBL.
There is a bug (perhaps a feature?) in the current executable for IBL that limits the number of training examples it can handle to about 2500. I would imagine that this limit is hard-coded and could be overcome by downloading the source code and entering a larger number somewhere.
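If you do download the source, one way to hunt for the hard-coded limit is to search the tree for the literal constant. This is only a sketch: the directory name ibl-src is a placeholder for wherever you unpack the code, and it assumes the limit appears verbatim as 2500 in the source.

```shell
# Hypothetical: search the unpacked IBL source tree ("ibl-src" is a
# placeholder) for the training-example limit as a literal constant.
grep -rn "2500" ibl-src/ 2>/dev/null || echo "constant not found as a literal"
```

If grep turns up nothing, the limit may be expressed differently (e.g. as a named #define), so searching for array-size declarations is a reasonable next step.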
The format for the ibl command is:
ibl description namesfile trainfile testfile outputfile seed [options]
To create data for ibl, run xlate on your data sets with the "-a ibl" option. For example, xlate -f echoc -a ibl will create the following files for ibl: echoc.desc, echoc.names, echoc.train, and echoc.test.
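As a quick sanity check after running xlate, you can confirm that all four input files were produced before invoking ibl. This is a small sketch using the echoc dataset from the example above; substitute your own dataset stem.

```shell
# Check that xlate produced all four IBL input files for the "echoc"
# dataset (the stem is from the example above; swap in your own).
for ext in desc names train test; do
  f="echoc.$ext"
  [ -f "$f" ] && echo "found $f" || echo "missing $f"
done
```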
You can then run ibl on these files by typing:
ibl echoc.desc echoc.names echoc.train echoc.test iblresults.text 10
The results will be saved to the iblresults.text file. Here is what the screen output looks like. A description of what the results mean, the various optional parameters, and a few other items can be found in ibl.text.
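The whole workflow can be wrapped in a short script. This is a dry-run sketch: it only prints the xlate and ibl commands it would run (using the dataset stem, output file, and seed from the example above), so you can verify the command lines before executing anything.

```shell
# Dry-run sketch of the full pipeline for one dataset stem.
# "echoc", "iblresults.text", and seed 10 follow the example above;
# remove the echo prefixes to actually run the tools.
stem="echoc"
seed=10
echo "xlate -f $stem -a ibl"
echo "ibl $stem.desc $stem.names $stem.train $stem.test iblresults.text $seed"
```

Parameterizing on the stem makes it easy to rerun the same pipeline on other datasets, or to loop over several seeds when comparing runs.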