Integrating LORLS with Koha
At Loughborough we use Ex Libris’s Aleph as our Library Management System (LMS). Our LORLS reading list system also makes use of Aleph in a number of ways:
- To provide bibliographic information via Z39.50, allowing people to quickly add works to reading lists by simply entering the ISBN,
- In batch jobs to check which works the library holds, again using Z39.50,
- To find how many items for a work are held and how many are available for loan, which is displayed on a work's detail pop-up in CLUMP. This has been done using the Ex Libris X Server product (not to be confused with X11 servers!),
- To build up lists of current and past loans, used for purchase prediction, high demand reporting and usage tracking. This is mostly done using custom local Perl scripts that interrogate Aleph's Oracle database directly and then provide an XML API that LORLS code calls.
We've recently had another site say they are interested in using LORLS, but they use the Koha LMS. They asked us what would be needed to allow LORLS to integrate with Koha. Luckily Koha is open source, so we could just download a copy, install it on a new virtual machine and have a play with it to see what was needed.
Looking up bibliographic information was pretty simple to support. Koha has a built-in Z39.50 server, so all we needed to do was tweak the LUMP.pm file on our dev server so that Z3950Hostname(), Z3950Port() and Z3950DBName() point to our new VM, the port that Koha's Z39.50 server is running on, and the new Z39.50 database name (which appears to default to "biblios"). That seemed to work a treat.
Getting item holdings for works was a bit more involved. Koha obviously doesn't have Ex Libris's X Server, so we needed another way to get similar data. Luckily Koha does implement some of the Digital Library Federation's Integrated Library System – Discovery Interface (ILS-DI) recommendations. One of these Application Programming Interfaces (APIs) is called GetRecords() and, given a set of system record identifiers, returns an XML document with just the sort of information we need (e.g. for each item linked to a work we get things like the item type, whether it can be loaned, its due date if it is on loan, whether it is damaged or lost, etc.).
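To give a feel for what our code does with a GetRecords() response, here is a minimal sketch in Python (the real LORLS code is Perl) that tallies held and available items per item type. The XML element names below are assumptions based on the kind of per-item data the API returns – check a real response from your Koha before relying on them.

```python
# Sketch: summarise per-item availability from a Koha ILS-DI
# GetRecords-style XML response.  Element names are assumptions.
import xml.etree.ElementTree as ET

SAMPLE = """\
<GetRecords>
  <record>
    <biblionumber>42</biblionumber>
    <items>
      <item>
        <itemtype>BK</itemtype>
        <notforloan>0</notforloan>
        <onloan>2024-03-01</onloan>
      </item>
      <item>
        <itemtype>BK</itemtype>
        <notforloan>0</notforloan>
        <onloan/>
      </item>
    </items>
  </record>
</GetRecords>
"""

def summarise(xml_text):
    """Return a dict of item type -> (items held, items available)."""
    counts = {}
    root = ET.fromstring(xml_text)
    for item in root.iter("item"):
        itype = item.findtext("itemtype", "UNKNOWN")
        held, avail = counts.get(itype, (0, 0))
        held += 1
        # Treat an item as available if it is loanable and has no due date.
        if item.findtext("notforloan") == "0" and not item.findtext("onloan"):
            avail += 1
        counts[itype] = (held, avail)
    return counts
```

Running `summarise(SAMPLE)` on the sample above counts two BK items held, one of them available.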
Unfortunately LORLS doesn't know anything about Koha's system record identifiers, and the GetRecords() ILS-DI API doesn't appear to allow searches based on control numbers such as ISBNs. To get the system record numbers we can, however, fall back on Z39.50 again, which Koha also uses to implement the ILS-DI SRU-type interfaces. Searching for a Bib-1 attribute of type 1007 with the value set to the ISBN gets us some nice USMARC records to parse. We need to look at the 999 $c field in the MARC record, as this appears to be the Koha system record identifier.
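The two halves of that lookup can be sketched as follows – building the Bib-1 query (shown in PQF notation; attribute 1007 is the standard-identifier access point) and pulling the 999 $c out of the returned record. For brevity the extraction is shown against a MARCXML rendering of the record rather than raw USMARC:

```python
# Sketch: ISBN -> Koha system record number via Z39.50.
import xml.etree.ElementTree as ET

def isbn_query(isbn):
    """Build a PQF query using Bib-1 use attribute 1007 (standard id)."""
    # Strip hyphens so "978-0-13-110362-7" and "9780131103627" match.
    return '@attr 1=1007 "%s"' % isbn.replace("-", "")

MARCXML = """\
<record xmlns="http://www.loc.gov/MARC21/slim">
  <datafield tag="999" ind1=" " ind2=" ">
    <subfield code="c">42</subfield>
    <subfield code="d">42</subfield>
  </datafield>
</record>
"""

def koha_biblionumber(marcxml):
    """Return the 999 $c subfield - the Koha system record identifier."""
    ns = {"m": "http://www.loc.gov/MARC21/slim"}
    root = ET.fromstring(marcxml)
    for df in root.findall('m:datafield[@tag="999"]', ns):
        sub = df.find('m:subfield[@code="c"]', ns)
        if sub is not None:
            return sub.text
    return None
```

The query string would be handed to whatever Z39.50 client library is in use; the sample record here is hand-made for illustration.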
Just to make things interesting we discovered accidentally that in Koha you can end up with more than one work for the same ISBN (and each work can then have multiple physical items). I guess this is a flexibility feature in some way, but it means that we need to make sure that we get all the system record identifiers that match our ISBN from LORLS and then pass all of these to the Koha GetRecords() API. Luckily the API call can take a whole set of system record identifiers in one go, so this isn’t too much of a problem.
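Passing the whole set of identifiers in one call just means joining them into a single request. A hedged sketch, assuming the usual ilsdi.pl endpoint location and a plus-separated id parameter (the hostname is made up):

```python
# Sketch: one GetRecords request covering every biblionumber that
# matched the ISBN.  Base URL and parameter layout are assumptions
# about a typical Koha ILS-DI install.
from urllib.parse import urlencode

def getrecords_url(base, biblionumbers):
    # urlencode turns the spaces into "+", giving id=42+57 style lists.
    query = urlencode({"service": "GetRecords",
                       "id": " ".join(str(b) for b in biblionumbers)})
    return "%s/cgi-bin/koha/ilsdi.pl?%s" % (base, query)
```

So `getrecords_url("http://koha.example.ac.uk", [42, 57])` yields a single URL requesting both records at once.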
One thing we do have to have in the code though is some way of distinguishing between loan categories (long loan, short loan, week loan, reference, etc). In Koha you can create an arbitrary number of item types which can correspond to loan categories, to which you can then assign things like differing loan rules. We slipped in:
- BK – a normal long loan book (the default in Koha, it seems),
- REFBOOK – a book that can't be loaned,
- SL BOOK – a short loan book (usually loaned for less than a day – our "high demand" stock),
- WL BOOK – a book that can be loaned for a week (effectively moderately in-demand works).
Our code currently has these hard coded in order to return the same sort of holdings Perl structure that Aleph did. Extra item types assigned in Koha will need to be inserted into this code – we might have to think of a "nice" way of doing this if folk make lots of these changes on a regular basis, but I suspect item types are one of those things that are configured when an LMS is set up and rarely, if ever, tweaked again.
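The hard-coded mapping amounts to a simple lookup table. A sketch in Python (the actual code is a Perl structure; the loan category names are illustrative):

```python
# Sketch: Koha item type -> LORLS loan category.  Any extra item
# types configured on a local Koha would need adding here.
ITEM_TYPE_TO_LOAN_CATEGORY = {
    "BK":      "Long Loan",
    "REFBOOK": "Reference",
    "SL BOOK": "Short Loan",
    "WL BOOK": "Week Loan",
}

def loan_category(item_type):
    # Fall back to treating unknown item types as ordinary long loan stock.
    return ITEM_TYPE_TO_LOAN_CATEGORY.get(item_type, "Long Loan")
```

A config-file version of this table would be one "nice" way of coping with sites that add lots of item types, though as noted above that may be over-engineering.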
In the StructuralUnit.pm Perl module I created a new method called Koha_ILS_DI_Holdings() to implement the new Koha item holdings and availability code. The existing Holdings() method was renamed to Aleph_Holdings(), and a new Holdings() method was implemented that consults a new Holdings() entry in LUMP.pm for the name of the holdings retrieval algorithm to use (currently either "Koha:ILS-DI", which selects the new code, or anything else, which falls back to the old Aleph_Holdings() method). This means that if someone else comes along with XYZ Corp LMS that uses some other wacky way of retrieving holdings availability, we can simply write a new method in StructuralUnit.pm and add another if clause to the Holdings() method, allowing a quick LUMP.pm change to select it. The advantage of this is that other code in LORLS that uses the Holdings() method from StructuralUnit.pm doesn't have to be touched – it is insulated from the messy details of which implementation is in use (ooh, object oriented programming at work!).
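The dispatch idea can be boiled down to a few lines. Shown here in Python for brevity – the real code is Perl in StructuralUnit.pm, and the two backend functions are placeholders standing in for the actual lookups:

```python
# Sketch: Holdings() dispatching on the algorithm name from LUMP.pm.
def koha_ilsdi_holdings(isbn):
    return "holdings via Koha ILS-DI for %s" % isbn   # placeholder

def aleph_holdings(isbn):
    return "holdings via Aleph X Server for %s" % isbn  # placeholder

def holdings(isbn, algorithm="Koha:ILS-DI"):
    # Any algorithm name other than "Koha:ILS-DI" falls back to the
    # Aleph implementation, matching the behaviour described above.
    if algorithm == "Koha:ILS-DI":
        return koha_ilsdi_holdings(isbn)
    return aleph_holdings(isbn)
```

Supporting a hypothetical new LMS is then just one more function plus one more if clause, with callers of `holdings()` untouched.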
This appears to work on our test setup, and it means we can just ship a new StructuralUnit.pm and LUMP.pm file to the folk with Koha and see how they get on with it. Our next trick will be getting loan history information out of Koha – this may take a bit more work, and it's really replacing functionality that we've mostly implemented solely for Loughborough (using our custom scripts, as even Aleph didn't provide a usable API for what we needed). It doesn't appear at first glance that Koha's ILS-DI APIs cover this use case – I guess we're a bit odd in being interested in work loan histories!