The Loughborough Online Reading List System (LORLS) is the system we developed here at Loughborough to manage the resources for directed student reading. Academics put their reading lists into the system and students can then see them, both directly via a web interface and also embedded in the University’s Moodle Virtual Learning Environment. LORLS also provides the library with management information, allowing staff to adjust stock and to check which modules have up to date reading lists available. To do this, LORLS links into other University IT systems such as the student data, the Active Directory and the Library Management System (LMS).
We started to write LORLS just before the turn of the 21st century, so it’s been around a bit and has already seen a switch from one LMS (BLCMP’s Talis) to another (ExLibris’s Aleph). Over the summer of 2017 we’ve been involved in another LMS transition, this time from Aleph to Koha. We already have some experience of tying the basic elements of LORLS into Koha, as the Dublin Business School use both and we helped them to get LORLS working with Koha for catalogue searches and some of the reports. That integration uses Z39.50 and is relatively painless to do: LORLS just needs to be told the new host name, port, database name, username and password (if any) to access the new LMS’s Z39.50 server. For once, standards usage means things do interoperate OK.
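As a rough illustration (not the actual LORLS code), a Z39.50 catalogue search from Perl needs little more than those connection details. The host, port, database name, credentials and ISBN below are placeholders for whatever the target LMS’s Z39.50 server actually uses:

```perl
#!/usr/bin/perl
# Minimal sketch of a Z39.50 ISBN lookup using the ZOOM Perl binding.
# Host, port, database and credentials are placeholders.
use strict;
use warnings;
use ZOOM;

my $conn = ZOOM::Connection->new(
    'koha.example.ac.uk', 210,
    databaseName => 'biblios',
    user         => 'lorls',
    password     => 'secret',
);
$conn->option(preferredRecordSyntax => 'usmarc');

# Search by ISBN (Bib-1 use attribute 7) and show the first hit.
my $rs = $conn->search_pq('@attr 1=7 9780201563177');
printf "Found %d record(s)\n", $rs->size();
print $rs->record(0)->render() if $rs->size();

$conn->destroy();
```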
However, at Loughborough our LORLS configuration also includes some localised code providing extra features that aren’t in the main LORLS distribution, some of which were written after the previous LMS switch from Talis to Aleph. One of these is the “purchase predictor”. This is a back end script that runs nightly and uses reading list data, information on the number of students on modules, the loan history of items in the LMS and a few other bits and bobs to provide library staff with suggestions as to which books should be considered for acquisition (and why).
The purchase predictor needs more access to the LMS than Z39.50 provides. With Aleph, it has been using some homegrown Perl scripts on our Aleph server, nicknamed “AIM”, to extract the required data from the Aleph Oracle database. With the move to Koha, these AIM scripts would need to be rewritten or replaced.
The data we need from the LMS is:
- Budget codes and amounts for departmental spending (so that we know which departments to bill for books, or shares of books, based on module usage, student numbers, etc)
- Details of holdings for a work. In other words, how many copies (items) do we have for a given ISBN?
- Details of outstanding acquisitions/orders that have been placed but not yet processed or fulfilled. This is needed to stop the purchase predictor from repeatedly suggesting the same popular works before earlier orders have arrived and been made shelf ready.
- The loan history of items for a work in a given period. This is used to determine when items for a work are heavily used: if there are always a few loanable copies on the shelf then demand isn’t high enough to warrant buying more.
- The prices paid for books when they were acquired. This is used as part of a process to work out a rough “average” price that an individual book for a work is likely to cost, so that it can be compared to the money remaining in departmental budgets. Actual acquisitions may obviously cost more or less depending on the book, supplier discounts, etc, but previous acquisitions of a work give the purchase predictor code a good starting estimate (especially as book re-sellers have awful or non-existent pricing APIs); a sketch of this estimation follows the list.
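By way of example, the sort of estimate meant in that last point can be as crude as taking the median of the prices previously paid for copies of the work. The figures below are made up, and the real code also has to cope with works that have never been bought before, but it shows the idea:

```perl
#!/usr/bin/perl
# Toy sketch of estimating a likely purchase price for a work from the
# prices paid for previous copies. The price list is invented; in practice
# it would come from the LMS's acquisitions records for that ISBN.
use strict;
use warnings;

sub estimate_price {
    my @prices = sort { $a <=> $b } grep { defined($_) && $_ > 0 } @_;
    return undef unless @prices;    # no purchase history to go on
    my $mid = int(@prices / 2);
    # The median is less skewed by one unusually cheap or expensive copy
    # than a straight mean would be.
    return @prices % 2 ? $prices[$mid]
                       : ($prices[ $mid - 1 ] + $prices[$mid]) / 2;
}

my @paid = (24.99, 27.50, 26.00, 75.00);    # e.g. one pricey hardback
printf "Estimated price: %.2f\n", estimate_price(@paid);    # 26.75
```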
Whilst Koha is an open source LMS with a well documented data structure and the ability to have new modules added easily, our instance will be hosted off site for us by PTFS-Europe. Therefore, where possible, we decided to use Koha’s existing APIs to access the LMS from our purchase predictor code running on servers at Loughborough, rather than adding new, specially written code on their server to replicate the previous AIM scripts.
One option that Koha provides out of the box is the ability for library staff to produce their own reports using SQL SELECTs on the database, and then have these reports made available via a web service API. This reporting mechanism only supports SELECT statements, so it can only be used in a read-only manner, but that’s fine for much of what the purchase predictor needs to do. The results of the reporting scripts can be returned in JavaScript Object Notation (JSON) format, which is ideal for our Perl-based scripts to ingest and process. The Koha reporting mechanism also allows parameters to be passed into the report SQL, so the purchase predictor can influence what each report generates (for example passing the ISBN of the work that it needs a loan history for, and the earliest loan date that we are interested in).
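As a sketch of how that looks from our end, assume a saved, public report along the lines of `SELECT COUNT(*) FROM items JOIN biblioitems USING (biblionumber) WHERE biblioitems.isbn LIKE <<ISBN>>` (table and column names from Koha’s schema, which may differ between versions). The report id, the `/svc/report` URL and the `sql_params` convention below are assumptions to be checked against the Koha version actually in use:

```perl
#!/usr/bin/perl
# Sketch of the purchase predictor calling a saved Koha SQL report and
# reading the JSON result. Host, report id and parameter passing are
# assumptions - check how your Koha exposes saved reports as a web service.
use strict;
use warnings;
use LWP::UserAgent;
use JSON;
use URI;

my $koha_base = 'https://koha.example.ac.uk';    # placeholder host
my $report_id = 42;                              # id of the saved report
my $isbn      = '9780201563177';

my $url = URI->new("$koha_base/cgi-bin/koha/svc/report");
$url->query_form(id => $report_id, sql_params => $isbn);

my $ua   = LWP::UserAgent->new(timeout => 30);
my $resp = $ua->get($url);
die 'Report fetch failed: ' . $resp->status_line unless $resp->is_success;

# The report service hands back the result rows as a JSON array of arrays.
my $rows   = decode_json($resp->decoded_content);
my $copies = $rows->[0][0] // 0;
print "Holdings for $isbn: $copies copies\n";
```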
Most of the required data above can easily be retrieved from Koha using such reports. One thing we do have to be careful about though is the loan history of works. Current and old issues in Koha are held in two potentially very large tables, and it’s very easy to accidentally write SQL that will start table scanning them, as not all the columns required are indexed. SQL run from the Koha reporting system can bring the underlying MySQL server to its knees if the table scans start generating large amounts of memory use, which they might for popular works.
The alternative is to use the Koha statistics table to pull out just the current day’s issues and store them in our own MySQL database on the LORLS host. We already did something similar for the Aleph server, and for much the same reason: using AIM scripts to trawl live through all the issue records could overload its Oracle server. Our MySQL loan history database is added to daily with the previous day’s issues and has indexes created so that subsequent queries for a work’s loan history can easily and quickly look back a month or two.
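A stripped down sketch of that nightly harvest is below. The Koha side is assumed to be another saved report, this time over the statistics table and taking a date parameter; the local table, column and index names are illustrative rather than the real LORLS schema, and in reality each row also needs tying back to the work (e.g. via its ISBN), which is elided here:

```perl
#!/usr/bin/perl
# Sketch of the nightly loan history harvest: pull yesterday's issues out of
# Koha via an assumed saved report over the statistics table, then append
# them to a local, indexed MySQL table so later loan history queries are cheap.
# Hostnames, report id, credentials and table names are all placeholders.
use strict;
use warnings;
use DBI;
use JSON;
use LWP::UserAgent;
use URI;
use POSIX qw(strftime);

my $yesterday = strftime('%Y-%m-%d', localtime(time - 24 * 60 * 60));

# Assumed saved report in Koha, something like:
#   SELECT datetime, itemnumber FROM statistics
#    WHERE type = 'issue' AND DATE(datetime) = <<Date>>
my $url = URI->new('https://koha.example.ac.uk/cgi-bin/koha/svc/report');
$url->query_form(id => 43, sql_params => $yesterday);

my $resp = LWP::UserAgent->new(timeout => 60)->get($url);
die 'Issue report failed: ' . $resp->status_line unless $resp->is_success;
my $rows = decode_json($resp->decoded_content);

# Local loan history table on the LORLS host (names are illustrative).
my $dbh = DBI->connect('dbi:mysql:database=lorls_loans;host=localhost',
                       'lorls', 'secret', { RaiseError => 1 });
$dbh->do(q{
    CREATE TABLE IF NOT EXISTS loan_history (
        issued     DATETIME NOT NULL,
        itemnumber INT      NOT NULL,
        INDEX (issued),
        INDEX (itemnumber)
    )
});

my $sth = $dbh->prepare(
    'INSERT INTO loan_history (issued, itemnumber) VALUES (?, ?)');
$sth->execute($_->[0], $_->[1]) for @$rows;
$dbh->disconnect;
```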
We tested the resulting SQL queries and a slightly modified version of the existing LORLS purchase predictor code on the test/development Koha installation, with some dry run test data migrated from our live Aleph LMS. This appeared to predict sensible sets of purchase suggestions for a variety of departments and modules. We’re now waiting for our Koha LMS to go live. No doubt this will still require a few tweaks in day to day use, but it does appear that the LORLS purchase predictor can be ported between LMSes.