{"id":1727,"date":"2014-01-06T14:52:24","date_gmt":"2014-01-06T14:52:24","guid":{"rendered":"https:\/\/copyright.lboro.ac.uk\/lorls\/?p=1727"},"modified":"2014-01-06T14:52:24","modified_gmt":"2014-01-06T14:52:24","slug":"integrating-lorls-with-koha","status":"publish","type":"post","link":"https:\/\/blog.lboro.ac.uk\/lorls\/koha\/integrating-lorls-with-koha","title":{"rendered":"Integrating LORLS with Koha"},"content":{"rendered":"<p>At Loughborough we use <a href=\"http:\/\/www.exlibris.co.il\/category\/Aleph\" target=\"_blank\">Ex Libris&#8217;s Aleph<\/a> as our Library Management System (LMS). Our LORLS reading list system also makes use of Aleph in a number of ways:<\/p>\n<ol>\n<li>To provide bibliographic information via Z39.50, allowing people to quickly add works to reading lists by simply entering the ISBN,<\/li>\n<li>In batch jobs to check which works the library holds, again using Z39.50,<\/li>\n<li>To find how many items for a work are held and how many are available for loan, which is displayed on a work&#8217;s detail pop-up in CLUMP. \u00a0This has been done using the Ex Libris X Server product (not to be confused with X11 servers!),<\/li>\n<li>To build up lists of current and past loans, used for purchase prediction, high demand reporting and usage tracking. \u00a0This is mostly done using custom local Perl scripts that interrogate Aleph&#8217;s Oracle database directly and then provide an XML API that LORLS code calls.<\/li>\n<\/ol>\n<p>We&#8217;ve recently had another site say they are interested in using LORLS, but they use the <a href=\"http:\/\/www.koha.org\/\" target=\"_blank\">Koha LMS<\/a>. \u00a0They asked us what would be needed to allow LORLS to integrate with Koha. \u00a0Luckily Koha is open source, so we could just download a copy and install it on a new virtual machine to have a play with and see what was needed.<\/p>\n<p>Looking up bibliographic information was pretty simple to support. 
\u00a0Koha has a built-in Z39.50 server, so all we needed to do was tweak the LUMP.pm file on our dev server so that\u00a0Z3950Hostname(),\u00a0Z3950Port() and\u00a0Z3950DBName() point to our new VM, the port that Koha&#8217;s Z39.50 server is running on and the new Z39.50 database name (which appears to default to\u00a0&#8220;biblios&#8221;). \u00a0That seemed to work a treat.<\/p>\n<p>Getting item holdings for works was a bit more involved. \u00a0Koha obviously doesn&#8217;t have Ex Libris&#8217;s X Server, so we needed another way to get similar data. \u00a0Luckily Koha does implement some of the <a href=\"http:\/\/www.diglib.org\/\" target=\"_blank\">Digital Library Federation<\/a> <a href=\"http:\/\/old.diglib.org\/architectures\/ilsdi\/DLF_ILS_Discovery_1.1.pdf\" target=\"_blank\">Integrated Library System &#8211; Discovery Interface<\/a>\u00a0recommendations. \u00a0One of these Application Programming Interfaces (APIs) is called GetRecords() and, given a set of system record identifiers, will return an XML document with just the sort of information we need (e.g. for each item linked to a work we get things like item type, whether it can be loaned, what its due date is if it is on loan, whether it is damaged or lost, etc.).<\/p>\n<p>Unfortunately LORLS doesn&#8217;t know anything about Koha&#8217;s system record identifiers, and the GetRecords() ILS-DI API doesn&#8217;t appear to allow searches based on control numbers such as ISBNs. \u00a0To get the system record numbers we can, however, fall back on Z39.50 again, which Koha uses to implement the ILS-DI SRU type interfaces. \u00a0Searching for a Bib-1 attribute of type 1007 with the value set to be the ISBN gets us some nice USMARC records to parse. 
\u00a0We need to look at the <em>999 $c<\/em> field in the MARC record as this appears to be the Koha system record identifier.<\/p>\n<p>Just to make things interesting, we accidentally discovered that in Koha you can end up with more than one work for the same ISBN (and each work can then have multiple physical items). \u00a0I guess this is a flexibility feature in some way, but it means that we need to make sure that we get all the system record identifiers that match our ISBN from LORLS and then pass all of these to the Koha GetRecords() API. \u00a0Luckily the API call can take a whole set of system record identifiers in one go, so this isn&#8217;t too much of a problem.<\/p>\n<p>One thing we do have to have in the code though is some way of distinguishing between loan categories (long loan, short loan, week loan, reference, etc.). \u00a0In Koha you can create an arbitrary number of item types which can correspond to loan categories, to which you can then assign things like differing loan rules. We slipped in:<\/p>\n<ul>\n<li>BK &#8211; a normal long loan book (the default in Koha, it seems),<\/li>\n<li>REFBOOK &#8211; a book that can&#8217;t be loaned,<\/li>\n<li>SL BOOK &#8211; a short loan book (usually loaned for less than a day &#8211; our &#8220;high demand&#8221; stock),<\/li>\n<li>WL BOOK &#8211; a book that can be loaned for a week (effectively moderately in-demand works).<\/li>\n<\/ul>\n<p>Our code currently has these hard-coded in order to return the same sort of holdings Perl structure that Aleph did. 
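To make that hard-coding concrete, here is a rough Python sketch (the real LORLS code is Perl) of turning a GetRecords()-style response into loan categories. The XML below is a hand-made stand-in and its element names are assumptions for illustration, not Koha&#8217;s actual ILS-DI output.

```python
# Illustrative only: the element names below are placeholders, not
# the real Koha ILS-DI GetRecords() schema.
import xml.etree.ElementTree as ET

# Hard-coded item type -> loan category map, mirroring the list above.
LOAN_CATEGORIES = {
    'BK': 'long loan',
    'REFBOOK': 'reference',
    'SL BOOK': 'short loan',
    'WL BOOK': 'week loan',
}

# Hand-made stand-in for a GetRecords() response covering every
# system record identifier that matched the ISBN.
SAMPLE_RESPONSE = '''<records>
  <record id='42'>
    <item><itemtype>BK</itemtype><onloan>2014-01-20</onloan></item>
    <item><itemtype>SL BOOK</itemtype><onloan></onloan></item>
  </record>
</records>'''

def holdings_from_getrecords(xml_text):
    # Build one (category, due date) entry per physical item, across
    # all the matching records, unknown item types defaulting to the
    # normal long loan category as BK does in Koha.
    holdings = []
    for item in ET.fromstring(xml_text).iter('item'):
        itype = item.findtext('itemtype', default='BK')
        holdings.append({
            'category': LOAN_CATEGORIES.get(itype, 'long loan'),
            'due_date': item.findtext('onloan') or None,
        })
    return holdings
```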
Extra item types assigned in Koha will need to be inserted into this code &#8211; we might have to think of a &#8220;nice&#8221; way of doing this if folk make lots of these changes on a regular basis, but I suspect item types are one of those things that are configured when an LMS is set up and rarely, if ever, tweaked again.<\/p>\n<p>In the StructuralUnit.pm Perl module I created a new method called\u00a0Koha_ILS_DI_Holdings() to implement the new Koha item holdings and availability code. \u00a0The existing Holdings() method was renamed to Aleph_Holdings() and a new Holdings() method implemented that checks a new Holdings() method in LUMP.pm for the name of a holdings retrieval algorithm to use (currently either &#8220;Koha:ILS-DI&#8221;, which selects the new code, or anything else, which defaults back to the old Aleph_Holdings() method). \u00a0This means that if someone else comes along with an XYZ Corp LMS that uses some other whacky way of retrieving holdings availability, we can simply write a new method in StructuralUnit.pm and add another <em>if<\/em> clause to the Holdings() method, allowing a quick LUMP.pm change to select it. \u00a0The advantage of this is that other code in LORLS that uses the Holdings() method from StructuralUnit.pm doesn&#8217;t have to be touched &#8211; it is insulated from the messy details of which implementation is in use (ooh, object-oriented programming at work!).<\/p>\n<p>This appears to work on our test setup, and it means we can just ship a new StructuralUnit.pm and LUMP.pm file to the folk with Koha and see how they get on with it. \u00a0Our next trick will be getting loan history information out of Koha &#8211; this may take a bit more work and it&#8217;s really replacing functionality that we&#8217;ve mostly implemented solely for Loughborough (using our custom scripts, as even Aleph didn&#8217;t provide a usable API for what we needed). 
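Going back to the Holdings() dispatch described above, its shape can be sketched like this (in Python for brevity rather than LORLS&#8217;s Perl; the class and method bodies are simplified stand-ins, not the real implementations):

```python
# A minimal sketch of the dispatch idea: Holdings() picks an
# implementation from a configured algorithm name, so callers are
# insulated from which LMS is actually behind it.
class StructuralUnit:
    def __init__(self, holdings_algorithm):
        # In LORLS this name would come from LUMP.pm; here it is just
        # passed in. 'Koha:ILS-DI' selects the new code and anything
        # else falls back to the Aleph implementation.
        self.holdings_algorithm = holdings_algorithm

    def holdings(self):
        if self.holdings_algorithm == 'Koha:ILS-DI':
            return self.koha_ils_di_holdings()
        # Default: the renamed original implementation.
        return self.aleph_holdings()

    def koha_ils_di_holdings(self):
        # Stand-in for the real Koha ILS-DI GetRecords() lookup.
        return 'holdings via Koha ILS-DI'

    def aleph_holdings(self):
        # Stand-in for the real Ex Libris X Server lookup.
        return 'holdings via Aleph X Server'
```

Adding support for another LMS would then just mean one more method and one more branch in holdings(), with no change to any calling code.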
\u00a0It doesn&#8217;t appear at first glance that Koha&#8217;s ILS-DI APIs cover this use case &#8211; I guess we&#8217;re a bit odd in being interested in work loan histories!<\/p>\n","protected":false},"excerpt":{"rendered":"<p>At Loughborough we use Ex Libris&#8217;s Aleph as our Library Management System (LMS). Our LORLS reading list system also makes use of Aleph in a number of ways: To provide bibliographic information via Z39.50, allowing people to quickly add works to reading lists by simply entering the ISBN, In batch jobs to check which works [&hellip;]<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":"","_links_to":"","_links_to_target":""},"categories":[2,3,4,5],"tags":[],"class_list":["post-1727","post","type-post","status-publish","format-standard","hentry","category-koha","category-lorls","category-lump","category-other-systems","count-0","even alt","author-cojpk","last"],"_links":{"self":[{"href":"https:\/\/blog.lboro.ac.uk\/lorls\/wp-json\/wp\/v2\/posts\/1727","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blog.lboro.ac.uk\/lorls\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blog.lboro.ac.uk\/lorls\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blog.lboro.ac.uk\/lorls\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"https:\/\/blog.lboro.ac.uk\/lorls\/wp-json\/wp\/v2\/comments?post=1727"}],"version-history":[{"count":0,"href":"https:\/\/blog.lboro.ac.uk\/lorls\/wp-json\/wp\/v2\/posts\/1727\/revisions"}],"wp:attachment":[{"href":"https:\/\/blog.lboro.ac.uk\/lorls\/wp-json\/wp\/v2\/media?parent=1727"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blog.lboro.ac.uk\/lorls\/wp-json\/wp\/v2\/categories?post=1727"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blog.lboro.ac.uk\/lorls\/wp-json\/wp\/v2\/tags?post=1727
"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}