NCIP Implementers Group
Minutes - In Person Meeting
September 22-24, 2009
Present:
Sue Boettcher - 3M
Susan Campbell - College Center for Library Automation (CCLA)
Rob Walsh - EnvisionWare (NCIP Maintenance Agency)
Mike Dicus - Ex Libris
Lynne Branche Brown - Innovative Interfaces (Tuesday, Wednesday only)
Karen Wetzel - NISO
John Bodfish - OCLC
Rob Corcuera - OCLC (Tuesday, Wednesday only)
Tony OʼBrien - OCLC (Thursday only)
Dan Iddings - PALCI (Tuesday, Wednesday only)
Rob Gray - Polaris
Gail Wanner - SirsiDynix (Chair; Wednesday, Thursday only)
DJ Miller - TLC
Minutes prepared and submitted by:
Rob Walsh, EnvisionWare / NCIP Maintenance Agency
Tuesday, September 22
Implementation Status Updates
Walsh, in Wannerʼs absence, opened the meeting and asked each attendee to provide 
a brief implementation status update.
* Polaris
  * doing NCIP with 3M using a self-service model
  * has tight integration with SirsiDynix URSA that has progressed over the past year
  * looking to do more integration with Ex Libris and III in an ILL environment
  * has some unique customers who want NCIP due to the security provided through HTTPS
  * using NCIP with CybraryN and Overdrive
  * all implementations are responders, and all are based on NCIP Version 1
  * has some prototype code written for NCIP Version 2
* TLC
  * have recently begun revisiting some older NCIP code
  * testing with Auto-Graphics for a possible release in a few weeks
  * have done some work with SirsiDynix for ILL
  * implementation is a responder and is based on Version 1
* OCLC
  * has no concrete plans for Version 2
  * has a number of products that act as initiators in an ILL context
  * working with Horizon, Unicorn, Symphony, and Aleph in production environments
  * working with Voyager (possibly in production)
  * planning to work with Polaris and TLC
* 3M
  * implemented Version 1 some time ago and have made no changes
  * exchanging with Polaris and have some European implementations
  * waiting for other people to support NCIP
  * have no plans to implement Version 2 until more vendors support it
* Ex Libris
  * Aleph and Voyager have responders
  * both Aleph and Voyager are working with URSA and exchanging some messages with Relais
  * all implementations are Version 1
  * do not yet perceive a market need for Version 2
* PALCI
  * using NCIP Version 1 at two or three Voyager sites and 7 or 8 SirsiDynix sites (with Unicorn and Symphony)
  * migrating away from URSA and replacing with Relais
  * working also with TLC
  * has required all sites to be NCIP compliant as part of the URSA-to-Relais migration
  * has encouraged III to develop an NCIP responder
Bodfish noted that OCLC has many consortia, and getting NCIP adopted across a 
consortia with multiple vendors can be quite challenging. He asked Iddings what 
compelling arguments convinced his vendors to implement. Iddings said that security 
was a big part of the decision. Telnet screen scraping is both insecure and difficult to 
implement. Some campuses have shut down Telnet access on the local networks. He 
said that, originally, the prices quoted for NCIP implementations were very expensive. 
Everyone, though, has made it affordable. What has been surprising, he said, is that 
each vendor seems to have a unique NCIP version for talking with Relais that is 
different from what is used to talk to SirsiDynix. Bodfish explained that this is due in 
part to the huge variety and the many options that exist in ILSs. Iddings asked how he 
can be confident that the new implementations will work. Bodfish replied that it requires 
testing. However, he added, we can attempt to fix the core profiles and that will bring us 
closer to having an expectation that two vendors can seamlessly interoperate. Gray 
said that expectations around how reserves are handled can create confusion. “When 
the reserve is placed or a book comes in that we are not expecting, are we to 
automatically create a record, or will the ILL system tell us to create the reserve? We 
have to implement different business rules with our NCIP implementations to integrate 
with different vendors.” Bodfish said that we need to surface these issues as we work 
through the core profiles. Iddings added that there are differences not just within the 
various systems, but with library policies as well. Bodfish concluded that there needs to 
be some common understanding about what happens within the various systems that 
goes beyond the messages themselves.
Wetzel stressed the importance of working to address Version 2. “We cannot continue 
to maintain Version 1,” she said. Bodfish asked her to clarify whether she meant that 
Version 1 should no longer be used or whether the efforts of the Implementers Group 
should be focused primarily on Version 2. All existing implementations are Version 1, 
and they will continue to exist for some time. Wetzel explained that NISO feels that new 
efforts should be focused on Version 2, and implementers should be encouraged to 
develop support for Version 2. Iddings added that PALCI expects all of its 
implementations to be at Version 2 when the migration to Relais is complete. Brown 
said that the NCIP responder that Innovative Interfaces is working with PALCI to 
develop will be based on Version 2.
New Website
Walsh presented a new NCIP website that represents efforts to migrate existing 
information from the current website and to better integrate information that is housed at 
various NISO resources. The new website may be found at http://www.ncip.info. The 
existing website may be accessed via http://ncip.envisionware.com, and the new site 
has several links back to the older site. Those links will remain until all the information 
from the older site is migrated.
The new site is based on Drupal, and it will allow the information to be more easily 
updated. Further, content in the new site is directly linkable. In the older site, only the 
main page was accessible directly.
Bodfish asked whether the content of the new site was arranged such that, when 
EnvisionWare vacates the position of Maintenance Agency, the site could be easily 
moved and re-hosted. Walsh explained that, in theory, that would be possible since it 
would require moving the Drupal database to another Drupal installation.
Core Profiles
After a brief review of the Agenda, the group began a discussion of Core Profiles. 
Bodfish asked “What are they and what are they meant to accomplish?” Walsh said 
that one of the goals was to build a high level set of documents focused on tasks rather 
than specific details. Bodfish emphasized that there will remain, though, a need for the 
low level details that enable one system to talk with another.
The group divided into two smaller groups, one focused on brokered DCB and another 
focused on self-service. The goal for each group was to identify the core tasks and 
functions that occur in each workflow in order to determine when specific 
implementations begin to diverge.
The small groups reunited and reviewed the work each had done. Bodfish summarized 
the discussions from the DCB working group. They discovered that there are 6 basic 
steps: authenticate, request, fill, return, and complete. [Editorʼs note: clearly I missed a step since I have only 5.] However, even at the highest level, there is much variation. 
Campbell asked whether the discussions really were at the highest level, or were they 
focused at a level with too much detail. Iddings said that at PALCI, there is much 
variability in library policy and process, yet they still manage to get it done with NCIP. 
Bodfish suggested that, with more time, he feels we could come up with a common 
understanding. “However, it would be full of variables and options.” Walsh asked 
Campbell whether, as a customer, a profile with many options would be useful. 
Campbell said that would put us back where we are now. She asked why it isnʼt 
possible to define a Core Profile around the 9 core messages. Bodfish said that, even 
with those messages, there are options. How we identify them varies from system to 
system, for example. Even when weʼve agreed on the message, there are so many 
choices, and both partners in an exchange need to implement them the same way. 
Using Accept Item as an example, who is supposed to notify the patron? That isnʼt 
clear in the protocol. Who creates the item record? Who puts the item on hold? These 
all require some interpretation of the standard, and we have to work with the other 
vendor to ensure that we agree on the interpretation. Gray added that it is up to the 
responder to make these decisions.
Campbell asked why vendors quote high prices when they need to expand an 
implementation to interoperate with another vendor with whom they have not worked in 
the past. Brown said that the problem is often a matter of not having a common 
language. “Each new partner,” she said, “may require a slightly different language.” 
Walsh said that he thinks of these as dialects. NCIP is the common language, but each 
implementation may use a different dialect. Bodfish added that these dialects may be 
as different as old and modern English. “This has created a difficult situation for 
customers because we canʼt specify in an RFP enough detail to assure that two 
arbitrary implementations will interoperate,” he continued. Iddings suggested that there 
should be a goal to improve the process so that implementations are “plug and play.” 
He reminded the group that it took a long time to get to a point where MARC records 
were a commodity. Gray said that Z39.50 is the same way. Campbell asked whether 
one of the goals of Version 2 was improved interoperability, possibly through the core 
message set. Bodfish said he didnʼt think that was a goal of Version 2. “It is a goal of 
the Implementers Group, and, while the core message set and Version 2 occurred at 
about the same time, they were not aimed at the same goals.”
Campbell suggested that the question “Can you do this task?” may need to change to 
“With whom can you do this task?” Gray said that, eventually, ILL implementations will 
go farther than those today. Some day, it should be possible for both parties to be 
initiators and responders so that the complete transaction can occur seamlessly and 
automatically between the two systems. Bodfish said that there may be a higher level 
with questions like “Do you do ʻtraditionalʼ ILL where one side makes a request of 
another?” Brown said that an earlier effort by Wanner, Jackson, Campbell, and herself 
came close to achieving what Campbell is requesting. (This document is titled 
NCIP_RS_CoreTasks_6028-2009_LBB.DOC, and it is available at http://www.niso.org/
apps/group_public/document.php?document_id=2367&wg_abbrev=z3983maint.) 
Campbell agreed and asked why so much variability exists within the standard. “If 
minor changes are required in order to make two systems compatible,” she continued, 
“then those changes should be modest and reasonable.”
Walsh summarized the efforts of the self-service working group. He explained that, in 
many ways, self-service is simpler than resource sharing. As a result, there may be 
much less variability in self-service workflows. However, some self-service 
implementations may need only a single message, where others may use sequences of 
messages to complete a task. For example, self-service implementations that need 
only to determine whether a user is valid need only the Lookup User message. 
Automated materials handling systems (sorters) may need only Checkin Item. Self-service circulation kiosks, though, may need Checkout Item, Checkin Item, Renew Item, Lookup User, and Lookup Item. Regardless, of the 9 core messages, it seems that only 5 are truly useful for self-service. Further, there are other common self-service 
functions, like paying fines (either as part of or independently from a circulation 
transaction), creating or updating patron records, and making deposits to library-hosted 
monetary accounts, that require messages outside the core set. Collectively, these 
issues complicate the question of what it means to support the core message set.
Change / Review Process
Walsh explained that in a recent ballot, the NCIP IG had approved a move from periodic 
to continuous review. ANSI still needs to officially approve the change, but Wetzel 
indicated that there should be no reason why it would not be approved. Walsh 
summarized the process that had been defined in the balloted documents. The group 
will be able to review defects and enhancement requests twice each year. These 
change requests need to be submitted in writing prior to a spring and a fall meeting. 
Ideally, new items will be raised during each conference call so that members are 
familiar with them prior to the meetings when they will need to be formally reviewed. 
The group agreed that we should have a form on the website that people may complete 
in order to submit a change request. When each item is reviewed, the group may 
decide to accept it without change, accept it with modification, accept it for further study, 
or reject it. (More information about the change from periodic to continuous review, 
along with a description of the process, is available at http://www.niso.org/apps/org/
workgroup/z3983maint/download.php/2337/ContinMaintRequest_z39_83.pdf and http://
www.niso.org/apps/org/workgroup/z3983maint/download.php/2336/
ContinMaint_Procedures_z39_83.pdf. These documents are available only to members 
of the NCIP IG at this time. This information will be made available publicly once the 
change is formally approved by ANSI.)
As part of this discussion, the group identified the following criteria that will assist during 
the review of each change request:
* Overall value and usefulness of the requested change
* Compatibility with current version
* Changes that break backward compatibility are reserved for major version 
changes (Version 2 to Version 3, for example)
* Scope and relevance of the requested change
* Specific reasons, justifications, and use cases for the requested change
* Negative impacts associated with the requested change
* General applicability of the requested change
* Any field or trial usage of the requested change (through an extension, for example)
* A draft of changes necessary to the schema in order to support the requested change
Further, the group agreed that the versioning format for future revisions should be X.YY 
where X represents the major version and YY represents the revision or minor version. 
For example, the first version of the standard is 1.00. The first revision was 1.01. The 
current version is 2.00, and the next revision will be 2.01. The group agreed also that 
the “.00” is optional when referring to the initial major version. For example, Version 1 
and Version 1.00 represent the same version. Finally, as was noted above, changes 
that break backward compatibility must be introduced in a new major version. 
Otherwise, it may be assumed that, within a single major version, various revisions are 
compatible. This means that new data elements introduced in a minor revision may not be mandatory since earlier versions would not know of their existence. Obviously, 
functionality that depends on new data elements would not be available in a system 
using an earlier revision, but the two systems should be able to interoperate with 
respect to existing functionality.
NCIP Test Bed and Presentations with Live Demonstrations
Gray suggested that it should be possible to create a simple web page that allows 
NCIP requests to be pasted as text into a field and sent to a responder. The 
responderʼs response could be displayed as text in another field. This would allow an 
implementer to manually verify that the message was actually exchanged and that the 
response was formatted as expected. Bodfish said that this would be a good start, but 
he was hoping that we could do more. The ISO-ILL “bake-off” is something that 
continues to come up around this topic. Something where the customers present the 
various systems (rather than having vendors demonstrating) would be even more 
powerful. Campbell said that customers who are considering purchasing, though, 
probably want to deal with the vendors rather than other customers. 
Walsh asked whether anyone has any concerns over this being a one-time event rather 
than something that would be continuously available. Bodfish said that IndexData has a 
way to execute a variety of Z39.50 requests and verify that they were executed properly.
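
[Editorʼs note: the sketch below is a minimal illustration of the manual check Gray described, posting a pasted NCIP XML request to a responder and printing the raw reply. The endpoint URL and message body are placeholders, not references to any existing tool or service.]

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Minimal sketch of the "paste a request, see the response" idea discussed above.
// The responder URL and the NCIP XML body are placeholders; a real responder's
// address, transport profile, and exact message content would come from that implementer.
public class NcipEchoTest {
    public static void main(String[] args) throws Exception {
        String responderUrl = "https://responder.example.org/ncip";            // hypothetical endpoint
        String ncipRequest = args.length > 0 ? args[0]
                : "<NCIPMessage>...</NCIPMessage>";                             // pasted XML

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create(responderUrl))
                .header("Content-Type", "application/xml")
                .POST(HttpRequest.BodyPublishers.ofString(ncipRequest))
                .build();

        // Show the raw response so an implementer can confirm the exchange actually
        // happened and that the reply is formatted as expected.
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("HTTP status: " + response.statusCode());
        System.out.println(response.body());
    }
}
```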
Wetzel suggested that we consider organizing an event at next yearʼs ALA (annual) to 
demonstrate the functionality available through the core message set. Campbell 
suggested that such an event would be a good way to follow up this yearʼs LITA 
presentation at next yearʼs LITA conference. Iddings noted, though, that ALA may be 
better because it has a broader audience. However, Campbell said, LITA has fewer 
events that compete for peopleʼs time. Wetzel suggested that we might do some kind of 
general promotion at ALA and follow it up with live demos at LITA.
Gray indicated that the protocol is not something that can be demonstrated effectively. 
“Thereʼs nothing more boring than watching a demo intended to illustrate the use of a protocol,” he said. Iddings added that the issue isnʼt that people donʼt believe that NCIP works. “It is more about what it means to be ʻcompatibleʼ or ʻcompliant,ʼ” he said. 
“A demo is simply proof that you can interoperate.”
The group agreed to revisit the idea of events at next yearʼs ALA and LITA conferences 
during a future conference call.
PALCIʼs Move to NCIP
Campbell asked Iddings what the “ah-ha” moment was for PALCI members who were 
reluctant to move to NCIP. Iddings said that the move from URSA to Relais and a 
desire to move away from Telnet screen scraping were the primary motivators. 
Campbell asked when the staff realized that doing things with NCIP was better than 
doing them without. Iddings said that they were able to do the same tasks before NCIP, 
but the implementations based on Telnet were more fragile.
Bodfish noted that there may be three different issues related to demonstrating interoperability that need to be addressed separately:
* Not enough implementations to demonstrate interoperability
* Not enough push from libraries to force implementations
* Vendors not doing enough; high barriers to entry (possibly partially addressed with 
core messages)
Brown said that this discussion often rambles because we donʼt do enough when we 
leave the meetings. “Much of the real work needs to be done back in the offices.” 
Wetzel suggested that scheduling live promotions might encourage people to follow 
through. Iddings added that the LAMA/RUSA STARS forum at this summerʼs ALA had 
an impact on many of the directors from his member institutions. “They came back 
excited about what they heard discussed in that session.”
NCIP Compliance
Walsh asked what it means to be “compliant.” Wetzel said that, in the standard, 
compliance is the ability to exchange at least one message with at least one partner. 
Brown asked whether it is important in the marketplace to be able to assert compliance. 
Bodfish said that it is a basic question that enables us to have another conversation. 
Campbell said that, from the standpoint of an RFP, there should be a series of 
questions: 
* “Do you do NCIP?”
* “With whom do you do NCIP?”
* “Is your implementation available now?”
* “If I want something that you canʼt do now, what will it cost me to get it?”
She added that compliance as it is currently defined is not really relevant because it 
doesnʼt answer those questions.
Bodfish said that NCIP may not be necessary in order to accomplish a task. If the 
question is “Can I do resource sharing between system A and system B?” then I might 
not need NCIP. However, if the question requires that I do NCIP, then I must answer 
differently. Walsh asked why it matters whether NCIP is involved if the overall outcome 
can be achieved. Bodfish said that there might be an expectation that, if NCIP is used 
today to perform the task, then future functionality through NCIP should be available at 
a lower cost. Wetzel suggested that we consider the definition of “compliance” in a 
future revision of the standard. Walsh said that we would need either a better, more 
commonly accepted definition or we need to educate the community to stop asking for 
“NCIP compliance” in RFPs. Bodfish said we should focus on the latter. “We need to 
educate people to be specific about their needs. You need to specify what system you 
have (and not ʻOCLCʼ but ʻWorldCat,ʼ for example), with whom you wish to exchange, 
and what particular task or workflow you want to perform.” 
Boettcher said that, in the end, this doesnʼt guarantee compatibility and interoperability 
because of the various options that are inherent in the protocol itself. Dicus asked 
whether there is a notion of “universal compliance.” Bodfish said that, with infinite time 
and resources, yes. Otherwise, it seems optimistic. Campbell asked whether “Are you 
compliant?” is a relevant question in an RFP. Dicus said that he feels it is relevant but 
not sufficient. “You need to know a lot more information before you know whether any 
two systems are interoperable,” he continued. “I donʼt imagine a point in the future 
where a simple ʻYes/Noʼ to ʻAre you compliant?ʼ is enough.” Bodfish said that the RFP 
needs to ask with what systems can you exchange, through what messages, and using 
what versions. Boettcher suggested that we should target educational efforts at 
consultants who provide assistance to libraries during the RFP processes.
Bodfish and Gray volunteered to review and revise the existing RFP guidelines with an 
intent to target consultants and to provide suggestions for how to ask questions that will 
provide useful and meaningful responses. Walsh suggested that Koppel has often 
expressed an interest in working on this document and that he should have the 
opportunity to participate in this effort.
Campbell and Brown volunteered to create a chart for implementers to use to indicate 
which core messages they support, with whom they are exchanging each message, 
what versions are supported, and what role is played. Jackson was included in this 
effort since she has worked with Campbell and Brown on similar efforts in the past.
Recall Item
The group discussed whether Recall Item belongs as part of the core message set. 
When the core message set was defined in the April 2009 meeting, 8 of the core 
messages were identified by the number of implementations that used them. However, 
Recall Item was not widely implemented, and no one in the group could remember what 
justification was provided for including it in the core. Many in the group felt that Recall 
Item is not as generally useful as the other 8 messages, and some have 
implementations that include all the messages in the core except for Recall Item. 
The group reviewed other messages in the core set in an attempt to determine if any 
presented the same challenges as Recall Item. Gray indicated that Accept Item can 
sometimes be tricky because it may either represent simple item information or a more 
complex request that requires several steps to complete. Iddings asked whether 
Renew Item supports the ability to have a patron make the renew request in the system 
from which the patron picked up the material even though that may not be the system 
that owns the item, particularly in the context of ILL. The consensus of the group was 
that systems can and do support this functionality with Renew Item.
Wednesday, September 23
Review Position on Core Profiles
Walsh asked the group to review the discussion on Core Profiles and decide whether 
more time and effort should be invested in them. Brown said that she wasnʼt sure they 
will help much. “They may not get us what we hope to achieve,” she said. “We want a 
consensus on the details of message exchange, but thereʼs too much variation in those 
details to make a profile useful.” Wetzel asked how profiles are being used now. 
Bodfish said that there are no Core Profiles today. “We use Application Profiles to 
describe how our applications use NCIP.” Brown added that to try to generalize the 
Application Profiles seems like a daunting task since we continue to run into variations 
very early in those discussions. 
Bodfish suggested that the goal of a Core Profile should be to help new implementers 
discover what they should do. However, he continued, it would not be one profile for all 
ways to do resource sharing. Instead, it might be a set of practices that describe in 
general terms how to perform high level tasks: this is how to do a hold, here are ways to 
do something else. Maybe instead of Core Profiles, a set of recommended practices 
might be a better goal. Campbell said that Application Profiles seem to address the 
needs of the developers. “A recommended practices document,” she said, “would be at 
a higher level and aimed at those who may not understand the nuts and the bolts.” 
Bodfish agreed that they would be more like educational tools. Brown suggested, then, 
that they would need to be called something different. Campbell compared this new 
type of document to the “Roadmap to NCIP.” Bodfish said that, different from the 
Roadmap, though, this document should address specific tasks like resource sharing 
and self-service circulation and define how they are done in NCIP.
Wetzel suggested that we might need one document to address the needs of the 
customer and another to address the needs of new implementers. Campbell said that 
the document for customers should be written such that a library administrator will 
understand it. “This might be a good way to explain how the core message set came 
into being. Why those messages and how are they used to deliver 80% of the 
functionality necessary,” she said. Bodfish asked what such a document (or pair of 
documents) will do for the NCIP standard. “Will it bring more customers, attract more 
implementers, or allow customers to be more effective in selecting solutions that work to 
address their needs?” he said.
Campbell noted that some of the existing documentation (and NCIP itself) sometimes 
seems too opaque. “People donʼt know why they want or need it. They are told they 
need it because it exists,” she said. “Something that explains what NCIP is and how it can 
be used would help to educate. I go back to the Roadmap - maybe that isnʼt the right 
title, but something with similar content that explains what, why, and how could be 
useful.”
Bodfish suggested that the appropriate title might be “Getting Started with NCIP.” He 
indicated that it would explain how resource sharing and self-service work and where to 
go to get more information and assistance. Iddings said that the document should 
include the list of core messages since that helps to show why it isnʼt necessary to 
implement 45 message pairs in order to make NCIP effective. Bodfish said that we will 
need to be careful that the document does not suggest, for example, that there is only 
one way to build a resource sharing application. “Using a brokered application to initiate 
recall item, for example, doesnʼt make sense,” he continued. “The patron is not going to 
use the interface of the broker to recall an item.” Walsh asked whether this might be a 
“recipes” style of document. Bodfish said that those documents often fail to provide any 
narrative or cohesive story that explains what they are trying to accomplish. “This 
needs to be something that gives a customer enough information to have an intelligent 
and effective conversation with a vendor,” he said.
Bodfish volunteered to draft a very early, very thin document that demonstrates the type 
of content and style he feels is appropriate for this effort. He plans to focus on resource 
sharing. Walsh agreed to take Bodfishʼs draft and do the same for self-service. The 
goal is to be able to publish these documents at next yearʼs ALA Midwinter conference. 
The group further defined a goal statement and a set of objectives and other information 
to guide these efforts:
Goal: To prepare a document that gives a customer enough information to select a 
product that delivers on his or her goals and objectives for a specific functional area 
(resource sharing, self-service, etc.)
Objectives:
* Provide an overview of resource sharing (or other functional area)
* Describe how to think about NCIP
* Describe how the protocol helps to achieve the goals
* Identify where variations may complicate the issues
* Identify what other sorts of complexity might arise
* Outline the importance of communicating and discussing issues in detail with 
implementers
* Explain the significance of NCIP roles
* Identify some of the things that NCIP does not do so that customers will have 
reasonable expectations about what manual or external processes may still be 
required
* Encourage customers to review the RFP Guidelines
Target Audience:
* Primary - customers wanting to select a system that uses NCIP to accomplish a 
desired task
* This includes consultants who aid in this process
* Secondary - new implementers wanting to develop NCIP systems
Publication:
* To consultants who help libraries select products
* To product managers at implementers who develop NCIP systems
* To libraries who might be thinking about selecting a product that uses NCIP
Theory of Use:
* Customer reviews the document
* Customer goes to the NCIP website to determine what support exists from current 
vendors (using an “Implementer Discovery” chart)
* Customer reads the RFP Guidelines and prepares the RFP
* Customer sends the RFP to potential vendors
* Customer reviews Application Profiles for specific vendors and functions desired in the 
system being considered
* Customer discusses details with implementers
Expanding the NCIP Implementers Group
Iddings asked who else should be part of this group. He suggested possibly reaching 
out to open source vendors, eXtensible Catalog (xC) project members, LibraryThing, 
Overdrive, IndexData, and CybraryN. Bodfish asked about the strategies we should use to 
attract these other vendors. “With some,” he said, “weʼve already tried and weʼve not 
been successful. Are we trying wrong, or are they really not interested in being part of 
this group?” Boettcher said that some may perceive this to be an open and 
implementable standard that does not require any level of involvement or participation 
with the body who maintains it. One does not, for example, join the W3C in order to 
send HTML. Wetzel said that there is a difference between joining the group and 
interacting with the group. “Perhaps,” she said, “we should encourage a level of 
interaction short of full membership.” Bodfish suggested that, in that context, we may 
have already been successful with people like Overdrive and IndexData. They are 
getting the information they need to do what they want. Maybe we should encourage 
vendors to designate a contact person. That would help us keep them informed. In 
addition, we should encourage them to let us know of their implementations and their 
successes.
Implementer Discovery and Status Reports
Gray asked how implementers keep their status information up to date. Walsh said that 
implementers should send any updates to him, and he will make the necessary changes 
to the appropriate section of the website. He added, though, that this does not make it 
easy for us to learn about people who are implementing but who are not part of this 
group. Wetzel suggested that we create an on-line form that anyone could complete to 
report on their implementations. Iddings suggested that a Facebook presence might 
attract more people. Wetzel said that we need something aimed at implementers that 
describes what we would like for them to do to help keep us informed of their efforts. 
Bodfish added that we should reach out to those whom we even think have an 
implementation and make it easy for them to provide us with helpful information.
Core Message Implementations
Brown passed around a form for each implementer to complete to indicate which core 
messages are supported. She explained that this is a start of what she and Campbell 
are working on for identifying what support each implementer has in various products. 
The group spent a few minutes completing the forms and returned them to Brown.
Enhancement Requests
Bodfish indicated that OCLC would like to be able to send multiple item ids as 
alternative ways to identify an item. The group reviewed comments that OCLC included 
with their vote on Version 2.
Request Item does not allow a combination of Bibliographic Id and Item Id, and it 
does not allow either to be repeated.
Walsh said that, in the past, we have identified a potential for ambiguity where the 
message could be construed to mean either ANY of the item ids or ALL of them. 
Bodfish suggested that there might be a need for a flag that designates which rule we 
want to the responder to apply. Gray said that these changes might have a large 
potential impact since many messages might ultimately have this ANY OF capability. 
Bodfish said that OCLCʼs request is specific to Request Item. “In other cases (like 
Accept Item), the item has already been uniquely identified and a single, known id is 
available,” he said. “I think we need to find a community of users who are interested in 
this kind of request and see if the details can be fleshed out.” Walsh indicated that 
Wanner has expressed an interest in similar functionality in the past.
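
[Editorʼs note: the following sketch is purely illustrative of the ANY-of/ALL-of ambiguity discussed above. NCIP defines no such match-rule flag today, and the Catalog and ItemRecord types are hypothetical stand-ins for a responderʼs internal item lookup.]

```java
import java.util.List;
import java.util.Optional;

// Illustrative only: given several item ids in one request, should a responder fill
// the request if ANY id resolves, or only if ALL of them resolve to the same item?
public class RequestItemResolver {

    enum MatchRule { ANY_OF, ALL_OF }

    interface ItemRecord { String id(); }

    interface Catalog { Optional<ItemRecord> findByItemId(String itemId); }

    static Optional<ItemRecord> resolve(List<String> itemIds, MatchRule rule, Catalog catalog) {
        List<Optional<ItemRecord>> lookups = itemIds.stream().map(catalog::findByItemId).toList();

        if (rule == MatchRule.ANY_OF) {
            // Treat the ids as alternates: the first one that resolves wins.
            return lookups.stream().flatMap(Optional::stream).findFirst();
        }

        // ALL_OF: every id must resolve, and all must name the same item.
        if (lookups.isEmpty() || lookups.stream().anyMatch(Optional::isEmpty)) {
            return Optional.empty();
        }
        long distinctItems = lookups.stream().map(o -> o.get().id()).distinct().count();
        return distinctItems == 1 ? lookups.get(0) : Optional.empty();
    }
}
```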
Bodfish introduced a second request, also from OCLCʼs vote on Version 2.
Bibliographic Record Id is not repeatable in Bibliographic Description of Accept 
Item. Why canʼt an initiator send an OCLC number and an LCCN if it has both 
for the record?
Bodfish explained that OCLC wants to share all the information that it has about an 
item, possibly as an aid to collection development efforts. He said that there is no 
expectation that the information will become part of any public records or indices.
Bodfish agreed to have off-line conversations with interested persons, to draft an 
enhancement request, and to submit it in the future for consideration by the group.
Changes that were Dropped as part of the Version 2 Efforts
Bodfish reminded the group that there were changes that we wanted to incorporate in 
Version 2, but, due to constraints imposed during the final editing phase, we were not 
able to do so. Walsh agreed to review various notes from that period and compile a list 
of items that were dropped.
Review Core Workflows
Wanner said that she was expecting the Core Workflows to be an educational piece that 
allows customers to outline what they want to accomplish. Further, she said she thinks 
those documents should be something implementers could use to identify what 
messages (and what data elements) they support. “As a vendor,” she said, “if I knew which other implementers support which messages and services, then I would know how I can interoperate with them without having to read lengthy vendor profiles.” Walsh said that 
sounded similar to the chart Campbell and Brown agreed to create. Bodfish suggested 
that we could add some data to the chart to identify the specific data elements that are 
supported. Campbell said that the existing Application Profiles might be more 
maintainable if the sections that describe the messages and data elements used could 
be linked to the chart in some sort of data driven fashion. Wetzel asked if it would be 
practical to make the entire profile into a dynamic electronic document rather than a 
static PDF. Bodfish said that there are several sections, though, that require lengthy 
text elements, and those may not lend themselves to electronic delivery. Most in the 
group agreed that pushing to get the data-specific portions into an on-line and electronic 
form is reasonable. Trying to put the text section in the same form, though, may be 
difficult.
Wanner suggested that we need some kind of form that vendors could fill in to indicate 
which messages and data elements are supported. “Many programmers,” she said, 
“tend to ignore the more verbose text sections and focus solely on the messages, data 
elements, and state tables.” Gray said that we need to keep Version 1 and Version 2 
information separate with respect to support for various messages. “It needs to be 
clear,” he said, “whether a responder is supporting only one or both versions.” Walsh 
asked if the form Brown and Campbell are designing could be expanded to include this 
information. Bodfish said that, given the depth of the schema, trying to create a form 
that replicates that depth may be challenging. Wanner suggested that we start with the 
messages and the top-level data elements. Gray agreed that, if we take care to identify 
the “gotchas,” then such an approach seems reasonable. Wanner said that it is often 
easier to steer programmers to something that is complete rather than pointing them at 
separate, incomplete documents. She volunteered to extend the form Brown and 
Campbell are preparing and provide something that takes the information to the next 
level of detail. Gray agreed to assist by addressing the responder side of resource 
sharing, and Boettcher agreed to undertake a similar effort for self-service. Bodfish 
recommended that Wanner, Gray, and Boettcher make note of areas where they see 
potential implementation challenges as they compile the information.
LITA Presentation Review
Wanner, Campbell, and Brown provided a preview of the presentation scheduled to be 
given at this yearʼs LITA conference (October 2, 2009). The group provided feedback 
and suggested changes.
Review Plans for Next ALA and LITA
Walsh provided a brief summary of the discussion around the NCIP test bed and the 
idea about having a live demonstration or “bake-off” style session. He explained that, 
because of various technical challenges with an always-available test bed, the group 
had focused on an idea to have an educational session at ALA, followed by a “bake-off” 
style demonstration at LITA. Wanner expressed a concern that many in the ILL 
community do not attend LITA. Walsh suggested that we could plan the sessions so 
that the ALA presentation could be aimed at those interested in ILL with the hopes that 
they would encourage the more technical people in their organizations to attend the 
demonstration at LITA. Wanner suggested that we should consider holding an event at 
a future resource sharing forum.
The group then revisited the idea of a test bed. Ultimately, the group reached the 
conclusion that, in many ways, an accessible test bed is as much or more work than the 
current approach we use to facilitate implementer-to-implementer testing.
Demonstration of Prototype for Capturing Implementer Message Exchange
Gray demonstrated a sample XML format for capturing, analyzing, and reporting the 
structure of NCIP messages exchanged between implementers. The goal of such a 
system would be to provide a foundation or database of information that might be used 
with the implementer registry. The group identified significant potential in this sort of 
approach, particularly as a fairly easy way to document sample messages that are 
exchanged under certain circumstances. The group discussed some of the challenges 
and caveats. Wanner indicated that, while this seems like a very useful approach, there 
may still be a need to wade through the various samples and locate similarities and 
differences between the same messages exchanged between different partners. 
Bodfish said that he has a stylesheet that will go through XML messages and generate 
a spreadsheet to analyze sets of messages against which it is run. It could be used to 
generate a display that shows which products send which messages with what data 
elements.
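
[Editorʼs note: a rough sketch of the sort of analysis Gray and Bodfish described, included for illustration only. It tallies element names across a folder of captured NCIP XML samples; the directory layout and file naming are assumptions, and a real tool would also group results by product and by message type.]

```java
import java.nio.file.*;
import java.util.*;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.*;

// Walk a folder of captured NCIP XML samples and count how often each element
// name appears, as a first step toward "which systems send what data elements."
public class MessageElementTally {
    public static void main(String[] args) throws Exception {
        Path sampleDir = Path.of(args.length > 0 ? args[0] : "samples");  // folder of *.xml captures
        Map<String, Integer> elementCounts = new TreeMap<>();

        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        factory.setNamespaceAware(true);

        try (DirectoryStream<Path> files = Files.newDirectoryStream(sampleDir, "*.xml")) {
            for (Path file : files) {
                Document doc = factory.newDocumentBuilder().parse(file.toFile());
                tally(doc.getDocumentElement(), elementCounts);
            }
        }

        elementCounts.forEach((name, count) -> System.out.println(name + "\t" + count));
    }

    // Recursively count every element name in a sample message.
    private static void tally(Element element, Map<String, Integer> counts) {
        String name = element.getLocalName() != null ? element.getLocalName() : element.getTagName();
        counts.merge(name, 1, Integer::sum);
        NodeList children = element.getChildNodes();
        for (int i = 0; i < children.getLength(); i++) {
            if (children.item(i) instanceof Element child) {
                tally(child, counts);
            }
        }
    }
}
```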
Gray volunteered to continue work on the prototype, providing appropriate annotations 
where they are necessary. Bodfish stressed the importance of protecting patron 
confidentiality and privacy, and he strongly suggested that each vendor provide samples 
with “clean” data. Boettcher showed a document 3M sends to other implementers 
outlining the messages sent and the expected responses. The document includes 
actual XML samples, and Boettcher indicated that it would not be hard to extract 
portions and put them into the format Gray demonstrated.
Thursday, September 24
Update on Recall Item
Walsh read a message Jackson sent in response to an inquiry about the reasons why 
Recall Item is in the core message set. Her note indicated that she does not recall the 
specific details, but she does believe it is “useful to include in the core message set.” 
Boettcher said that, as a self-service implementer, Recall Item is not useful. “However,” 
she continued, “there are other messages that are not in the core message set that 
would be beneficial.” Wanner asked whether we should work toward a core set 
specifically for self-service in addition to the current core set which may be more aimed 
at resource sharing. Boettcher said that she believes that we should continue to 
promote the core set as it is, then encourage implementers to support an additional 5 or 
so messages that provide extra functionality (fines payment, undo checkout, etc.). 
OʼBrien suggested that we make a point to revisit Recall Item and try to better 
understand why it is part of the core message set. This discussion should take place at 
a time when Jackson and Stewart can participate since they were the ones who lobbied 
for its inclusion at the April 2009 meeting. Walsh and Bodfish each agreed that an item 
should be placed on the agenda for the next conference call.
Campbell said that, if we leave Recall Item in the core, but the Implementers Group 
generally feels that it is unnecessary, then we may be back to where we were before the 
core message set was defined. If a library administrator specifies support for the core 
set, but he or she gets support for fewer than all 9 messages, then he or she is likely to 
be frustrated. 
The group discussed the idea of defining a second set of additionally useful messages 
that provide functionality over and above what is available with the core set.
Bodfish said that part of the problem is that we donʼt have a clear and agreed upon 
definition of what the core message set is. We might decide that other messages in the 
core are not necessary to provide 80% of the labor saving benefits of NCIP. Cancel 
Request, Renew Item, and Lookup Item, for example, may not truly be necessary to 
achieve a level of automation that provides real benefit.
Gray asked whether an implementer with support for 8 messages would need to add 
support for the 9th in order to say “We support the core message set.” In his specific 
case, there is no functionality that requires support for Recall Item. He could implement 
a response, but the response would always indicate that the Recall Item request could 
not be processed. Campbell suggested that the core message set could be defined as 
8 messages for use in public environments where Recall Item is rarely or never used 
and 9 messages in academic environments where Recall Item may be both useful and 
required. OʼBrien noted that the point of the core message set was to define a minimal 
implementation. “It was not meant,” he added, “to be 9 messages plus or minus one.”
The group continued discussing this point and concluded that, for the core message set 
to be useful for defining a minimal implementation, saying “Yes, I support the core 
message set” means that support is available for all the messages in the core. In the 
context of an RFP, this might mean answering a question like “Do you support the NCIP 
core message set?” with one of the following responses:
* “Yes, we support all the messages in the core message set except for Recall Item because Recall Item is not necessary in the context of our application.”
* “No, we do not support all the messages in the core message set because Recall Item is not necessary in the context of our application.”
The group discussed the differences between how a responder uses the core message 
set and how an initiator uses it. OʼBrien and Bodfish suggested that it should be 
possible for an initiator to claim support for the core set if all the messages necessary to 
perform the tasks the initiator can perform come from the core set. In other words, if an 
initiator who does checkout item uses the NCIP Checkout Item message, then the 
initiator supports the core set. This helps to address the situations where self-service 
applications may never need to support messages like Accept Item. Responders, 
though, need to support all of the messages in the core.
The group agreed that further discussions are necessary, both to address how initiators 
may be able to claim support for the core message set and to revisit the composition of 
the core message set itself.
Conference Call with Members of the eXtensible Catalog (xC) Project
Present on the call:
Jennifer Bowen
Randy Cook
David Lindahl
??? [Editorʼs note: I did not catch the name of the third participant from xC]
Dan Iddings
Bowen summarized the work of the xC project. “We are developing open source 
software for libraries under various Mellon grants,” she said. She further explained that 
they are developing a suite of applications that may be used separately or together. 
The applications can harvest data, normalize it, aggregate it, and provide a UI built on 
Drupal so that libraries have the option to implement their own OPAC. There are four 
different toolkits: OAI, Metadata Services, Drupal front-end, and NCIP toolkit. The 
NCIP toolkit provides an interface to the ILS patron database, primarily for user lookup 
and account information. They are working to provide some circulation functionality, 
primarily for C-ILL. They will have some web services that provide other functions. In 
order to use the NCIP toolkit, a customer needs ILS-specific software (called drivers), 
and there are xC partners who are building those components. They provide extensive 
documentation that enables Java developers to build these ILS-specific drivers. They 
hope to be able to share the drivers with other customers via the xC website. They 
implemented only a subset of the NCIP protocol. They meet level three of the ILS-DI 
task force requirements, and they have defined some additional messages to provide 
additional functionality (xcGetAvailability to get circulation status for a screenful of items 
and xcLookupUser to get patron information needed for ILS-DI but not exposed via 
NCIP). Other extensions to NCIP include improved error reporting (available in “loose 
mode”; strict mode uses only what is available within the NCIP standard). The NCIP 
toolkit is available via Google Code, and it includes sample Voyager and Aleph drivers. 
Cornell has added support for four additional NCIP messages in order to provide 
functionality specifically for their own environment. Ultimately, the level of support for 
the various messages depends on what data and services are available from the 
backend system.
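
[Editorʼs note: a hypothetical sketch of the “driver” idea the xC team described, in which the NCIP toolkit handles the protocol and an ILS-specific driver supplies the data. The interface and method names below are invented for illustration and are not the actual xC driver API.]

```java
import java.util.List;
import java.util.Optional;

// Hypothetical contract an ILS-specific driver might fulfill so that a shared NCIP
// toolkit can answer Lookup User, availability, and circulation requests. The real
// xC driver interface is defined in the project's own documentation.
public interface IlsDriver {

    /** Look up a patron and return the fields needed for Lookup User responses. */
    Optional<UserInfo> lookupUser(String userId);

    /** Circulation status for a set of items, e.g. to back an availability display. */
    List<ItemStatus> getAvailability(List<String> itemIds);

    /** Check an item out to a patron; returns the due date on success. */
    Optional<String> checkOutItem(String userId, String itemId);

    /** Check an item in. */
    boolean checkInItem(String itemId);

    record UserInfo(String userId, String name, List<String> privileges) {}

    record ItemStatus(String itemId, String circulationStatus, String location) {}
}
```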
Bodfish said that he has downloaded the code and was very impressed. “You should be 
proud of what you have accomplished,” he said. He asked the xC members what the 
NCIP IG can do to help make some of the extensions part of the standard. The xC 
group explained that they did not know what the process is for requesting changes, but 
the functions that they added were necessary and were considered to be show-stoppers. Further, they found the error reporting facilities built into the standard to be 
insufficient for their purposes. Bodfish indicated that we will be moving to a process for 
continuous maintenance, and that will permit us to accommodate enhancements more 
frequently than on a five year cycle. Bowen volunteered to compile the various 
extensions the xC project has implemented, including some information about why they 
were needed and a specific contact person, and send those to the NCIP IG.
Wanner indicated that we would be interested in having a representative from the xC 
project as part of the NCIP IG, and we are interested in future collaboration, either in person or by phone. Bowen said that they will keep that in mind for the future. What 
they are doing is working well for them right now. Wetzel suggested that they might 
want to subscribe to the NCIP community interest list.
A member of the xC project asked whether the NCIP IG tracked other open source 
NCIP projects. Bodfish said that we try to keep track, and we have heard that both 
Koha and Evergreen may be doing open source NCIP, but we havenʼt seen anything. 
“They are partners with us, and we think they are using our toolkit,” replied the xC.
The xC project asked about the availability of a tool that would permit them to validate 
that their NCIP is correct. OʼBrien asked whether the project uses any tools to parse 
the schema and generate Java code from it. “This helps to ensure that the messages 
conform to the schema,” he said. The xC, though, is interested in more than just simple 
message conformance. “We want a sample or test client that ensures that the entire 
NCIP conversation is handled correctly.” Bodfish said that all we have now are 
recommendations for tools that will help to ensure the messages are built properly. “I 
canʼt imagine a more important missing piece,” responded the xC. Wanner said that, 
today, most of that testing is performed between specific implementers. Wetzel added 
that, although weʼve talked about a test bed, weʼre finding it to be a very difficult 
undertaking. “Weʼre thinking now that it might be more practical to encourage vendors 
to make test systems public,” she said. Wanner suggested that there are probably 
several implementers who would be willing to test with the xC project. Bowen agreed to 
send Wanner some information that would allow initiators to test with the xC responder. 
Bodfish suggested that we add something to the NCIP website that allows implementers 
to find potential testing partners.
The xC asked if the NCIP IG would be able to help them find partners for developing 
drivers for additional ILS systems. Bowen added that they are specifically interested in 
support for SirsiDynix. Iddings said that there are members within PALCI using Unicorn 
and Symphony who might be willing to collaborate. Bowen also indicated that, while 
they have a partner using III with Oracle, they are interested in finding a III site using the 
non-Oracle version.
OʼBrien asked what version of NCIP the xC project is using. Version 1 was the 
response.
Walsh asked how the developers who are building the ILS-specific drivers are getting 
the necessary technical information. The xC project indicated that the Voyager 
implementation uses a combination of direct calls to the database and, only when 
necessary, screen scraping. “We hope to upgrade our interfaces as ILS vendors 
provide more standard ways to access the systems.” They added, too, that their 
documentation provides a good guide for how to write drivers for additional ILS. 
Wanner asked if they will rely on the open source community to provide updates as 
necessary to accommodate ILS updates and changes. “Yes, we feel that the 
community will continue to adapt and provide support for ILS changes.”
The groups thanked each other for participating on the call and agreed to exchange 
information that will allow us to keep each other informed of future progress and 
opportunities.
Bodfish said that it seems that we can help them and others like them if we make sure 
they know about our conference calls and meetings, help them find testing partners, 
clarify the defect reporting and enhancement request processes, and educate them 
about the various development tools that are available. Wanner added that the xC 
project underscored the challenges associated with testing implementations. Bodfish 
said that may mean that it is valuable to do a simple website tool like Gray suggested 
where XML messages could be pasted and sent and responses received and reviewed, 
just to ensure that the conversation occurred properly.
Social Event at ALA 2010
Wanner asked if we want to organize another social event at a future ALA. Boettcher 
asked how successful the last one was. Walsh replied that it was not particularly well 
attended. Bodfish added that we failed to attract anyone from the target audience: 
librarians. Wanner suggested that it might require us to establish a pattern at both 
midwinter and annual in order for the event to be successful. “We didnʼt do enough 
promotion,” she added. Campbell said that it definitely needs to be promoted earlier 
and on the list serves that librarians use. Wanner asked whether it would be more 
successful if it were sponsored. Campbell said that she doesnʼt believe that makes a 
very big difference. She suggested that the Next Generation Catalog for Libraries 
(NGC4LIB) list serve would be a good place to promote, as would the LITA-L list serve. 
Bodfish suggested Code4Lib. Wanner said that we could probably get the LAMA/
RUSA group to post something.
Wanner then asked about organizing an educational session at ALA, and she suggested 
that we discuss it in a future conference call. Wetzel indicated that we have until April 
before we would need to book a room with ALA. She then asked whether we should 
begin planning an event at LITA 2010. Wanner suggested that we revisit the idea after 
this yearʼs LITA conference is complete.
Next Meeting
Wanner asked where we should hold our next meeting. Walsh reminded the group that 
the other two sites that had offered to host this one were Ottawa (Relais) and 
Baltimore / Washington, D.C. (NISO). Gray reported that he was investigating whether 
Polaris would host. Wanner indicated that would be very convenient for doing 
something with the eXtensible Catalog project. OʼBrien offered that OCLC would host 
again. The group agreed to wait until Gray had information from Polaris before making 
a decision.
Implementation Profile for using REST
Bodfish indicated that OCLC is interested in working on a new NCIP Implementation 
Profile that would use REST over HTTP. The Jangle project (http://www.jangle.org and 
http://code.google.com/p/jangle/wiki/NCIPXMLSchemaForBorrowers) seems to be a 
step in this direction. Gray asked whether REST is secure. Bodfish said that it can be 
delivered via HTTPS. Wanner asked if the idea is to require support for REST or to 
simply make it possible to use REST. Bodfish indicated that it could be added to the list 
of supported transports, or it could be incorporated into a new Implementation Profile. 
Implementers would be able to choose whether to support REST either in addition to or 
in place of currently supported transports. Bodfish said that OCLC believes this to be a 
good idea because there are many tools available to facilitate REST. Also, just like 
HTTP, REST is better understood by firewalls and by network administrators. It is 
sometimes better handled by routers. It is not likely to make any existing 
implementations easier. Instead, it would be more difficult since implementers would 
have to add support for a new transport. However, it might make implementation less 
objectionable to some customers and new implementers. Bodfish added that some 
tools may be able to generate code to support REST and the current transports at the 
same time. Wanner suggested that we might encourage implementers to use REST 
with Version 2.
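
[Editorʼs note: no REST Implementation Profile exists yet; the sketch below simply illustrates, with invented URLs, what a resource-oriented request over HTTPS might look like compared with posting a complete NCIP message to a single endpoint.]

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Speculative sketch: in a resource-oriented style, the borrower or item is addressed
// directly by URL over HTTPS, and the response body would carry the NCIP-defined data
// elements. The host and path here are placeholders, not part of any defined profile.
public class RestStyleSketch {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        HttpRequest lookupUser = HttpRequest.newBuilder(
                URI.create("https://ils.example.org/ncip/users/12345"))   // hypothetical resource
                .header("Accept", "application/xml")
                .GET()
                .build();

        HttpResponse<String> response =
                client.send(lookupUser, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}
```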
Bodfish agreed that OCLC will draft something like a proposal when they are ready to 
move forward and post it to the list so that it can be discussed in a subsequent 
conference call.
Adjournment
Wanner asked if there were any other new business items for discussion. Hearing none 
offered, she adjourned the meeting.
Appendix A - Action Items

* Publish Continuous Maintenance procedures (Who: Walsh; By when: after ANSI approval)
* Discuss possible ALA and LITA presentations (Who: Group; By when: October conference call)
* Revise RFP Guidelines (Who: Bodfish, Gray, Koppel; By when: ALA Midwinter 2010)
* Prepare a chart for implementers to indicate which core messages are implemented with which partners, including version supported and NCIP role (Who: Campbell, Jackson, Brown; By when: TBD)
* Prepare a very rough draft of a “Getting Started with NCIP” document focused on resource sharing (Who: Bodfish; By when: Oct 9, 2009)
* Prepare a “Getting Started with NCIP” document focused on self-service (Who: Walsh; By when: Dec 7, 2009)
* Prepare finished versions of “Getting Started with NCIP” (Who: TBD; By when: ALA Midwinter 2010)
* Review notes from the Version 2 editing phase and compile a list of changes that were dropped in order to determine whether any need to be resubmitted as enhancement requests (Who: Walsh; By when: November conference call)
* Extend the core messages support chart to add detail about data elements (Who: Wanner, Gray (resource sharing), Boettcher (self-service); By when: TBD)
* Extend the message analysis prototype in an effort to build a system for identifying which systems send what messages with what data elements (Who: Gray; By when: TBD)
* Discuss Recall Item and whether it belongs in the core message set (Who: Group; By when: October conference call)
* Select the site for the spring meeting (Who: Group; By when: October conference call)
Appendix B - Photos of Flip Charts Used During the Meeting

[Photos of the flip charts are not reproduced in this text version.]