
Perceptions 2010: An International Survey of Library Automation

by Marshall Breeding. January 27, 2011

Launch the interactive version of the survey's statistical results

Introduction

In this time of tight budgets, when libraries face difficult decisions regarding how to invest their technology resources, it’s helpful to have data regarding how libraries perceive the quality of their automation systems and the companies that support them. This report, based on survey responses from over two thousand libraries, aims to give some measure of how libraries perceive their current environment and probes their inclinations for the future.

Some libraries may refer to the results of this survey as they formulate technology strategies or even consider specific products. I urge libraries not to base any decision solely on this report. While it reflects the responses of a large number of libraries using these products, I hope this survey serves more as an instrument to guide the questions a library might raise in its considerations than as grounds for premature conclusions based on subjective responses. Especially for libraries with more complex needs, it’s unrealistic to expect satisfaction scores at the very top of the rankings. Large and complex libraries exercise all aspects of an automation system and at any given time may have outstanding issues that would naturally result in survey responses short of the highest marks.

The survey results also aim to provide useful information to the companies involved in the library automation industry. While each company likely performs its own measures of client satisfaction, this survey may reveal perceptions that those internal measures do not capture. I hope that the rankings in each category and the published comments provide useful information to help each of the companies home in on problem areas and make any needed adjustments to their support procedures or product directions.

This year marks the fourth time that I’ve carried out this survey. Each year I have received around 2,000 responses, and a few more libraries responded this year than in any of the previous iterations. In very broad terms, the survey results are similar to those of previous years, but with some interesting new trends.

Top survey findings

  • Apollo, developed by Biblionix, topped the rankings in ILS satisfaction and company satisfaction and placed second in ILS support. The product topped all of the satisfaction categories in last year’s survey results. Most libraries adopting Apollo have migrated from abandoned products such as Winnebago Spectrum and Athena or are automating for the first time. Apollo finds use exclusively in small public libraries.
  • This year two open source ILS products earned top marks: OPALS, targeting K-12 school libraries, and Koha as supported by ByWater Solutions.
  • OPALS, an open source ILS created and supported by MediaFlex, also gave a stellar performance, ranking a tiny notch below Apollo in ILS satisfaction and company satisfaction and receiving the top ranking in support satisfaction.
  • ByWater Solutions earned the highest score for company loyalty and placed third for ILS satisfaction, support satisfaction, and company satisfaction. ByWater Solutions provides support services for the Koha open source ILS.
  • Products that ranked highest in earlier years of the survey, including Polaris from Polaris Library Systems and VERSO from Auto-Graphics, continue to receive satisfaction scores just as high as before, but fall below the superlative marks given by libraries involved with Apollo, OPALS, or Koha as supported by ByWater Solutions.
  • Companies and products serving large and complex library organizations and diverse library types receive a broader range of responses and fall into a middle tier of rankings. Yet where they fall within this middle ground represents important differences. Millennium from Innovative Interfaces, Library.Solution from The Library Corporation, and Evergreen from Equinox Software came out as very strong performers at the top of this middle tier.
  • Except for the libraries already using one, the survey reflected fairly low levels of interest in migrating to an open source ILS, even when the library rates its satisfaction with its current proprietary ILS and the company behind it as poor. Apart from libraries already running an open source ILS, and with the exceptions of Voyager (mode of 5) and Aleph (mode of 1), the mode score for open source interest from libraries using proprietary ILS products was 0. Though the open source interest scores were low, a substantial portion of libraries that registered some interest in moving to a new ILS named open source products among the replacement candidates.

General Information about the Survey

This report describes the results of a survey that I conducted to gather data regarding the perceptions of libraries toward their automation systems, the organizations that provide support, and the quality of support they receive. It also aims to gauge interest in open source library automation systems.

I conducted similar surveys in 2007, 2008, and 2009.

This year, I received 2,173 responses from libraries in 60 different countries. The countries most strongly represented include the United States (1,690 responses), United Kingdom (92), Canada (117), Australia (76), and New Zealand (44). As with the general demographics of the lib-web-cats database, the respondents primarily come from libraries in English-speaking countries. Survey results were gathered between December 8, 2010 and January 19, 2011 (Full demographic summary).

The survey attracted the most responses from libraries using Millennium (395), Symphony (282), Horizon (185), Voyager (124), OPALS (106), ALEPH 500 (106), Library.Solution (105), Polaris (104), Apollo (84), and VERSO (72). There were fewer than 70 responses for each of the other ILS products represented in the survey. Systems with fewer than 20 responses do not appear in the main statistical tables; those responses can be seen through the individual ILS product reports available.

This article is an original publication of Library Technology Guides and is not slated to appear in any print publication. Please direct any comments or enquiries to the author.

This survey and its analysis reflect my ongoing interest in following trends in the library automation industry. It is designed to complement, and not replace, the annual automation marketplace article in Library Journal. The survey underlying the Library Journal article relies on information provided by the companies that offer library automation products and services. The survey that serves as the basis for this article collects data from the libraries themselves.

Survey Results

Statistics related to the question: How satisfied is the library with your current Integrated Library System (ILS)?

Satisfaction Score for ILS: Response Distribution and Statistics
ILS Product | Responses | Response distribution (scores 0-9) | Mode | Mean | Median | Std Dev
Apollo | 81 | 2 7 14 58 | 9 | 8.58 | 9 | 1.00
OPALS | 100 | 2 1 13 20 64 | 9 | 8.43 | 9 | 0.90
Koha -- Independent | 38 | 1 3 9 7 18 | 9 | 7.87 | 8 | 1.46
Koha -- ByWater Solutions | 37 | 1 2 12 8 14 | 9 | 7.86 | 8 | 1.15
Polaris | 101 | 1 3 2 1 3 17 42 32 | 8 | 7.77 | 8 | 0.70
Spydus | 23 | 1 1 3 4 5 9 | 9 | 7.65 | 8 | 1.04
VERSO | 72 | 1 1 3 7 27 17 16 | 7 | 7.40 | 7 | 1.06
Millennium | 388 | 1 2 4 8 12 21 40 124 121 55 | 7 | 7.11 | 7 | 0.46
Library.Solution | 104 | 1 7 6 6 7 21 31 25 | 8 | 7.11 | 8 | 0.78
Atriuum | 23 | 1 1 1 2 5 9 4 | 8 | 7.09 | 8 | 1.67
Koha -- LibLime | 31 | 2 1 2 4 10 8 4 | 7 | 6.90 | 7 | 0.72
Circulation Plus | 23 | 1 2 3 1 8 3 5 | 7 | 6.83 | 7 | 1.67
Evergreen | 46 | 1 2 8 4 16 7 8 | 7 | 6.83 | 7 | 1.03
Destiny | 23 | 1 1 1 1 2 2 1 12 2 | 8 | 6.65 | 8 | 0.63
ALEPH 500 | 105 | 1 1 7 3 12 18 39 20 4 | 7 | 6.41 | 7 | 0.88
Symphony (Unicorn) | 271 | 3 5 9 12 19 37 41 77 50 18 | 7 | 6.15 | 7 | 0.24
Voyager | 123 | 4 3 10 25 25 42 13 1 | 7 | 6.01 | 6 | 0.54
Horizon | 179 | 1 1 9 9 20 24 31 44 27 13 | 7 | 5.99 | 6 | 0.15
Winnebago Spectrum | 31 | 3 4 1 2 3 11 5 2 | 7 | 5.45 | 7 | 1.44
All Responses | 2102 | 17 13 45 71 96 190 235 557 488 390 | 7 | 6.84 | 7 | 0.20



Statistics related to the question: How satisfied is the library overall with the company from which you purchased your current ILS?

Satisfaction Score for Company: Response Distribution and Statistics
ILS Product | Responses | Response distribution (scores 0-9) | Mode | Mean | Median | Std Dev
Apollo | 81 | 1 1 3 10 66 | 9 | 8.72 | 9 | 1.00
OPALS | 100 | 3 7 14 76 | 9 | 8.63 | 9 | 0.90
Koha -- ByWater Solutions | 37 | 4 14 19 | 9 | 8.41 | 9 | 1.32
Polaris | 100 | 2 2 6 3 14 32 41 | 9 | 7.83 | 8 | 0.80
VERSO | 72 | 5 4 19 16 28 | 9 | 7.81 | 8 | 1.06
Spydus | 23 | 1 1 5 2 4 10 | 9 | 7.57 | 8 | 1.04
Koha -- Independent | 32 | 2 3 4 1 6 16 | 9 | 7.44 | 9 | 1.59
Atriuum | 23 | 1 1 1 5 8 7 | 8 | 7.43 | 8 | 1.67
Library.Solution | 103 | 1 4 1 4 9 4 21 29 30 | 9 | 7.24 | 8 | 0.69
Circulation Plus | 22 | 1 2 4 6 2 7 | 9 | 7.23 | 7 | 1.49
Millennium | 387 | 3 4 13 8 19 30 61 106 99 44 | 7 | 6.66 | 7 | 0.41
Evergreen | 45 | 2 1 2 2 2 7 12 8 9 | 7 | 6.58 | 7 | 1.04
Destiny | 23 | 1 3 1 2 6 6 4 | 7 | 6.35 | 7 | 0.42
ALEPH 500 | 104 | 1 4 6 10 18 14 31 16 4 | 7 | 5.97 | 6 | 0.78
Voyager | 123 | 1 5 5 11 18 30 39 13 1 | 7 | 5.90 | 6 | 0.63
Koha -- LibLime | 30 | 2 3 2 8 2 5 3 5 | 5 | 5.90 | 6 | 0.73
Symphony (Unicorn) | 271 | 8 8 17 11 26 37 54 57 39 14 | 7 | 5.63 | 6 | 0.24
Horizon | 179 | 4 3 19 15 18 29 22 43 19 7 | 7 | 5.31 | 6 | 0.30
Winnebago Spectrum | 30 | 6 2 3 1 3 1 2 3 5 4 | 0 | 4.57 | 5 | 1.46
All Responses | 2082 | 36 29 81 66 119 208 249 447 405 442 | 7 | 6.60 | 7 | 0.20



Statistics related to the question: How satisfied is this library with this company's customer support services?

Satisfaction Score for ILS Support: Response Distribution and Statistics
ILS Product | Responses | Response distribution (scores 0-9) | Mode | Mean | Median | Std Dev
OPALS | 99 | 1 5 11 82 | 9 | 8.76 | 9 | 0.90
Apollo | 81 | 2 1 9 69 | 9 | 8.75 | 9 | 1.00
Koha -- ByWater Solutions | 36 | 3 14 19 | 9 | 8.44 | 9 | 1.33
VERSO | 72 | 1 2 3 19 12 35 | 9 | 7.99 | 8 | 1.06
Polaris | 101 | 1 4 1 4 3 18 30 40 | 9 | 7.74 | 8 | 0.70
Atriuum | 23 | 1 1 1 4 7 9 | 9 | 7.61 | 8 | 1.88
Circulation Plus | 23 | 1 3 1 6 3 9 | 9 | 7.48 | 8 | 1.88
Spydus | 23 | 1 2 3 4 4 9 | 9 | 7.43 | 8 | 1.04
Koha -- Independent | 34 | 2 4 4 1 7 16 | 9 | 7.38 | 8 | 1.37
Library.Solution | 104 | 2 2 2 7 2 11 18 19 41 | 9 | 7.34 | 8 | 0.59
Millennium | 386 | 1 4 5 18 21 30 61 106 95 45 | 7 | 6.67 | 7 | 0.41
Destiny | 23 | 1 1 1 2 1 1 5 8 3 | 8 | 6.57 | 7 | 0.83
Evergreen | 45 | 2 1 2 3 1 5 3 11 8 9 | 7 | 6.29 | 7 | 1.04
ALEPH 500 | 104 | 2 7 5 7 14 21 20 25 3 | 8 | 5.96 | 6 | 0.78
Voyager | 120 | 4 4 4 11 21 32 23 20 1 | 6 | 5.79 | 6 | 0.55
Horizon | 179 | 4 4 14 9 23 17 23 45 24 16 | 7 | 5.73 | 6 | 0.30
Symphony (Unicorn) | 269 | 8 7 15 20 23 32 50 56 39 19 | 7 | 5.67 | 6 | 0.18
Koha -- LibLime | 31 | 1 1 1 3 3 4 5 5 6 2 | 8 | 5.65 | 6 | 0.72
Winnebago Spectrum | 30 | 8 2 1 1 2 2 4 6 4 | 0 | 4.73 | 6 | 1.46
All Responses | 2079 | 38 32 60 82 125 180 253 408 414 487 | 9 | 6.67 | 7 | 0.20



Statistics related to the question: How likely is it that this library will purchase its next ILS from this company?

Loyalty to Company Score: Response Distribution and Statistics
ILS Product | Responses | Response distribution (scores 0-9) | Mode | Mean | Median | Std Dev
Koha -- ByWater Solutions | 37 | 2 6 29 | 9 | 8.73 | 9 | 1.48
OPALS | 99 | 1 1 1 3 8 85 | 9 | 8.71 | 9 | 0.90
Apollo | 81 | 1 1 3 8 68 | 9 | 8.70 | 9 | 1.00
VERSO | 72 | 3 5 2 5 24 33 | 9 | 7.96 | 8 | 1.06
Polaris | 100 | 1 1 1 2 4 6 8 25 52 | 9 | 7.92 | 9 | 0.90
Spydus | 23 | 1 1 2 4 4 11 | 9 | 7.70 | 8 | 1.04
Koha -- Independent | 34 | 2 2 1 1 4 4 20 | 9 | 7.56 | 9 | 1.54
Atriuum | 23 | 1 2 1 2 10 7 | 8 | 7.35 | 8 | 1.67
Library.Solution | 104 | 6 3 8 1 1 7 3 12 16 47 | 9 | 6.87 | 8 | 0.69
Evergreen | 44 | 2 1 2 2 6 2 6 9 14 | 9 | 6.77 | 8 | 0.90
Circulation Plus | 22 | 1 4 3 2 2 5 5 | 8 | 6.45 | 7 | 1.28
Millennium | 383 | 19 11 11 13 17 40 39 75 69 89 | 9 | 6.40 | 7 | 0.46
Destiny | 23 | 3 1 1 1 2 3 8 4 | 8 | 6.09 | 8 | 0.42
ALEPH 500 | 103 | 6 4 5 6 17 9 22 15 19 | 7 | 5.98 | 7 | 0.79
Voyager | 121 | 2 3 4 5 13 27 15 28 17 7 | 7 | 5.77 | 6 | 0.45
Symphony (Unicorn) | 270 | 18 15 10 24 29 34 40 40 32 28 | 6 | 5.26 | 6 | 0.24
Koha -- LibLime | 31 | 6 1 3 5 3 4 4 5 | 0 | 5.23 | 6 | 0.00
Horizon | 179 | 21 9 9 11 20 21 24 31 19 14 | 7 | 4.94 | 5 | 0.37
Winnebago Spectrum | 30 | 10 2 1 2 1 2 3 6 3 | 0 | 4.00 | 4 | 1.46
All Responses | 2075 | 128 56 61 86 112 208 173 300 339 612 | 9 | 6.40 | 7 | 0.20



Statistics related to the question: Has the customer support for your ILS gotten better or gotten worse in the last year?

Change in Customer Support Quality: Response Distribution and Statistics
Company | Responses | Response distribution (scores 0-9) | Mode | Mean | Median | Std Dev
Koha -- ByWater Solutions | 37 | 1 3 10 23 | 9 | 8.46 | 9 | 1.15
OPALS | 96 | 1 9 2 3 17 64 | 9 | 8.27 | 9 | 0.92
Apollo | 80 | 7 1 11 11 50 | 9 | 8.20 | 9 | 1.01
Koha -- Independent | 34 | 2 1 2 4 3 4 18 | 9 | 7.44 | 9 | 1.54
Spydus | 23 | 1 2 1 4 3 3 9 | 9 | 7.22 | 8 | 0.83
Polaris | 100 | 1 1 2 3 17 11 6 33 26 | 8 | 7.11 | 8 | 0.80
VERSO | 71 | 16 15 4 5 8 23 | 9 | 6.61 | 7 | 0.95
Atriuum | 21 | 1 8 1 1 5 5 | 5 | 6.57 | 7 | 1.53
Library.Solution | 100 | 4 1 5 3 4 24 10 11 22 16 | 5 | 6.13 | 6 | 0.50
Destiny | 23 | 2 2 1 7 1 1 6 3 | 5 | 5.96 | 5 | 0.83
Evergreen | 45 | 3 1 1 8 8 1 9 9 5 | 7 | 5.82 | 7 | 0.89
Circulation Plus | 22 | 1 1 3 6 1 6 4 | 5 | 5.77 | 6 | 1.49
Millennium | 381 | 4 4 6 15 39 146 38 59 40 30 | 5 | 5.72 | 5 | 0.26
Koha -- LibLime | 31 | 2 1 1 3 10 3 5 1 5 | 5 | 5.55 | 5 | 0.90
Voyager | 118 | 3 3 6 11 52 14 18 11 | 5 | 5.33 | 5 | 0.46
ALEPH 500 | 103 | 6 1 3 14 43 12 6 9 9 | 5 | 5.33 | 5 | 0.79
Symphony (Unicorn) | 271 | 11 7 16 28 21 87 26 26 26 23 | 5 | 5.15 | 5 | 0.30
Horizon | 179 | 10 8 12 10 25 55 17 23 12 7 | 5 | 4.80 | 5 | 0.30
Winnebago Spectrum | 27 | 9 1 1 1 7 2 3 1 2 | 0 | 3.74 | 5 | 0.96
All Responses | 2060 | 68 32 56 83 179 601 177 237 258 369 | 5 | 5.93 | 6 | 0.18



Statistics related to the question: How likely is it that this library would consider implementing an open source ILS?

Interest Level in Open Source: Response Distribution and Statistics
ILS Product | Responses | Response distribution (scores 0-9) | Mode | Mean | Median | Std Dev
Koha -- ByWater Solutions | 37 | 37 | 9 | 9.00 | 9 | 1.48
Koha -- Independent | 34 | 1 1 1 31 | 9 | 8.68 | 9 | 1.54
Koha -- LibLime | 28 | 1 1 2 24 | 9 | 8.50 | 9 | 1.70
OPALS | 98 | 5 1 2 2 2 86 | 9 | 8.32 | 9 | 0.91
Evergreen | 42 | 1 1 1 2 37 | 9 | 8.31 | 9 | 0.77
Horizon | 176 | 31 15 15 8 17 19 16 14 14 27 | 0 | 4.44 | 5 | 0.68
Voyager | 120 | 11 16 16 14 9 18 9 8 9 10 | 5 | 4.07 | 4 | 0.46
Symphony (Unicorn) | 269 | 64 26 30 17 19 31 29 19 14 20 | 0 | 3.59 | 3 | 0.43
Millennium | 383 | 86 35 53 33 27 48 23 28 19 31 | 0 | 3.53 | 3 | 0.15
ALEPH 500 | 103 | 16 18 10 10 10 17 8 6 5 3 | 1 | 3.40 | 3 | 0.10
Winnebago Spectrum | 28 | 7 3 3 4 3 2 2 4 | 0 | 3.25 | 3 | 1.13
Destiny | 23 | 6 5 3 1 1 3 1 3 | 0 | 3.04 | 2 | 1.88
Circulation Plus | 22 | 8 2 2 2 6 2 | 0 | 2.82 | 2 | 0.00
Library.Solution | 103 | 35 9 15 10 3 14 4 4 1 8 | 0 | 2.75 | 2 | 0.49
VERSO | 71 | 20 7 20 6 2 11 4 1 | 0 | 2.24 | 2 | 0.24
Apollo | 78 | 34 8 7 7 8 9 1 4 | 0 | 2.08 | 1 | 0.00
Polaris | 100 | 41 13 17 6 6 5 5 4 1 2 | 0 | 1.98 | 1 | 0.70
Atriuum | 22 | 10 4 4 1 2 1 | 0 | 1.68 | 1 | 1.07
Spydus | 23 | 12 3 2 3 2 1 | 0 | 1.30 | 0 | 0.63
All Responses | 2059 | 447 198 217 153 129 216 124 104 87 384 | 0 | 4.04 | 4 | 0.20


An interactive version of the statistical reports is available here, which includes the ability to view the responses for each of the ILS products, along with the redacted comments.


ILS Turnover Reports

Another set of reports provides information on the ILS products that were selected during 2010 by libraries registered in lib-web-cats. [Note: these numbers are not comprehensive.]

The ILS Turn-over report counts and lists the automation systems recorded as selected or installed in 2010 with a breakdown of the previous systems displaced.

The Reverse ILS Turn-over report counts and lists the automation systems recorded as replaced in 2010, with a breakdown of the new systems that were selected.

General Observations

This year’s top marks cannot be explained by the profile of library served, other than the general tendency for the companies that serve larger and more complex libraries to fall into a middle or lower tier of rankings. The products earning top marks, Apollo, OPALS, and Koha (when supported by ByWater Solutions), differ in many ways. Biblionix offers Apollo, a proprietary system delivered as a hosted service to small public libraries; OPALS is an open source ILS developed for K-12 school libraries that has also been adopted by some church and synagogue libraries; ByWater Solutions provides support for the open source Koha ILS mostly to public libraries, though some academic libraries responded as well. It's clear that within this top tier of rankings the libraries responding perceive their automation system as well-suited to their needs, have received excellent support, and have a high regard for the company supplying the system.

Perceptions of Open Source ILS Products and Support Companies

The scores for “Interest Level in Open Source” naturally run high for those libraries already involved with an open source ILS, ranging from a perfect 9.0 given by those running Koha supported by ByWater Solutions through 8.6 for Evergreen. For those running a proprietary ILS, interest in an open source ILS appears roughly inversely proportional to satisfaction with the ILS, company, and support. Libraries running proprietary products that rate high satisfaction in the ILS, company, and support categories selected lower levels of interest in open source alternatives, while those more dissatisfied show at least somewhat higher interest. The scale of interest in open source from those running proprietary systems tops out at 4.4 (Horizon), compared to scores greater than 8.26 from existing open source practitioners.

Tracking the performance of the Koha open source ILS is a bit more complicated, since libraries work with any of a number of firms that provide support services and some operate it on their own, independent of any commercial organization. Some of the support firms represented in the survey include LibLime; PTFS, which now owns LibLime; PTFS Europe (independent, but partially owned by PTFS); ByWater Solutions; CALYX; Catalyst; Libriotech; and Prosentient Systems. Koha run without the support of a commercial firm (Koha -- Independent) received the highest rankings for satisfaction with the ILS, but received somewhat lower scores for support and company satisfaction. In these cases, the libraries were essentially rating themselves. Koha as supported by ByWater Solutions received exemplary perception rankings for company satisfaction (8.41) and support (8.44). ByWater Solutions attracted the highest score in responses to the company loyalty question (8.73).

OPALS, an open source ILS developed and supported by MediaFlex and used primarily in K-12 school libraries and in church and synagogue libraries, received exceptionally high rankings in all of the survey categories. For many of the libraries responding to the survey, support and hosting are provided by the BOCES School Library Systems in the state of New York. Libraries using OPALS rated its customer support at the top of that category (8.76) and gave it the second highest rankings in ILS satisfaction (8.43), company satisfaction (8.63), and company loyalty (8.71). The number of responses from OPALS libraries was dramatically higher this year (100) compared to last year (41) or 2008 (5 responses). The comments written by libraries using OPALS overwhelmingly praised the product and its support.

Comments on the Products and Companies specializing in the Automation of Smaller Libraries

Apollo, a hosted ILS provided by Biblionix and used by small public libraries, received top rankings for ILS satisfaction (8.58) and company satisfaction (8.72), second best for ILS support (8.75), and third for company loyalty (8.70). The number of responses for Apollo has increased over the four years of the survey: 81 in 2010, 35 in 2009, 7 in 2008, and 7 in 2007. No library using Apollo indicated that it was considering a move to a new ILS, none plan to implement a separate discovery interface, and 78 out of 84 report that their installation was completed on schedule.

VERSO from Auto-Graphics received very positive ratings in each of the categories. In the first three years of the survey, ratings for VERSO showed gradual improvement; this year the rating in each category dipped a notch. Perceptions for support ranked fourth (7.99), loyalty to company third (7.96), company satisfaction fifth (7.8), and ILS satisfaction seventh (7.4). Libraries using VERSO ranked very low in interest in moving to an open source ILS (2.24). Overall, the survey results confirm very high perceptions of VERSO as an ILS and Auto-Graphics as a company, and the comments offered reflect high praise and flag no significant problems.

Comments on the Products and Companies specializing in the Automation of Larger Libraries

Polaris attracted exceptional rankings from the 101 customer libraries that responded to the survey, and the comments offered were overwhelmingly positive. In 2007, the first year of this survey, Polaris was the top performer in all categories. This year Polaris received almost exactly the same ratings for ILS satisfaction as it did in 2007, but placed fifth in the rankings. Across the four-year run of survey results, Polaris has been rated very consistently in each of the satisfaction categories. Polaris rated fifth highest in ILS satisfaction, support satisfaction, and company loyalty and placed fourth for company satisfaction. The survey results do not indicate any slippage in the positive perceptions of Polaris by the libraries that use its product; its lower placement in the ranks has more to do with the large numbers of responses from highly satisfied libraries using other products, mostly from the smaller-library arena.

Millennium from Innovative Interfaces, Inc. received quite respectable rankings, generally placing near the top of the middle tier in most categories. Libraries running Millennium responded to the survey in higher numbers than those running any other ILS (383). When considering the products that serve larger libraries or consortia, Millennium placed lower than Polaris in ILS satisfaction (7.11) and company satisfaction (6.66), but higher than products such as Evergreen, Symphony, Aleph, Voyager, or Horizon. Company loyalty rankings were slightly lower (6.40). The statistic that stands out the most for Millennium is the number of libraries indicating interest in migrating to a new ILS. This percentage has increased steadily since 2007, from 6.69% to 8.28% to 11.71% to 18.73% in 2010. The comments generally had a negative tone; many acknowledged the strengths of the system while complaining about its cost or lack of openness.

Libraries using Library.Solution from The Library Corporation responded with rankings that generally fall around the top of the middle tier. In the ILS satisfaction category, the 7.09 mean rating fell just below Millennium, but in company satisfaction (7.23) and support satisfaction (7.32) its scores ranked higher than Millennium's. Support satisfaction for Library.Solution has improved in each of the four years of the survey. Libraries gave average company loyalty ratings of 6.84, and 14.2% indicated interest in migrating to a new ILS.

SirsiDynix offers two ILS products represented in the survey, Symphony (282 responses) and Horizon (185 responses). The two ILS products received remarkably similar scores. Though SirsiDynix promotes Symphony as its flagship ILS, it offers continued support and development for Horizon. In the last year, SirsiDynix implemented a major reconfiguration of its customer support operations, centralizing into its facility in Provo, UT and deemphasizing support in its international offices. The survey specifically asks respondents to base their rankings on the experiences of the last year. Given that the reconfiguration was made in [], this year’s perceptions of support may be at least partially due to this new arrangement. Some comments from international sites did reflect concern. Support scores for Symphony have improved over the last three years, from 4.91 in 2008 to 5.33 in 2009 to 5.66 in 2010, though they remain relatively low compared with products from other companies. Just over 20 percent of the libraries with Symphony reported consideration of a new ILS. More than half of Horizon sites (105 out of 185) show interest in moving to a new ILS, though 25 of these libraries include Symphony among the replacement candidates. When comparing this survey to last year’s rankings, satisfaction for the Horizon ILS dropped slightly from 6.07 to 5.99, company satisfaction increased from 4.91 to 5.31, and support satisfaction held steady (2009 = 5.77; 2010 = 5.73). For Horizon libraries, company loyalty ranks low at 4.94. Though still a wide mark below libraries already involved with open source ILS products, libraries using Horizon indicated a higher probability of shifting to an open source ILS than libraries using any other proprietary ILS (4.44). 34 percent of Horizon libraries and 28 percent of Symphony libraries indicate interest in implementing a new discovery interface.

Two products from Ex Libris were represented in the survey results, Aleph (106 responses) and Voyager (124 responses); the company continues support, marketing, and development for both of these ILS products. Aleph and Voyager serve large and diverse organizations and offer very complex functionality, placing them into a tier of products that do not receive superlative marks. Aleph received a mean score of 6.41 on ILS satisfaction, ranking it 15th; Voyager’s 6.01 put it two places below that. Perceptions of the company came out about the same for the two products: Aleph (5.97) and Voyager (5.90), as did customer support: Aleph (5.96) and Voyager (5.79); both showed similar company loyalty: Aleph (5.98) and Voyager (5.77). In the area of interest in open source, Voyager libraries ranked much higher (4.07) than those with Aleph (3.4). One of the dynamics of interest for libraries using either of Ex Libris’ ILS products involves any indicators of interest in the company’s upcoming Alma platform, positioned as an eventual transition path for Aleph and Voyager. The modest loyalty rankings do not reflect overwhelming interest at this early date. A number [how many] of libraries that flagged interest in migrating to a new ILS in the near future mentioned Alma (or URM as it was called during the survey period).


Details about the Survey

The survey instrument included five numerical ratings, three yes/no responses, two short response fields, and a text field for general comments. The numeric rating fields allow responses from 0 through 9. Each scale was labeled to indicate the meaning of the numeric selection.

Four of the numeric questions probe the library's level of satisfaction with its current automation system, with the company or organization that provides it, and with that company's customer support, as well as the library's loyalty to the company.

A yes/no question asks whether the library is considering migrating to a new ILS, and a fill-in text field provides the opportunity to list specific systems under consideration. Another yes/no question asks whether the automation system currently in use was installed on schedule.


Given the recent interest in new search interfaces, a yes/no question asks “Is the library currently considering a search interface for its collection that is separate from the ILS?” and a fill-in field allows the library to indicate products under consideration.

The survey includes two questions that aim to gauge interest in open source ILS products: a numerical rating that asks “How likely is it that this library would consider implementing an open source ILS?” and a fill-in text field for indicating products under consideration.

The survey concludes with a text box inviting comments.

View the survey. (This version of the survey does not accept or record response data.)

In order to correlate the responses with particular automation systems and companies, the survey links to entries in the lib-web-cats directory of libraries. Each entry in lib-web-cats indicates the automation system currently in use as well as data on the type of library, location, collection size, and other factors of potential interest. In order to fill out the survey, the responder had first to find their library in lib-web-cats and then press a button that launched the response form. Some potential respondents indicated that they found this process complex.

The link between the lib-web-cats entry and the survey automatically populated fields for the library name and current automation system and provided access to other data elements about the library as needed. The report on survey response demographics, for example, relies on data from lib-web-cats.
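As an illustration of that hand-off, the sketch below shows how a launch button might pass the lib-web-cats record identifier, library name, and current automation system into a prefilled response form. The parameter and field names are invented for the example and are not taken from the actual lib-web-cats code.

#!/usr/bin/perl
# Hypothetical sketch: the launch button on a lib-web-cats entry passes the
# record id, library name, and current ILS as parameters, and the survey form
# carries them forward in prefilled fields. All names here are illustrative.
use strict;
use warnings;
use CGI;

my $q       = CGI->new;
my $libid   = $q->param('libid')   || '';   # lib-web-cats record identifier
my $library = $q->param('library') || '';   # real code should HTML-escape these values
my $ils     = $q->param('ils')     || '';

print $q->header(-type => 'text/html');
print <<"FORM";
<form method="post" action="survey-response.pl">
  <input type="hidden" name="libid" value="$libid">
  <p>Library: $library (current automation system: $ils)</p>
  <p>How satisfied is the library with your current ILS (0-9)?
     <input type="text" name="ils_satisfaction" size="2"></p>
  <input type="submit" value="Submit survey response">
</form>
FORM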

A number of methods were used to solicit responses to the survey. E-mail messages were sent to library-oriented mailing lists such as WEB4LIB, PUBLIB, and NGC4LIB. Invitational messages were also sent to many lists for specific automation systems and companies. Where contact information was available in lib-web-cats, an automated script produced e-mail messages with a direct link to the survey response form for that library.

The survey attempted to limit responses to one per library. This restriction was imposed to encourage respondents to reflect the broad perceptions of their institution rather than their personal opinions.

The survey instrument was created using the same infrastructure as the Library Technology Guides web site: a custom interface written in Perl, using MySQL to store the data, with ODBC as the connection layer. Access to the raw responses is controlled through a user name and password available only to the author. Scripts were written to provide public access to the survey in a way that does not expose individual responses.
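As a rough sketch of that arrangement, the script below connects to a MySQL database through an ODBC data source with Perl's DBI module and records a single response. The data source name, table, and column names are assumptions for illustration, not the actual schema.

#!/usr/bin/perl
# Minimal sketch of the storage layer: Perl connecting to MySQL through an
# ODBC data source and recording one survey response (assumed schema).
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect(
    'dbi:ODBC:automation_survey',            # ODBC DSN pointing at the MySQL database
    'survey_user', 'secret',
    { RaiseError => 1, AutoCommit => 1 }
);

my $sth = $dbh->prepare(
    'INSERT INTO survey_responses
       (libid, ils_satisfaction, company_satisfaction, support_satisfaction,
        company_loyalty, open_source_interest, comments)
     VALUES (?, ?, ?, ?, ?, ?, ?)'
);

# In the real form handler these values would come from the submitted CGI parameters.
$sth->execute(12345, 8, 7, 8, 7, 3, 'Support has improved this year.');

$dbh->disconnect;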

In order to provide access to the comments without violating the stated agreement not to attribute individual responses to any given institution or individual, an additional field was created for “edited comments.” This field was manually populated with text selected from the “comments” text provided by the respondent. Any information that might identify the individual or library was edited out, with an ellipsis indicating the removed text. Comments that only explained a response or described the circumstances of the library were not transferred to the Edited Comments field.

Statistics

To analyze the results, a few scripts were written to summarize, analyze, and present the responses.

In order to avoid making generalizations based on inadequate sample sizes, the processing scripts included a threshold variable that would only present results when the number of responses exceeded the specified value. The threshold was set to a value of 20.

For each of the survey questions that involve a numeric rating, a set of subroutines was created to calculate and display simple statistics: the distribution of scores along with the mode, mean, median, and standard deviation of the responses.
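The sketch below shows the kind of subroutines involved, computing the mean, median, mode, and standard deviation for one product's list of 0-9 ratings and applying the 20-response threshold; the subroutine names and sample values are illustrative rather than the actual report code.

#!/usr/bin/perl
# Sketch of simple statistics subroutines applied to one product's 0-9 ratings.
use strict;
use warnings;
use List::Util qw(sum);

sub mean {
    my @scores = @_;
    return sum(@scores) / scalar(@scores);
}

sub median {
    my @sorted = sort { $a <=> $b } @_;
    my $mid = int(@sorted / 2);
    return @sorted % 2 ? $sorted[$mid]
                       : ($sorted[$mid - 1] + $sorted[$mid]) / 2;
}

sub mode {
    my %freq;
    $freq{$_}++ for @_;
    # most frequent score; ties resolved toward the lower score
    my ($mode) = sort { $freq{$b} <=> $freq{$a} || $a <=> $b } keys %freq;
    return $mode;
}

sub std_dev {
    my @scores = @_;
    my $m = mean(@scores);
    return sqrt( sum(map { ($_ - $m) ** 2 } @scores) / scalar(@scores) );
}

my $threshold = 20;                          # minimum responses before a product is reported
my @ratings   = (9, 8, 8, 7, 9, 6, 9);       # invented example responses
if (@ratings >= $threshold) {
    printf "Mean %.2f  Median %s  Mode %s  Std Dev %.2f\n",
        mean(@ratings), median(@ratings), mode(@ratings), std_dev(@ratings);
}
else {
    print "Fewer than $threshold responses; not shown in the main tables.\n";
}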

The "survey-report-by-category.pl" script processes each of the numerical ratings, displaying each of the statistical components listed above for each product that received responses above the threshold value. This report provides a convenient way to compare the performance of each ILS product for the selected question. The report sorts the statistics for each product in descending order of the mean. The report categories available correspond to the survey questions with numerical scale responses.

The “survey-product-report.pl” script provides the results for each of the ILS products mentioned in the responses. This report also provides the statistical components for each of the numeric questions. It also provides the percentage of yes responses to the two yes/no questions.
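A sketch of that yes-percentage calculation, using invented answers to the question about considering a new ILS:

#!/usr/bin/perl
# Sketch of computing the percentage of yes responses to a yes/no question.
use strict;
use warnings;

my @answers = qw(yes no no yes no no no yes no no);   # invented sample answers

my $yes = grep { lc($_) eq 'yes' } @answers;
printf "Considering a new ILS: %.1f%% yes (%d of %d responses)\n",
    100 * $yes / scalar(@answers), $yes, scalar(@answers);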

[The text of this section mostly replicates what appeared in the 2007 version of this article. For both editions of the survey I followed the same methodology for collection and statistical analysis.]


Caveat

As I noted with previous editions of the survey, one should not read too much into the survey results. Responders to the survey provide their subjective impressions to fairly general questions. Although the survey instructions encourage responders to consider the broader institutional perceptions, it’s usually the case that multiple opinions prevail within any given library. While I believe that this survey does provide useful information about the experiences of libraries with their current integrated library systems and the companies that provide support, it should not be used as a definitive assessment tool.