AN ASSESSMENT STRATEGY FOR THE UFV LIBRARY

Faculty Sabbatical Leave Report for the Period January 1 – April 30, 2017

Colleen Bell, Information Literacy & Web Services Librarian
University of the Fraser Valley

Submitted July 2017

© 2017, The Author. All rights reserved.

CONTENTS

Executive Summary
Project Scope and Timeline
   Outcomes and Activities
   Discussion on Changes to Outcomes and Activities
   Timeline of Activities
Introduction
Why Library Assessment?
   Accountability
   Benchmarking
   Professional Values
   Communicating Value
Assessment in Academic Libraries
   Assessment Climate/Culture
      Culture of Assessment
      Core Competencies for Assessment
   Collections
   Learning
      ACRL Standards
      Institutional Outcomes
      ACRL Framework
      Rubrics
   Methods
   Organizational Performance
   Services
   Space
   User Experience/Usability
   Value
Environmental Scan
   The Institutional Context
      UFV Strategic Plan
      Partnerships
   Site Visits
      UBC Okanagan
      Thompson Rivers University
      Capilano University
Assessment in the UFV Library
   Annual Statistics
   Library Instruction Statistics
   Reference Statistics
   LibQUAL+ Surveys
   Value of the UFV Library
   Space Use Study
   Library 2025
   Library Services
   Library Collections
   Library Spaces
      Abbotsford Library Renovations
   Final Thoughts
The Data Audit
   Process
   Findings
Library Metrics: Inputs, Outputs, and Outcomes
Exploring Assessment Tools and Systems
   BlueCloud Analytics
      Delivered Reports
      Report Builder
      Dashboards
      Conclusions
   Tableau
      Space Use Study
      LibQUAL+ Analysis
      Departmental Allocations Dashboard
      EZproxy Data
      Conclusions
   Dedoose
      LibQUAL+ Survey Comments
      Conclusions
   LibInsight
      Public Services Statistics
      E-journal Usage
      Conclusions
   Final Thoughts
A Proposed Assessment Strategy
References
Appendices
   Appendix A: Library Strategies for the Education Plan
   Appendix B: Site Visit Interview Questions
   Appendix C: Value of the UFV Library, 2010-2011
      Library Value Calculations
   Appendix D: Data Inventory
      Wiki
      Shared Network Drive
   Appendix E: UFV Library Metrics
   Appendix F: Tableau Community Forums
      Thread 1: Displaying Date Parts as Text Names
      Thread 2: Using a parameter to select multiple dimensions
      Thread 3: No heatmap data showing on polygons/background image
   Appendix G: BC Library Conference Panel
      Details
      My Slides and Notes
      Handout

LIST OF FIGURES

1. Google Ngram of the word "accountability"
2. Thematic analysis of papers presented at the Library Assessment Conference
3. Library Dashboard from Capilano University
4. BlueCloud Analytics: Delivered reports from the Catalog-Item folder
5. BlueCloud Analytics report builder
6. Simple report in BlueCloud Analytics
7. Using filters and calculated metrics in the BlueCloud Analytics report builder
8. Report prompts in BlueCloud Analytics
9. BlueCloud Analytics report with prompted filters applied
10. BlueCloud Analytics delivered dashboard: Turnover by date range
11. Heat map in Tableau, showing use of parameters
12. Asking a question in the Tableau Community forums
13. Tableau: Heat map data mapped onto library floor plans
14. Bubble chart in Tableau
15. Tableau: Comparing data using bubble charts does not work
16. Tableau failure: using a single parameter to control two values
17. Results from a Tableau failure
18. Tableau can't always tell the real story
19. Excerpt from LibQUAL+ survey raw data file – 218 columns of data
20. LibQUAL+ data cleaned and ready for Tableau – only 16 columns of data
21. Fooling Tableau into creating a zone of tolerance chart
22. LibQUAL+ survey dashboard in Tableau
23. Departmental allocations dashboard in Tableau
24. Tableau: EZproxy use by IP range and hour of day for March 29, 2017
25. Dedoose can accept data from a variety of sources
26. Dedoose project home screen
27. Code structure for LibQUAL+ comments in Dedoose
28. Descriptor x descriptor x code chart in Dedoose
29. Code frequency x descriptor bubble chart in Dedoose
30. Descriptor x code count table in Dedoose
31. Packed code cloud in Dedoose
32. Code co-occurrence table in Dedoose
33. LibInsight home page
34. Public services statistics in its original format
35. Public services data formatted for LibInsight upload
36. Editing the dataset fields in LibInsight
37. Pre-defined entries in LibInsight make data entry easier
38. Analyzing database and journal cost and use data in LibInsight
39. Comparing journal and database data by platform in LibInsight
40. Monthly journal and database use over time in LibInsight
41. Yearly journal and database use over time in LibInsight
42. Year-over-year trends in database and journal use in LibInsight
43. Summary of duplicate journal titles across platforms in LibInsight
44. List of duplicate titles (all platforms) in LibInsight
45. LibInsight: List of SpringerLink journals with number of downloads
46. LibInsight: Journals with the highest number of downloads (all platforms)
47. LibInsight: ACS Journals with zero use in a two-year period
48. LibInsight: Database usage for Ebscohost
49. Journal usage dashboard in LibInsight

EXECUTIVE SUMMARY

This project advocates the development of library assessment as a key function of the UFV Library. The underlying premise of this project is that assessment in libraries is a good and necessary thing. Four primary arguments outline the rationale behind this assertion: accountability, benchmarking, professional values, and, perhaps the most compelling, communicating value.

Assessment in academic libraries has matured to the point where it has become an area of specialization; libraries have developed programs of library assessment, and the position titles of the librarians and staff engaged primarily in assessment work reflect that emphasis. The literature on assessment is rich and can be categorized across a number of themes: assessment climate and culture; collections; student learning; methods; organizational performance; services; space; user experience and usability; and value.

UFV's strategic framework offers additional context for an assessment program within its library. Site visits to UBC Okanagan and Thompson Rivers University, along with a virtual visit to Capilano University, offer insight into how we might develop such a program. A survey of assessment activities in the UFV Library rounds out the environmental scan.

There is a range of technologies, tools, and systems that support assessment work; this project explored four of them, largely because they were readily available: BlueCloud Analytics, a product provided by SirsiDynix and designed to analyze data from our integrated library system; Tableau, data visualization software; Dedoose, web-based software for analyzing qualitative data; and LibInsight, a system for consolidating and analyzing library data.
Finally, this project offers a plan of action for establishing a program of library assessment as we move forward with strategic planning.

Keywords: library assessment; academic library value; library impact; outcomes; library collections; library services; library operations; library facilities; assessment strategy; strategic planning; data analytics; data audit; environmental scan; library metrics

PROJECT SCOPE AND TIMELINE

Outcomes and Activities

The following is taken directly from my sabbatical proposal, submitted in November 2015. Changes to the proposed activities and outcomes are discussed below.

1. Literature review
   In recent years there has been an explosion in the literature around assessment in academic libraries, both in the journal literature and from professional associations. Journals with a significant focus on assessment in academic libraries include: portal: Libraries and the Academy; The Journal of Academic Librarianship; College & Research Libraries; Performance Measurement and Metrics; and Evidence Based Library and Information Practice. The Value of Academic Libraries (Oakleaf, 2010) report serves as a useful starting point, as does the Association of Research Libraries' LibValue project (http://www.libvalue.org/). Both focus on qualitative and quantitative measures of value for academic libraries, and Oakleaf (2010) offers a particularly useful review of the literature up to the point of publication.

2. Site visits
   A number of relatively local institutions have dedicated assessment librarians who have managed to develop robust assessment programs. Visits to these sites will allow me to interview the assessment librarians and learn more about their assessment activities. Proposed sites include: University of Washington (Steve Hiller, Director of Assessment and Planning); University of British Columbia (Jeremy Buhler, Assessment Librarian); Vancouver Island University (Kathleen Reed, Assessment and Data Librarian); and University of British Columbia Okanagan (Laura Thorne, Communications, Marketing, and Assessment Librarian).
   Note: These institutions have been selected because of their proximity and the identification of assessment in the individuals' titles. While two of them (UBC and the University of Washington) are research institutions, unlike UFV, their libraries have well-developed assessment programs that conduct assessment related to undergraduate education. The remaining two institutions most closely resemble UFV, both because they are former university colleges (like UFV) and because librarians there have multiple roles beyond those reflected in their position titles.

3. Data inventory
   Identifying what data the UFV Library already collects, where and how it is stored, who is responsible for collecting and maintaining it, and how it is used will be an important first step to developing an assessment plan. (A sketch of what an inventory record might capture follows this list.)

4. Data resources inventory
   Knowing what systems and tools we have available to us for data collection, storage, and analysis, as well as who has expertise in these systems and tools and could serve as a resource, provides an important base for an assessment plan.

5. Professional development
   It seems that every week I come across an announcement for a webinar or workshop with some relationship to library assessment, from organizations such as the Association of College & Research Libraries, Association of Research Libraries, British Columbia Library Association, and OCLC. I plan to review the archives of those I've collected, as well as take opportunities to participate in any that come up during my sabbatical. These webinars have kept and will continue to keep me informed about assessment practices and projects in other libraries, and about interesting library activities that can inform or lead to assessment opportunities.

6. Research and testing of new systems and tools for assessment

7. Identification and testing of new metrics and methodologies

8. Developing conference presentation(s) and/or article(s) for publication (if there's time)
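As an illustration of item 3, here is a minimal sketch of the kind of record a data inventory might capture. The field names are hypothetical, chosen only for illustration; the Library's actual inventory (Appendix D) may record different attributes.

    # A hypothetical data-inventory record. Field names are illustrative only;
    # the Library's actual inventory (Appendix D) may differ.
    from dataclasses import dataclass

    @dataclass
    class DataInventoryRecord:
        name: str          # e.g., "Reference desk statistics"
        description: str   # what the data set measures
        location: str      # where it is stored (wiki, shared drive, vendor system)
        steward: str       # who collects and maintains it
        frequency: str     # how often it is collected (daily, per semester, annual)
        uses: str          # how the data is currently used, if at all

    # Example entry, with made-up details:
    record = DataInventoryRecord(
        name="Reference desk statistics",
        description="Count and type of questions asked at the reference desk",
        location="Shared network drive",
        steward="Public services staff",
        frequency="Daily",
        uses="Annual report; staffing decisions",
    )
    print(record.name, "->", record.steward)

Even this small structure forces the questions an inventory needs answered: where the data lives, who owns it, and whether anyone actually uses it.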
Discussion on Changes to Outcomes and Activities

Draft Assessment Plan

In my sabbatical proposal, I wrote:

   I propose to develop a draft assessment plan for the UFV Library that will focus on the library's services, operations, facilities, and collections. The plan will be presented to library faculty and staff for review, comment, revision, and adoption following my sabbatical.

An assessment plan is most often a companion to a strategic plan, intended to answer the question, "How will we know that we have achieved our strategic objectives?" It identifies the metrics that will be used to measure the success of a strategic plan's initiatives and objectives. It can be developed as a separate document, but in the best circumstances it is developed simultaneously with the strategic plan and integrated into it.

When I first discussed my proposal with the University Librarian, the timeline seemed doable. There was momentum behind the development of the Library's strategic plan, and we felt that the strategic plan, if not completed by the time I began my sabbatical, would at least be in development and at a point where it made sense to be developing the assessment piece. This, for various reasons, did not happen, and so my objective of having a draft plan for review by my colleagues has not been met. I have chosen, instead, to propose an assessment strategy: a framework that can put in place essential building blocks for an assessment plan and help guide the development of that plan alongside the development of a strategic plan for the library. Because this project is no longer about drafting an assessment plan, a robust and focused discussion on assessment of services, facilities, and operations is somewhat premature.

Literature Review

Initially, I identified the primary focus of my project as library services, facilities, and operations. However, when I attended the Library Assessment Conference in November 2016, I observed a number of changes from previous years when I had attended, and I grew curious about this. I have therefore focused my literature review by analyzing the themes represented by the papers presented at this conference since its inception in 2006, which changed the focus of my investigation of the literature. I have also broadened my review to include not just the published literature, but also research and other projects that have helped inform the assessment community.
Site Visits

An unusually robust and prolonged winter made it very challenging to travel, and I was able to accomplish only two site visits, toward the end of my sabbatical leave. However, an unexpected opportunity presented itself in late April, and I was able to attend a presentation by Tania Alekson of Capilano University. While not a site visit, it was an opportunity to hear about the range of assessment activities in the Capilano University Library, which, like UFV, does not have a dedicated assessment position.

Professional Development

In addition to reviewing a few of the webinars I had saved, I focused significant resources on learning new software, having failed to acknowledge in my original proposal that this in itself would be a significant process of professional development.

Timeline of Activities

January
- Began literature review
- Began data audit
- Explored BlueCloud Analytics
  - Completed self-paced tutorial
  - Developed reports based on existing statistics discovered during the data audit (e.g., items by location and type; Bonn's use factor, illustrated in the sketch following this timeline; circulation by call number range)

February
- Completed literature review
- Completed data audit
- Began exploring Tableau
  - Viewed video tutorials
  - Began re-analysis of space use data
  - Began analysis of LibQUAL+ data, 2005-13

March
- Continued exploring Tableau
  - Completed re-analysis of space use data
  - Completed analysis of LibQUAL+ data from 2005-13
  - Created selector dashboard based on data from Sirsi (combining acquisitions and circulation data), something we hadn't been able to do previously
- Explored Dedoose
  - Began analyzing LibQUAL+ comments from 2005-13 (incomplete; ran out of time)
- Met with Vlad Dvoracek, Institutional Research & Planning, to explore access to student data (for correlational studies)
- Met with Darrin Lee, Bryan Wilkinson & Bryan Daniel, Information Technology Services, to explore existing and potential data related to technology use in the libraries
- Organized site visits
- Set up EZproxy logs

April
- Conducted site visits: Thompson Rivers University & UBC Okanagan
- Explored LibInsight
  - Viewed video tutorials
  - Explored e-journal usage with four vendors supporting SUSHI and COUNTER (JR1 reports)
  - Tested using existing data sets and data pulled in from other systems (e.g., COUNTER reports, Google Analytics)
- Continued exploring Tableau
  - Began analyzing EZproxy data (incomplete)
- Prepared & delivered BC Library Conference presentation
- Drafted assessment framework and recommendations
- Began drafting sabbatical report (interrupted by return to work)
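Bonn's use factor, one of the reports developed in BlueCloud Analytics, compares a subject area's share of circulation with its share of the collection; a factor above 1 suggests a subject circulates more heavily than its size alone would predict. A minimal sketch of the calculation follows, using invented counts rather than UFV data.

    # Bonn's use factor: (subject's share of circulation) / (subject's share of holdings).
    # A factor > 1 means the subject circulates more than its share of the collection.
    # The counts below are invented for illustration.
    holdings = {"B (Philosophy)": 4200, "P (Language/Lit)": 9800, "Q (Science)": 6000}
    circulation = {"B (Philosophy)": 350, "P (Language/Lit)": 1900, "Q (Science)": 750}

    total_holdings = sum(holdings.values())
    total_circ = sum(circulation.values())

    for subject in holdings:
        holdings_share = holdings[subject] / total_holdings
        circ_share = circulation[subject] / total_circ
        use_factor = circ_share / holdings_share
        print(f"{subject}: use factor = {use_factor:.2f}")

With these invented counts, P class titles show a use factor of about 1.29 (heavily used relative to holdings), while B class titles sit near 0.56, the kind of signal a selector might follow up on.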
INTRODUCTION

It seems like I've been building toward this project forever. For more than 20 years as an academic librarian, I have been responding to assertions, assumptions, and proposals with questions such as, "Yes, but how do we know that's true?" and making suggestions: "Why don't we just ask the students?" I've invested heavily in professional development: I've attended conferences and workshops; I've read hundreds of articles; and I've followed projects such as LibValue and ACRL's Value of Academic Libraries.

In 2012, in a meeting with the University Librarian, I expressed a desire to take a more central role in assessment. I told her that I didn't expect a response immediately, but that I hoped she would think about it. In 2013, I was responsible for administration of the LibQUAL+ survey. In 2014, I designed a study to measure student perceptions of the value of library instruction; the study yielded no usable data, but it did teach me something about research design. That same year, I designed a space use study; the data collection was carried out over three semesters in 2014 and 2015, and the data analyzed in 2015-16. But these projects were all carried out "off the side of my desk." I felt optimistic that we were moving in a good direction, but I was frustrated that we were making no progress toward creating a formal role for assessment in the library, and I continued to meet resistance to more rigorous assessment in the service of informed decision-making.

So in 2015, as I was preparing my sabbatical proposal, I decided I needed to turn up the volume. Henry Rollins, musician, writer, actor, and activist, once stated, "My optimism wears heavy boots and is loud" (Gabriella, 1998). My sabbatical proposal was my version of heavy boots (with the support of both the University Librarian and several of my colleagues, thankfully). This report turns the volume up further, but it also offers a plan for acting on the request. It is important to note that progress is being made: we are engaged in a review of librarians' portfolios, the intent of which is to identify opportunities for each librarian to explore new roles in the library, as well as what those roles should be. As I told the University Librarian, when asked about my expectations upon my return from sabbatical, "I'm not expecting to have a new job title on May 1, but I do expect to see some movement toward it by September 1."

This report offers an argument in support of library assessment, explores the environmental context for assessment, describes past assessment activities within the UFV Library, and explores tools and systems to support assessment. Finally, it offers concrete strategies for moving forward with a program of library assessment in the UFV Library.

WHY LIBRARY ASSESSMENT?

The underlying premise for this project is that assessment in libraries is a good and necessary thing. Here, I offer four perspectives or arguments in support of this claim: accountability, benchmarking, professional values, and communicating value.

Accountability

Pritchard (1996) wrote,

   Few libraries exist in a vacuum, accountable only to themselves. There is always a larger context for assessing library quality, that is, what and how well does the library contribute to achieving the overall goals of the parent constituencies. (p. 573)

Assessment as accountability provides one argument for its undertaking. Institutions of higher education, particularly those funded publicly or accredited by external agencies, are called upon by funders and accreditors to provide evidence of their effectiveness (and, one might argue, efficiency). Google's Ngram Viewer, which draws its data from Google Books (see https://books.google.com/ngrams), demonstrates that use of the word "accountability" begins to increase sharply in the 1960s, coinciding with an increase in public investment in higher education in the United States ("History of higher education," 2017).

Figure 1. Google Ngram of the word "accountability"
UFV, like other public post-secondary institutions in British Columbia, submits an annual accountability report to the Ministry of Advanced Education (see http://ufv.ca/irp/data-and-reports/annual-reports/ for examples) that details our institution's goals, outcomes, and finances. As part of UFV, the library is included in that accountability, although the library's role in helping UFV achieve its goals is almost invisible: in the 2016 accountability report, the library is mentioned only three times in relation to the institution's accountability. Given this virtual absence of the library from the institution's accountability narrative, accountability on its own is perhaps not a persuasive argument.

Benchmarking

The Oxford English Dictionary defines benchmarking as "the action or practice of comparing something to a benchmark; evaluation against an established standard." Libraries have a long history of benchmarking themselves against other libraries. In British Columbia, the Council of Post-Secondary Library Directors has conducted an annual survey of library data since 2001, providing comparative data on staffing, budgets, collections, services, and facilities.

Benchmarking data is generally quantitative, focusing on inputs (e.g., number of volumes held; number of student seats available) and outputs (e.g., number of reference questions asked; number of items circulated). Benchmark data can be useful in advocating for more resources from your administration, especially if it involves comparisons with a rival institution: "Did you know that Institution X has a student-to-librarian ratio of Y, and ours is Z? That's a primary reason why we've been unable to do A, B, and C, like Institution X does." It can also be useful in identifying libraries to talk to about service improvements: "In our last LibQUAL+ study, there were a lot of comments about lack of seating in the library; 70% of the other libraries our size have more student seating per square metre than we do. We should ask them how they did it."

Benchmarking data is limited in what it can tell us, however. It is generally limited to a set of metrics around activities that are common to the institutions involved, and therefore limited in scope. It is useful as a comparison tool, but not necessarily helpful when looking at specific outcomes or ideas.
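As a small illustration of the kind of comparison benchmarking supports, the sketch below computes the student-to-librarian ratio mentioned above for a handful of institutions. All the figures are invented; a real comparison would draw on survey data such as the CPSLD annual statistics.

    # Benchmarking sketch: compare a simple input metric (student-to-librarian
    # ratio) across institutions. All figures are invented for illustration.
    institutions = {
        "Our Library":   {"students": 8500,  "librarians": 9},
        "Institution X": {"students": 9000,  "librarians": 14},
        "Institution Y": {"students": 12000, "librarians": 15},
    }

    for name, data in institutions.items():
        ratio = data["students"] / data["librarians"]
        print(f"{name}: {ratio:.0f} students per librarian")

The output makes the advocacy case in one line per institution, which is precisely what benchmarking is good for, and precisely its limit: it says nothing about what those librarians accomplish.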
Professional Values

We can also find a compelling argument for library assessment through examination of our professional values:

   I have been heartened that, of late, we have seen a reawakening and rediscovery of our social purpose in academic libraries, a return to a progressive perspective. We are embracing that the value we create through libraries is not just economic but also—through thoughtfully developed and shared services and collections—inclusion, equity, and social justice; that we are pursuing the creation of value through our values. (Hinchliffe, 2016, p. 12)

Assessment is the means by which we can communicate the value we are creating within our institutions. One example of this is indigenization: the library profession's values of social justice, equity, and inclusion speak directly to the institutional focus on respecting the original peoples on whose lands we live and work, as well as integrating their values and ways of knowing into our curriculum and practice.

Communicating Value

But perhaps the most compelling argument for library assessment is that of communicating a library's value. There are several different ways of exploring a library's value:

- use or utility (an efficiency-based perspective);
- return on investment, or ROI, which looks at value for money;
- commodity production (e.g., quantity of commodity produced x price per unit of commodity);
- library impact; and
- competing alternatives (e.g., getting users to perceive the library as more valuable than competitors, such as Google).

Of these, the first three are considered internal or introspective lenses: they look at the value supplier (e.g., the library) first, and the perceived value (e.g., the perspective of the user) second, while the last two represent an external, or user-centred, focus (Oakleaf, 2010). (A worked sketch of a simple ROI-style calculation appears at the end of this section.)

When we talk about value in academic libraries, we most often talk about impact. Assessing library impact obliges us to ask questions that align directly with institutional outcomes and vision (Education Advisory Board, 2011):

- What do the library's collections, services, and facilities enable students, faculty, administrators, and staff to do?
- What can our library users tell us about how to make improvements to our collections, services, and facilities?
- What is the library's impact on student success, learning, enrolment, retention, and graduation rates?
- Do the library's collections connect to and enhance current course content, readings, and assignments?
- How does or can the library support faculty teaching?
- How does the library contribute to student and faculty experiences, attitudes, and perceptions of quality?
- How does the library contribute to faculty research quality and productivity?
- What is the library's role in generating institutional income, such as faculty grant proposals and funding?

The end result of focusing on value is that the library becomes more focused on the needs of our users; in doing so, we demonstrate that the library's activities are directly linked to the needs and values of the university, making it more likely that the university will recognize the library as a key partner in its core mission: educating students. Imagine being able to tell a powerful story about the many ways that the library impacts its users and the institution. We can't do this with bare facts or dry numbers; we need the richness that comes from our users telling their stories about how the library has made an impact on their lives. This focus on value, then, requires a shift in how we in the library view each of our activities and how we determine the value we bring. In short, it requires a robust library assessment program, guided by a user-centred strategic plan.
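To make the ROI and commodity-production lenses concrete, here is a minimal sketch of a value-for-money calculation of the kind reported in Appendix C. Every quantity and unit value below is invented for illustration, not drawn from UFV data.

    # ROI sketch: value the library "produces" as (quantity of each service used
    # x an assigned unit value), compared against the library's operating budget.
    # All figures are invented; Appendix C shows the Library's actual 2010-11 approach.
    unit_values = {          # assumed dollar value per use (hypothetical)
        "items_circulated": 25.00,     # e.g., approximate cost to buy the item
        "reference_questions": 10.00,  # e.g., value of staff consultation time
        "instruction_attendees": 15.00,
    }
    annual_use = {           # hypothetical annual counts
        "items_circulated": 60000,
        "reference_questions": 8000,
        "instruction_attendees": 5000,
    }
    budget = 1_500_000.00    # hypothetical annual operating budget

    total_value = sum(annual_use[k] * unit_values[k] for k in unit_values)
    roi = total_value / budget
    print(f"Estimated value delivered: ${total_value:,.0f}")
    print(f"Return per dollar invested: ${roi:.2f}")

The arithmetic is trivial; the hard and contestable part is the unit values, which is one reason ROI remains an internal lens rather than a story about impact.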
ASSESSMENT IN ACADEMIC LIBRARIES

Assessment in academic libraries has matured to the point where it has become an area of specialization in many academic libraries. Evidence of this can be found by exploring the attendee rosters for the biennial Library Assessment Conference. Established in 2006, this conference has become one of the largest gatherings of library assessment practitioners in North America.

In 2008, the earliest year for which such data is available, 62 (16.58%) of the 374 conference attendees had position titles referencing assessment; in addition to looking for the word "assessment" in position titles, I included titles making reference to assessment activities, such as user experience, organizational effectiveness, impact, strategy, and planning. In 2016, 192 (30.33%) of the 633 attendees held such position titles, a marked increase in identified assessment positions.

A thematic analysis of papers presented at this conference between 2006 and 2016 (Figure 2) demonstrates how the practice of library assessment has developed over time.

Figure 2. Thematic analysis of papers presented at the Library Assessment Conference

Most attendees at the conference are from academic libraries, and this analysis represents the work of those attendees and their libraries; it seems a useful way to explore the current landscape of assessment in academic libraries. What follows is a brief discussion of each of these themes in the context of academic libraries, with a view toward illuminating assessment activity.

Assessment Climate/Culture

This theme relates to the development and effectiveness of assessment programs and activities, assessment "readiness," and creating a culture of assessment. The focus on library assessment as its own specialization began to appear in the literature around 1996, when Pritchard (1996) published a review of literature on outcomes assessment in libraries. It was followed by a white paper from the ACRL Task Force on Library Outcomes Assessment (1998), which sought to "develop a philosophical framework for assessing libraries in terms of desired campus outcomes [and] develop prototypes for such assessment" (para. 12).

Culture of Assessment

In 2004, Lakos and Phipps introduced the concept of a culture of assessment and examined the necessary ingredients for creating such a culture, which they define as

   an organizational environment in which decisions are based on facts, research, and analysis, and where services are planned and delivered in ways that maximize positive outcomes and impacts for customers and stakeholders. (p. 352)

They go on to describe the organizational prerequisites for such a culture: staff engagement in understanding and responding to user expectations and needs; a performance- and learning-oriented organizational context (specifically mission, values, structures, and systems); and openness, integrity, and trust.

Farkas, Hinchliffe, and Houk (2015) identified factors that facilitated or hindered the development of a culture of assessment. They found that a culture of assessment was facilitated most by: clear expectations for library assessment; the systematic use of assessment data in decision-making; the existence of an assessment plan; the use of assessment data by librarians in service of professional growth; and assessment as a priority for library administration. On the other end of the scale, a culture of assessment was hindered by: a lack of in-house skills to carry out assessment activities; lack of support for staff involved in assessment work; no shared understanding of the purpose of assessment; lack of access to systems or technology to support assessment work; and a library culture that is not user-centred.
These findings are consistent with McAyeal (2014), who identified five mindsets of organizations that have achieved a culture of assessment: assessment as an ongoing, ubiquitous activity throughout the library; development of assessment skills throughout the organization; a willingness to change; alignment with institutional goals; and communication of assessment results beyond the library.

Core Competencies for Assessment

Dole (2013) explored the idea of identifying core competencies for library assessment and planning, and this was followed by Passoneau and Erickson's (2014) analysis of more than 230 job postings over an 18-month period, which identified competencies in three categories:

- management skills (e.g., program and project management; staff development and training; developing partnerships; and gathering, analyzing, communicating, and archiving data);
- individual skills, relating to competence with specific software or tools (both qualitative and quantitative), data analysis, communication, knowledge of methods, and analytical ability; and
- soft skills, such as collaboration, creativity, and innovation.

The Association of College and Research Libraries (ACRL) followed this in 2017 with its own set of proficiencies for assessment librarians and coordinators, which identifies specific skills and abilities in eleven areas: knowledge of assessment; ethics; assessment methods and strategies; research design; data collection and analysis; communication and reporting; advocacy and marketing; collaboration and partnerships; leadership; management; and mentoring, training, and coaching. An earlier statement from the Canadian Association of Research Libraries, or CARL (2010), identified assessment as a core competency for all librarians at CARL institutions.

Collections

This theme refers to the development, maintenance, and use of library collections (both print and digital), including special collections, archives, institutional repositories, and digital collections. My colleague explored collection evaluation and assessment in her sabbatical leave report (Wilson, 2014), and it serves as a good overview of collections assessment in academic libraries.

Learning

This theme focuses primarily on student learning and comprises a wide range of assessment activities, from in-class quizzes and micro-assessments to rubrics for evaluating student work to standardized tests of information literacy.

ACRL Standards

In 2000, ACRL adopted standards for information literacy designed to offer a comprehensive set of learning outcomes that could then be used to measure student learning from an information literacy perspective. Walsh (2009) explored the state of information literacy assessment through case studies in the literature, identifying nine different assessment tools in use (multiple-choice questionnaire; analysis of bibliographies; quiz or test; portfolio; essay; observation; simulation; and final grades); the most popular of these, employed in more than one-third of the studies, was the multiple-choice questionnaire.
Oakleaf (2009) proposed an information literacy instruction assessment cycle (ILIAC) that promotes an iterative assessment process and recognizes that much of a library's information literacy activity is predicated on "one-shot" instruction, where the librarian meets with students in a class once, then might never see them again. The ILIAC model incorporates the impact of the librarian instructor, emphasizing assessment of learning as a means to improve his or her effectiveness.

Institutional Outcomes

In 2002, the Middle States Commission on Higher Education, an accrediting agency in the United States, took a leading role in redefining accreditation standards to focus on student learning outcomes (Middle States, 2006). It was not the first to identify information literacy as an outcome, but it was the first to require institutions to provide evidence, ensuring that both faculty and librarians would now have to consider the broader, institutional context of information literacy and work together to ensure that assessment was taking place (Thompson, 2002).

This requirement to provide evidence of student attainment of information literacy parallels the development of the standardized test of information literacy. Kent State University in Ohio was one of the first to explore this: in 2001, it began development of a robust, rigorous assessment called SAILS (Standardized Assessment of Information Literacy Skills), based on the ACRL standards (O'Connor, Radcliff, & Gedeon, 2002). There was intense interest from the library community, and in 2006, SAILS went international when it was released for administration at institutions in the United States and Canada. Others followed, such as the iSkills Assessment from Educational Testing Services (https://www.ets.org/iskills/about, now discontinued) and the Information Literacy Test from James Madison University (http://www.madisonassessment.com/assessment-testing/information-literacy-test).

ACRL Framework

In 2016, ACRL adopted a new framework for information literacy, which no longer offers a prescriptive set of learning outcomes but rather emphasizes knowledge practices and approaches; librarians are left to develop learning outcomes on their own. Hovious (2015) proposed a way to address this by mapping the new framework to the old standards, providing a bridge between the two. Oakleaf (2014) advocates a backward design approach to working with the new framework. Backward design, as defined by Wiggins and McTighe (2005), involves first identifying the results of learning (the outcomes), then considering the evidence that will demonstrate it (the assessment), and, finally, identifying the strategies that will get the student there (the learning activities). This harkens back to the function of the original standards, but looks for different artifacts of learning: rather than multiple-choice or scenario-based tests, Oakleaf looks to performance assessments, such as reflective writing, self or peer evaluations, research drafts, portfolios, presentations, and bibliographies, as more appropriate for assessment against the ACRL framework.

Rubrics

Rubrics as a tool for assessing information literacy have received a great deal of attention in the literature, and they can be designed as performance assessments. The VALUE rubrics include a rubric specifically for information literacy (AAC&U, 2009) that "articulates fundamental criteria for [information literacy], with performance descriptors demonstrating progressively more sophisticated levels of attainment" (para. 1). Adaptation of the rubrics for an individual institutional context is encouraged. Turbow and Evener (2016) describe the process of adapting and norming this rubric at their institution.
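Norming a rubric is largely a matter of getting raters to score consistently. As a minimal sketch of how that consistency might be checked, the code below computes simple percent agreement between two hypothetical raters scoring the same student papers on a four-level rubric; the scores are invented.

    # Rubric norming sketch: percent agreement between two raters who scored the
    # same papers on a 4-level rubric (1 = beginning ... 4 = accomplished).
    # Scores are invented for illustration; real norming sessions often also use
    # chance-corrected statistics such as Cohen's kappa.
    rater_a = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
    rater_b = [3, 2, 3, 3, 1, 2, 4, 4, 2, 3]

    exact = sum(a == b for a, b in zip(rater_a, rater_b))
    within_one = sum(abs(a - b) <= 1 for a, b in zip(rater_a, rater_b))

    print(f"Exact agreement: {exact / len(rater_a):.0%}")          # identical scores
    print(f"Adjacent agreement: {within_one / len(rater_a):.0%}")  # within one level

With these invented scores, the raters agree exactly 80% of the time and within one level 100% of the time, the sort of result a norming session would aim for before scoring in earnest.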
Other examples of the development and implementation of rubrics abound. Gola, Ke, Creelman, and Vaillancourt (2014) describe the development and implementation of a rubric to assess papers submitted by graduating seniors. Daniels (2009) describes the development of a rubric to assess a specific information literacy outcome (students' ability to critically examine the credibility of information sources in a first-year writing course). Jastram, Leebaw, and Tompkins (2014) explain how a rubric to assess information literacy has had an impact on how information literacy is seen across the institution. And Willson and Angell (2017) use a rubric to assess information literacy skills and nursing professional standards.

Methods

Library assessment practitioners employ a wide range of methodologies and approaches. Hernon, Dugan, and Nitecki (2011) have compiled a list of many of the methodologies commonly employed in library assessment, including:

Quantitative: citation analysis; mystery shopping; observation; standardized tests; surveys; sweeping study; transaction log analysis; usability testing

Qualitative: case study; ethnographic observation; interviews; mystery shopping; observation; surveys; sweeping study; usability testing; verbal protocols

Mixed methods: content analysis

Foster and Gibbons' (2007) ground-breaking study of how students at one campus actually use the library's services and space expanded the range of methods employed within the library assessment community. In addition to interviews and surveys, they employed ethnographic methods such as design workshops (of both library spaces and the web site), photo surveys, and mapping diaries. The ERIAL project (http://www.erialproject.org/) also employed ethnographic methods to explore student culture at five institutions. They developed a toolkit of methods that includes photo diaries, mapping diaries, research journals, participant observation, cognitive mapping, retrospective research interviews, and focus groups.

Savage, Potrowski, and Massengale (2017) interviewed librarians at five different institutions of higher education to learn about methods used in library assessment of information literacy, user experience, and connecting to the larger community.

Organizational Performance

Organizational performance assessment explores topics such as climate, culture, and organizational or operational effectiveness. One tool in use by many libraries to measure organizational effectiveness is the Balanced Scorecard (http://www.balancedscorecard.org/). The Balanced Scorecard is used by businesses, governments, and non-profit organizations around the world and measures performance from four perspectives: financial, stakeholder (e.g., value, satisfaction), internal process (e.g., efficiency, quality), and organizational capacity (e.g., human capital, culture, infrastructure). In 2009, the Association of Research Libraries initiated a pilot project (http://www.arl.org/focusareas/statistics-assessment/balanced-scorecard) to assist member libraries in implementing the Balanced Scorecard. Taylor and Heath (2012) attribute the successful transformation of their library system to its adoption of the Balanced Scorecard, which promotes a process of continuous planning and assessment. A survey of libraries using the Balanced Scorecard (de la Mano & Creaser, 2016) revealed that while respondents found the implementation challenging, most of the libraries felt they had benefited from using it, citing positive outcomes related to strategy, operations, reporting and advocacy, and staff culture.
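To illustrate how a Balanced Scorecard turns the four perspectives into something measurable, here is a minimal sketch of a scorecard as a data structure, with each measure tracked against a target. The measures, targets, and actuals are invented, not drawn from any library's actual scorecard.

    # Balanced Scorecard sketch: each perspective holds measures with a target and
    # an actual value; status flags measures that fall short. All values invented.
    scorecard = {
        "Financial": [("Cost per circulation ($)", 4.00, 4.75)],
        "Stakeholder": [("LibQUAL+ overall satisfaction (1-9)", 7.0, 7.3)],
        "Internal process": [("Avg. days to shelve returned items", 2.0, 1.5)],
        "Organizational capacity": [("Staff training hours per FTE", 20, 14)],
    }

    # For simplicity, assume lower is better for cost and shelving time.
    lower_is_better = {"Cost per circulation ($)", "Avg. days to shelve returned items"}

    for perspective, measures in scorecard.items():
        for name, target, actual in measures:
            met = actual <= target if name in lower_is_better else actual >= target
            status = "on target" if met else "needs attention"
            print(f"{perspective}: {name}: target {target}, actual {actual} ({status})")

The value of the scorecard is less in this mechanical check than in forcing each perspective to name a measure, a target, and an owner in the first place.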
MacDonald (2013) defines organizational climate as the policies, practices, and rewards that affect individuals within the organization. One instrument that has been developed to measure organizational climate in libraries is ClimateQUAL (https://www.climatequal.org/). Phipps, Franklin, and Sharma (2013) describe how one library used ClimateQUAL to identify areas that were problematic, then followed up with focus group interviews to learn more about each of these areas and identify potential solutions. Changes were indicated for five areas: leadership and decision-making; performance management; hiring, merit, and promotions; communications; and learning, training, and innovation.

Shepstone and Currie (2008) undertook a case study of organizational culture in order to systematically manage a shift from the current culture to one that was deemed preferred. They were able to identify specific competencies needed to make the shift and to arrive at a proposed action plan.

Services

This theme comprises most of the programs and activities that enable a library's users to make use of the library. One of the most prevalent instruments for measuring quality of service in libraries is LibQUAL+ (http://www.libqual.org/), which measures three dimensions of user satisfaction: customer service ("Affect of Service"), access to collections ("Information Control"), and facilities ("Library as Place"). Numerous studies have been published on assessment using LibQUAL+, and the UFV Library has administered the survey five times since 2005.
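LibQUAL+ asks respondents to rate each item three ways: the minimum service level they would accept, the level they desire, and the level they perceive. The span between minimum and desired is the "zone of tolerance" referenced in Figure 21. A minimal sketch of the two standard gap scores, using invented ratings, follows.

    # LibQUAL+ gap scores for a single survey item (all ratings on a 1-9 scale).
    # Ratings below are invented for illustration.
    minimum, desired, perceived = 5.8, 7.9, 6.4

    adequacy_gap = perceived - minimum       # positive: service exceeds the minimum
    superiority_gap = perceived - desired    # usually negative; positive means
                                             # perceptions exceed desires

    in_zone = minimum <= perceived <= desired  # within the zone of tolerance?
    print(f"Adequacy gap: {adequacy_gap:+.1f}")
    print(f"Superiority gap: {superiority_gap:+.1f}")
    print(f"Within zone of tolerance: {in_zone}")

With these invented ratings, perceived service sits comfortably above the minimum (+0.6) but well short of desired (-1.5), which is the typical pattern a zone of tolerance chart makes visible at a glance.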
Student data about library use was collected at several service points (online tutorials, study room bookings, library instruction classes, and research consultations) and matched with the students’ academic and demographic data. The data revealed that the majority of students interacted with the library at only one service point. This, combined with findings from a related survey, led them to conclude that most students were unaware of the breadth of library services available.

Students can be a difficult audience to reach, and distance students are even harder. Albert (2017) surveyed distance learning faculty about their use of library resources and services in their courses, and their expectations for student use of those resources and services. She also asked faculty about their perceptions of students’ research abilities. Almost half the respondents were unaware of library services that support distance learning, and almost 40% had not considered using the library for support of their courses. More than half the respondents did not require students to use library resources. The author used data from this survey to develop unique, personalized messages to promote the library’s services and resources to distance students and faculty.

Library services are often most closely associated with public services. Chang-FitzGibbon and Wang (2017), however, advocate for the involvement of technical services librarians in library-wide assessment activities; they note that involving technical services professionals in assessment activities can lead to a greater user-centred focus, collaboration across “boundaries,” and innovation by encouraging individuals to take on new roles.

Marquez and Downey (2015) reported on a project using a service design approach to enable them to see the library and its services through their students’ eyes. They found that students are creatures of habit, but that those habits were tied to specific activities (such as seeking a comfortable chair for reading or a hard-backed chair for writing). They also discovered that they needed to focus more on wayfinding in the library – the lack of signage was confusing for newer library users. From a process perspective, staff reported feeling more open-minded toward their users, more curious, and more inclined to ask questions.

Space

This thematic area refers to the configuration and amenities of physical spaces within the library. Nitecki (2011) describes library space assessment as fulfilling one of two purposes: as a precursor to the design of a library space, or post-occupancy, to determine whether the goals for the new or renovated space have been met. She identifies three main roles of library spaces: accumulator (e.g., of books and equipment), service provider, and facilitator, or “the nurturing of relationships that foster self-directed learning and the creation of new knowledge” (p. 31).

Montgomery (2014) explores libraries as informal or social learning spaces. Her library assessed changes in student behaviours prior to and following a renovation: observation, focus groups, and a survey were used to determine user needs for the existing space, and the survey was repeated after the renovation. Students felt the space met their social learning needs, but wanted more comfortable furniture and increased food choices and availability. They also expressed a need for more electrical outlets.
Citing changes in student needs and the use of library spaces and services, as well as administrative pressures, Andrews, Wright, and Raskin (2016) describe a multi-phase, multi-year project to create flexible, technology-rich spaces throughout their library. Over the course of the nine-year project, they employed a variety of methods, including surveys, observation, ergonomic evaluations of furniture, interviews, usability tests, photo diaries, ideal-space design exercises, and focus groups.

Warren and Epp (2016) employed kindness audits to explore wayfinding, space usability, and signage in their library. A kindness audit evaluates the “positivity, usability, and welcoming attitude of a space” (p. 2). An abundance of homemade signs, many in disrepair, led to the development of guidelines for signage and staff training on aesthetics and readability. Acrylic sign holders were installed throughout the library in places where there was permanent signage. The kindness audit also revealed that the number of electrical outlets was insufficient and identified several with dangerous conditions, such as exposed wires. It also pointed to the need for improved wayfinding.

Perhaps one of the most exciting innovations in library space assessment is the development and availability of Suma (https://www.lib.ncsu.edu/projects/suma), free software developed by North Carolina State University (NCSU) Libraries that enables libraries to collect rich data on space usage using a tablet computer. It has enabled libraries to record observational data about where students sit, the technology they use, the activities they engage in, and more. NCSU Libraries are also behind the development of the Learning Space Toolkit (https://learningspacetoolkit.org/), a set of tools for designing and sustaining technology-rich informal learning spaces.

User Experience/Usability

User experience (UX) refers to the practical, experiential, affective, meaningful, and valuable aspects of a product or service. Usability refers to the ease of use of a particular system or tool. Schmidt (2015) argues that “all librarians are UX librarians” (p. 21) because every decision made has an impact on how users experience the library. He suggests a three-step action plan for getting started with user experience assessment. One method he discusses is the service safari, where library staff take a field trip to another service environment (e.g., café, store, museum, other library) and record their experiences, with the aim of identifying the positive and negative aspects of the service experience (Schmidt, 2012).

Boyce (2015) was concerned about decreasing use of the library’s reference desk. Data were available on when and why library users approached the two service desks on the entry level, but no assessment of user satisfaction had been done. They used a secret (or mystery) shopper methodology to engage in unobtrusive observation of the service desks, which were staffed by students and staff. Shoppers would ask a pre-selected question at the desk, observe the experience, then report on the experience and their satisfaction using an online survey. Data revealed that students could use additional training on when to provide referrals, but that customer service was generally good.
Tobias and Blair (2015) used web analytics and transaction log and transcript analysis to understand and improve the user experience of distance learners, who are often invisible to the library but who, because of their largely digital interactions, leave rich traces in their use of the library’s services and collections. Fagan et al. (2012) employed two usability tools – screen capture software and a usability testing environment to present a series of web-based tasks – to measure the usability of their discovery service. A pre-test to determine participants’ research habits and a follow-up questionnaire gauged the participants’ impressions of the system’s usability. In spite of technical issues with the usability testing environment, the investigators were able to identify several interface “quick fixes,” as well as questions for further study.

Value

This thematic area looks at the ways in which libraries assess their value, as described earlier.

The LibValue project (http://www.libvalue.org/home) was a three-year study to test and explore methods to measure academic library impact and return on investment (ROI), as well as to develop a toolkit for academic libraries to assess value. It explored six areas for assessment: teaching and learning; scholarly reading; comprehensive value; the commons environment; digitized special collections; and e-books. The ACRL Value of Academic Libraries project (http://www.acrl.ala.org/value/) offers resources and examples of “assessment in action.” Oakleaf’s (2010) review of the literature on academic library value brought discussions of the topic into sharp focus. In her report, she also lays out an action plan and research agenda for assessing academic library value.

The MINES (Measuring the Impact of Networked Electronic Services) for Libraries online survey (http://www.minesforlibraries.org/home) assesses usage and impact of the library’s electronic resources on teaching, learning, and research. MINES for Libraries serves as a complement to COUNTER data, which reports measures such as full-text downloads, searches, and views of journals and e-books (Lewellen & Plum, 2016). The survey “intercepts” every nth request for the library’s electronic resources and administers a brief survey of 3-5 questions.

ENVIRONMENTAL SCAN

The UFV Library does not exist in a vacuum. It is accountable to the university it serves, and it is also part of a larger academic library community. For this reason, it makes sense to examine the larger context in which the library is situated. A common tool for accomplishing this is the environmental scan, which identifies “information about events, trends, and relationships in an organization's external environment” and can help an organization plan a course of action (Choo, 2001, para. 1). Since this project is about charting a course for library assessment in the UFV Library, an environmental scan seems a natural activity.

The Institutional Context

A strong case for library assessment must involve a consideration of the institutional context, because “libraries cannot demonstrate institutional value to maximum effect until they define outcomes of institutional relevance and then measure the degree to which they attain them” (Kaufman & Watstein, 2008, p. 227).
UFV Strategic Plan

In 2010, UFV launched its strategic plan, Changing Lives, Building Community (http://www.ufv.ca/irp/ufv-planning/), in which the university adopted three strategic directions: 1) provide the best undergraduate education in Canada; 2) be a leader of social, cultural, economic, and environmentally-responsible development in the Fraser Valley; and 3) be innovative, entrepreneurial, and accountable in achieving our goals. Out of this plan, a number of “foundation” plans and strategic initiatives have emerged which have implications for library operations:

- Indigenizing Our Academy: Strategic Planning Indigenous Post-Secondary Education at UFV8
- Institutional Learning Outcomes
- Learning Everywhere: The UFV Education Plan, 2016-20
- Strategic Enrolment Management Plan 2014-2019
- Strategic Research Plan 2016-2020
- UFV India Global Education Strategic Plan 2016-2021
- UFV Online 2017

8 This plan was developed in 2007 and pre-dates the strategic plan, but it continues to have relevance to the strategic context, especially in light of the Calls to Action from the Truth & Reconciliation Commission. In response, the Canadian Federation of Library Associations has published a report and recommendations (http://bit.ly/2pzEqrB).

The education plan, Learning Everywhere, is an outcome of an intensive consultation and visioning process called UFV 2025 (https://blogs.ufv.ca/ufv2025/); the plan outlines five core goals for the transformation of learning at UFV:

- Prioritize learning everywhere
- Commit to flexibility and responsiveness
- Collaborate across boundaries
- Develop local and global citizenship
- Integrate experiential learning

As the primary guiding document for academic units, the plan carries an expectation that academic units will operationalize these goals within their particular contexts. The library has identified twelve outcomes and 25 strategies to meet these five goals; these are enumerated in Appendix A: Library Strategies for the Education Plan.

The Institutional Learning Outcomes identify nine core learning outcomes for UFV graduates. All academic programs are mapped to these outcomes, the first of which is identified as information competency – an outcome that aligns very closely with the library’s core mission, and indeed is one of the areas where we invest a large percentage of our resources. It is not, however, an area where we have focused our energies in terms of assessment. The re-designed Bachelor of Arts (BA 2.0) places an emphasis on information competency, particularly through its writing and quantitative literacy foundations. Additionally, there is a new portfolio requirement in each year of the new BA, where students consider their progress in terms of the core foundations and of the institutional learning outcomes. It seems that there is room for librarians to have a role in assessing student information competency.

UFV has also identified five strategic themes:

- Community, Justice, and Cultural Engagement
- Environment and Sustainable Development
- Human Development, Health and Well Being
- Teaching, Learning, and Cognition
- Technology, Modelling and Applications

These represent areas of existing and/or potential strengths within the institution, particularly in relation to enrolment management and research.
From a library perspective, it would be interesting to undertake an assessment of how closely our operations and resource allocation align with these strategic themes.

In addition to the strategic themes discussed above, the Strategic Enrolment Management Plan provides significant context for the library. Targeted population increases of Indigenous students, international students, and transfer students have implications for library services. Even as we discuss ways to reduce the resources being directed toward first-year instruction by developing online tutorials, the anticipated increase in these vulnerable populations would indicate that our desires might be at odds with the needs of our learners. At the same time, the emphasis on flexible learning opportunities points toward the need for more online and self-serve library services and resources. It would seem that this is one situation where a conversation with our stakeholders is necessary.

Another area of focus in the Strategic Enrolment Management Plan is retention, also referred to as persistence toward graduation. Numerous studies (e.g., Mezick, 2007; Emmons & Wilkinson, 2011; Haddow, 2013; Soria, Fransen & Nackerud, 2013; Crawford, 2015; Murray, 2015; Oliveira, 2017) demonstrate a positive correlation between student retention and library use (e.g., attendance in a library instruction class; use of spaces that promote social interaction and learning; and/or use of the library’s collections). This would suggest that retention, particularly its relationship to the library’s services, facilities, and collections, should also be a focus of the library and its assessment efforts.

On March 17, I met with Vlad Dvoracek, Associate Vice-President, Institutional Research and Integrated Planning (IRP), to discuss the ability to connect library use data with institutional data (e.g., student achievement, student retention). He noted that much of the data is available through the Business Intelligence Dashboard, which is available to all senior administrators, including the University Librarian9. He expressed a willingness to work with us and to make the data available, although he did point out that human resources are limited. I raised the issue of privacy, since we would be seeking to match identifiable data (both library data and student data) on ID numbers. IRP would have no concerns as long as, once we had the matched data, we modified or deleted the ID numbers so that the data could no longer be matched to an identifiable individual. I was referred to Lisa White for specific questions around these privacy issues.10 It was also suggested that we request ethics approval for any projects involving the harvesting of student data.

9 In a subsequent discussion with the University Librarian, she was unaware of this. However, it is one potential avenue for obtaining the information we need. Further exploration is required.

10 I have since had a meeting with Lisa, and we had a general discussion about privacy concerns. She encouraged me to contact her whenever I plan to work with data linked to student IDs.

Partnerships

Within UFV, there are a number of academic and non-academic units offering services to students that dovetail with the library’s services. Two in particular are the Academic Success Centre and Counselling Services. It would be worth exploring whether there are opportunities to partner with one or both of these units on assessment projects.

Academic Success Centre and Counselling Services

The Academic Success Centre offers peer tutoring, workshops, and other programs focused on helping students develop their academic skills, with a focus on personal learning strategies and writing.
Counselling Services also offers workshops on study skills (e.g., reading strategies, time management, presentation skills, preparing for exams), as well as one-on-one counselling in these areas.

Are there opportunities to explore questions of the impact and value of each of our services on students who use one or more of them? For example, does “doubling up” on services – peer tutoring, attending workshops, using the library’s collections – improve student success or retention?

Information Technology Services (ITS)

Another potential partner is Information Technology Services. The library is a heavily used site for campus technology, including computers, printing, and photocopying, yet we have very little data about this. We can ask for data about printing in the library (e.g., how many users, how many pages, at which print station), but we have no such data about computer use. In a meeting on March 30 with Darin Lee (Chief Information Officer), Bryan Daniel (Manager, IT Infrastructure), and Bryan Wilkinson (Manager, IT Support Services), I discussed the potential for getting access to more data about technology use in the library. There are a number of questions we have about technology use that could help us with scheduling, services, and more. Here are several examples of questions we might want to ask.

Do we have enough computers in the library to meet student needs? And do we have the right ratio of open access computers and computers requiring a login? These questions would be greatly informed by knowing more about individual sessions: How many individual computer sessions are there in a day? A week? A month? What is the average length of a session on our computers?

Do we offer the right mix of software on our computers? Should we have, for example, statistical analysis or graphic production software available on computers outside the lab? Currently we have three groups of desktop computers in each campus library:

- computers for web browsing (no login required);
- computers offering web browsing plus the Microsoft Office suite (login required); and
- a drop-in lab with computers (login required) offering the Student Lab Software Set (http://www.ufv.ca/its/student-tech-guide/labs/software/).

All computers offer printing. Knowing more about what students are doing on these computers (from a software use standpoint) could help us better understand their software needs: How much and how often are students using the installed software on our computers? And what is the ratio between use of this software and activities such as web browsing?

When is the best time to have a lab monitor available? Currently, ITS provides a lab monitor for 16-20 hours a week during the fall and winter semesters. The schedule is based largely on a guess about when our busiest times are likely to be.
But if we knew, for example, how many individual computer sessions there are in a week (on average), or the variations throughout the semester, as well as when we experience the greatest density of computer users, we could better estimate the best times for a lab monitor to be on site – and this might even be in the evening or on weekends, when we don’t currently have coverage.

Do we need different configurations or software mixes in each campus library? We know from our space use study in 2014-15 that there are differences in how each of our campus libraries is used. It would be logical to assume that, given the differences in programs offered on each campus, there are different patterns of use for our computers as well. Having data on usage would help illuminate what the differing needs might be.

ITS noted that they have been exploring options for installing lab management software; our needs would seem to support the business case they are developing. They were generally supportive of the needs that I expressed and invited the library to develop a list of requirements (e.g., metrics) so they can figure out what they need to do in order to support our request.

Site Visits

I chose UBC Okanagan and Thompson Rivers University as the sites to visit. Each of them has a librarian with the word “assessment” in the job title, and they have a history similar to that of UFV. I had previously made contact with the assessment librarians in preparing a proposal for the BC Library Conference.11 In preparation for my visits, I emailed a list of interview questions (see Appendix B: Site Visit Interview Questions) to my contact at each institution. I encouraged each of them to invite others at their library to participate in the visit. Each visit was scheduled for three hours.

11 There were also practical reasons for choosing these sites; I would have accommodations and be able to visit family.

UBC Okanagan

Participants:
- Laura Thorne – Communications, Marketing and Assessment Librarian
- Robert Janke – Associate Chief Librarian

Date of visit: April 4, 2017

#morelibrary was fantastic; it’s the spark that made me want to become the person in charge of assessment. The campaign ran in the fall of 2013, and I became the assessment librarian in May 2014. I really enjoyed combing through the data, trying to figure out how to understand what the students wanted. (Laura Thorne)

General Context

UBC Okanagan is part of the UBC system, with the main campus in Vancouver (Point Grey). The Okanagan campus has a “dotted line” connection to the Vancouver campus: it has its own administrative structure and Senate, but it benefits greatly from resources – software, expertise, and more – available from the primary campus in Vancouver. The UBC Okanagan Library has a similar relationship to the “main” UBC library. It has its own Chief Librarian and its own strategic plan (http://library.ok.ubc.ca/about-us/strategic-plan/), although the plan is guided by strategic plans from the UBC (Vancouver) Library and its immediate parent institution, UBC Okanagan. UBC Okanagan’s Chief Librarian has been working to introduce agile planning12 into the strategic planning process.

12 The concept of agile planning comes from software development and relates to rapid prototyping and development of software. The application of agile planning to strategic planning has taken hold in the business world, in order to “accelerate business model, product, or service innovation” (Leberecht, 2016, para. 2).

UBC Okanagan Library has a separate assessment plan that is linked to the library’s strategic plan. Planning for assessment is challenging, because so much of it is not strategic. This plan is not publicly available.
Assessment Infrastructure

The position of assessment librarian was created in 2014 as the result of a review of the librarians’ portfolios. Focus on the areas of assessment, communications, and marketing was identified as a need, and a position combining the three functions was created – the communication and marketing functions were seen as being tightly connected to assessment. The assessment librarian also has liaison responsibilities and, since September 2016, has been the acting Innovation Librarian (two days per week). The Innovation Library is located in downtown Kelowna and provides resources to the community, in addition to serving as a “base camp” for faculty and staff working with the local community.

This position receives support and services from the Assessment Office in the UBC Library on the Vancouver campus. The Assessment Librarian there travels to the Okanagan campus several times a year to conduct focus groups and collect other qualitative data. Large assessment projects are supported by a team; the chair of the team is not necessarily the assessment librarian, but the assessment librarian is usually part of the team. There is a UBC-wide advisory group for assessment; the Assessment Librarian from the Vancouver campus is the chair, and participants from UBC Okanagan include the assessment librarian and the collections librarian. The group uses technology to facilitate its monthly meetings, since it is dispersed geographically.

In terms of data management, they admit that they are not very good at it – data is spread out all over the place, and there is no centralized mechanism for locating it. They tried to do a data audit in 2014, but gave up because it quickly became unruly, especially in relation to collections data. Data is stored on shared network drives and personal drives, in a variety of different formats. Additionally, there is a need to consider data security. For example, they track use of their Inclusive Technology Lab, but because of sensitivities around who uses the lab, there is a need to restrict access to the data. They need to develop policies around access to and storage of restricted data.

The UBC Library has adopted Tableau Server, and this has helped them take a “giant leap forward.” Data that was previously siloed or closed off is now available to anyone who needs it. There is a mechanism in place for requesting the addition of datasets and the development of dashboards – the Library has designated a Tableau “team” with the technical expertise to connect to a variety of datasets and to develop the necessary APIs. About 40% of new requests for data come from UBC Okanagan. Tableau dashboards are available on campus only. Some of the major datasets in Tableau are ILL, DeskTracker (reference statistics), COUNTER data (e-journal usage), laptop checkouts, and room bookings. They are also using Tableau to explore data from a scenario perspective (e.g., to find out when it’s likely that 80% of the laptops would be checked out).
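The same kind of scenario question can also be asked directly of raw circulation data. The following is a minimal sketch, not UBC’s actual implementation; the loan log, timestamps, and fleet size are invented for illustration:

import pandas as pd

# Hypothetical laptop loan log: one row per loan, with checkout and return times.
loans = pd.DataFrame({
    "checked_out": pd.to_datetime(["2017-03-06 09:15", "2017-03-06 09:40",
                                   "2017-03-06 10:05", "2017-03-06 10:20"]),
    "returned":    pd.to_datetime(["2017-03-06 11:30", "2017-03-06 13:30",
                                   "2017-03-06 12:45", "2017-03-06 11:10"]),
})
FLEET_SIZE = 4  # invented number of laptops available for loan

# Count the loans that span each hour of the service day, then flag the
# hours when 80% or more of the fleet is checked out.
hours = pd.date_range("2017-03-06 08:00", "2017-03-06 17:00", freq="H")
out = [((loans["checked_out"] <= h) & (loans["returned"] > h)).sum() for h in hours]
usage = pd.DataFrame({"hour": hours, "laptops_out": out})
usage["utilization"] = usage["laptops_out"] / FLEET_SIZE
print(usage[usage["utilization"] >= 0.8])  # hours at or above the 80% threshold

Run over a semester of real loan records, the same calculation would show when high-demand periods recur; a dashboard such as Tableau simply presents this kind of result interactively.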
UBC provides a number of institutional resources that support assessment efforts in the UBC Okanagan library. There is an institutional survey tool, and a site license for NVivo (for qualitative data analysis). The institutional research office (OPAIR) provides expertise and support for campus-wide assessment planning and reporting – they’re busy, but “amazingly helpful.” Additionally, UBC provides communication support for getting the word out about library projects and initiatives. They have also received financial support from the development office.

A Culture of Assessment?

There is a sense that in the three years since the introduction of a dedicated assessment position, the UBC Okanagan Library has, if not completely achieved a culture of assessment, at least moved toward one. There is a strong core of evidence-based decision making within the library, encouraged in part by the Associate Chief Librarian, who in his previous role as Collections Librarian would provide usage data in advance to aid in collegial decisions on large purchases, and by a Chief Librarian who comes from the library vendor world and brings a data-driven perspective to the library. In their weekly meetings, it is becoming common for librarians to think about how they will assess a given project or activity, or collect the data needed to make an informed decision, and the library is moving toward an evidence-based learning model13.

13 Evidence-based learning (EBL) refers to the “approaches, processes, and strategies that have been empirically demonstrated to produce learning outcomes” (Cranney & McDonald, 2013, p. 1185). A model of EBL would be derived from those research processes that are characteristic of scientific approaches, including comparison-group methodologies.

At the time of writing, the assessment librarian position is vacant, and the posted position may not include assessment as part of the portfolio. There is a sense that there is no longer a need for a centralized coordinating role. Most departments are now responsible for carrying out their own assessment, with the assessment librarian simply offering support, so assessment activities have become decentralized. The Tableau server makes it much easier to find the needed data. Large assessment projects are managed by a team or committee. The UBC Library in Vancouver has an Assessment Office with a dedicated assessment librarian that could continue providing the leadership and overall coordination for assessment; they’ve established a good working relationship with the Assessment Office, and expect that to continue. However, they also acknowledge that this centralized role has served an important function in ensuring that the data speaks for itself, but always in the context of a story. It offers a kind of neutrality that could be lost.

Assessment Activities

Each year, the library undertakes a major assessment project. They are just wrapping up a review of Access Services; the previous year they completed a Public Services review, and the year before that, the #morelibrary campaign and library expansion project (http://library.ok.ubc.ca/about-us/morelibrary/). The next large assessment will be structured around instruction, and they will likely partner or consult with the Centre for Teaching and Learning. The assessment plan provides a schedule for regular or ongoing data collection, as well as reporting requirements.
Instruction assessment has been rather chaotic – there has been a lot of turnover in the portfolio, but in March they hired a dedicated librarian, and they expect that assessment will become a more regular part of this portfolio. They have been conducting space assessments to inform decisions about the new library building that is under construction. They use Google Analytics to collect data on their web site, and have licensed Crazy Egg (https://www.crazyegg.com/) to help them visualize user activity on the site. This data informed a redesign of their web site in 2014 and a reorganization of the site in 2015. Collections assessment is a regular and ongoing activity; much of the data they need is available from the Tableau server. They are currently investigating whether to close the library building earlier (currently it’s open until midnight). And they recently completed a two-week observational study of how students are using the library in order to inform decisions about the new library building.

The UBC Library has administered the LibQUAL+ survey every three years since 2007 (the last time was in 2016), and the UBC Okanagan Library has been part of it. With the 2016 results, they translated the scores into grades, to make them more understandable. They’re not sure they’ll participate in LibQUAL+ again, however; they feel the instrument is problematic.

While not directly related to library assessment, UBC Okanagan administers surveys that may inform library decisions:

- National Survey of Student Engagement (annual). It is unclear whether UBC has included the optional information literacy questions.
- Workplace Experience Survey (Ipsos-Reid, every three years). The survey includes a question about the library as an institutional resource.
- UNIFORUM (annually, initially for three years). This survey has two components: 1) employee activity tracking; and 2) internal satisfaction. The library receives a report comparing their results to other, generally larger institutions.

They use a variety of methods for collecting data. In addition to tools like DeskTracker, Google Analytics, and Crazy Egg, they employ non-technological methods. Unattended white boards, where they post a specific question, have been effective at gathering user feedback.

As part of the UBC library system, the Okanagan library reports out to the administration in mid-April for the Vancouver campus’ report to Senate. The UBC Okanagan Senate has a different cycle, so they report out to their campus Senate in December. The UBC Okanagan Library generates a variety of internal reports that are forwarded to the library. They include library data in their communications to the UBC community whenever possible. Data is used extensively for internal decision-making, but it is also used for political purposes – for example, to justify decisions to external stakeholders when they question whether a particular activity is something the library should be doing.

Thompson Rivers University

Participants:
- Amy Paterson, Electronic Resources and Assessment Librarian
- Penny Haggerty, Collection Services Librarian
- Elizabeth Rennie, Library Instruction and Outreach Librarian and Librarians' Department Chair
- Peggy Lunn, Library Manager

Date of visit: April 6, 2017

General Context

Thompson Rivers University has a shared history with UFV, with its roots in the network of B.C. community colleges.
Like UFV, TRU is a multi-campus institution, with two main campuses (Kamloops and Williams Lake) and six regional centres. TRU is also the home of BC’s former Open Learning Agency, so there is a strong distance learning component to their operations. The library has an unusual staffing model for academic libraries in BC, in that in addition to librarians and library technicians, it also has a component of library clerks who perform many of the lower-skilled tasks at service points. During the interviews, it became apparent that the imminent layoff of the library clerks, whose positions were being eliminated and replaced with library technician positions, was a source of great tension and concern among library employees.

Assessment Infrastructure

The position of Electronic Resources and Assessment Librarian was created in 2017; the incumbent joined the library in July 2017, in her first position with assessment responsibilities. There is an ongoing tension between shepherding formal projects (with about 20% of the time focused on electronic resources) and dealing with more emergent and immediate issues around electronic resources (such as the need to troubleshoot and fix access problems).

They are in the process of implementing CORAL ERM (http://coral-erm.org/), an electronic resources management portal that, as one of its modules, provides detailed usage reports. They are treating this as an assessment project, investigating questions such as how it will impact workflows (e.g., the acquisition of electronic resources, negotiating licenses). They have a subscription to LibWizard from Springshare, which they use to administer surveys. They rely on LibWizard for field and cross-tabular analysis of survey data, but also download the data to Excel or SPSS for further analysis.

The library is also participating in COPPUL’s SPAN project (http://www.coppul.ca/programs/shared-print), which provides them with access to OCLC’s GreenGlass14 (http://www.oclc.org/en/sustainable-collections.html), a collection analysis and visualization platform. It gives TRU the ability to benchmark its own collections against those of other participating COPPUL libraries.

14 I asked our University Librarian why we didn’t participate in this project, and it came down to cost. I received a demonstration of GreenGlass, and it’s clear that the greatest benefit is being able to compare holdings to other library collections. We could accomplish much of what GreenGlass does in terms of analyzing collections with tools we currently have available, but we don’t have that ability to benchmark our collections against others, unfortunately.

The TRU Library does not have a strategic plan or an assessment plan.

A Culture of Assessment?

The position of assessment librarian has been established for less than a year, and there is no sense of a culture of assessment. Assessment efforts have been very distributed, dependent largely on individual services to shepherd them. As a result, assessment activities have been uneven and not well-coordinated, and efforts have focused on statistics gathering rather than formal assessment. For example, the librarian on the Williams Lake campus focused on justifying the continued existence of her position through data collection exercises. On the Kamloops campus, assessment consisted primarily of gathering statistics for external purposes (e.g., the annual CPSLD survey).
And, of course, there were assessment activities leading to the development of a new staffing model.

Because this position is so new, there isn’t a clear picture of what role it will play. However, there is a belief that the larger goals of assessment – seeing the bigger picture, keeping the library’s strategic goals on track – are an important aspect of the role. There’s also acknowledgement that there’s opportunity for assessment in every aspect of the library’s operations. Given the unevenly distributed model of past assessment practices, however, there’s concern about “stepping on toes,” as well as about navigating processes that are not well-documented.

Assessment Activities

Assessment activities involving user input have been largely survey-driven; for example:

- an evaluation of library instruction classes (some librarians use it more than others); and
- a survey of reference service providers to identify service gaps and training needs.

Capilano University

Capilano University was not on my list of institutions for a site visit, largely because they don’t have an assessment librarian position. But while there is no librarian with the term “assessment” in his or her title, the library has created room for assessment in the position of Student Experience Librarian. Capilano University is most like UFV in terms of its history and current status, so I felt it was important to include them in this discussion. I learned about Capilano University’s assessment efforts in a session at the BC Library Conference (Alekson, 2017). I was not able to conduct an interview, but there were two items of note from that session that I wanted to capture in this report:

1) The library’s assessment plan (http://www.capilanou.ca/library/use/about/) is updated annually. The 2016-17 plan enumerates five service outcomes, and provides a data plan (metric, data collection method, and benchmarks) for each outcome, a communication plan, and an action plan that assigns responsibility and a deadline for each action. This might be a good model for us to follow as we work toward developing an assessment plan.

2) The library has created a Library Dashboard (http://libguides.capilanou.ca/dashboard/; see Figure 3) that includes data on room bookings, student learning, service desks, collection use, collection development, and their discovery service. They use a LibGuide as the framework for the Library Dashboard, and Tableau has been used to create the visualizations. It offers several useful ideas about how to organize and present library data.

Figure 3. Library Dashboard from Capilano University

ASSESSMENT IN THE UFV LIBRARY

Library assessment comprises a cycle of activities, from identifying the purpose or need, to identifying a method, then collecting, analyzing, and interpreting the data, and finally using the results to fulfill the original purpose or need (Connaway & Radford, 2014).
The UFV Library, like most academic libraries, has a long history of gathering detailed statistics – on collections (e.g., number of items purchased or donated to the collection, circulation of the print collection, number of full-text articles downloaded), services (e.g., number of reference questions asked, number of classes taught, number of items sent or received via interlibrary loan, number of web site page views), and facilities (number of people using the library, number of group study room bookings). The library currently has no positions with primary responsibility for assessment, which means that assessment activities are often ad hoc, and explains why most of the activities centre around easily obtainable data, such as data that can be pulled from systems already in place, or surveys where the data can be collated automatically (e.g., using survey software such as Fluid Surveys). The following discussion summarizes a range of assessment activities that the UFV Library has engaged in to date.

Annual Statistics

The library is affiliated with two partner organizations – the Council of Post-Secondary Library Directors (CPSLD) and the Council of Prairie and Pacific University Libraries (COPPUL) – that survey libraries annually on their activities and compile the data into a report. The compiled data is useful for benchmarking library activities against other libraries in the region. These reports can be found online, but copies of them are also stored on our shared network drive:

- CPSLD Reports: http://cpsld.ca/home/statistics
- COPPUL Statistics: http://www.coppul.ca/about-us

Additionally, the library contributes statistics on library collections and services to the annual UFV Factbook. Much of the data reported externally comes from the compilation of monthly public services and annual technical services statistics, which can be found on our shared network drive.

Library Instruction Statistics

The library collects detailed statistics on library instruction each semester. The data is compiled largely to aid in instruction scheduling, but it has also been shared to aid in discussions around the instruction program, as well as to initiate methods for balancing individual library instruction workloads. In 2010, I created an Information Literacy Fact Sheet aimed at faculty that included data on library instruction, but that is one of the few instances of using this data to communicate with stakeholders.

Reference Statistics

In 2012, we switched from recording reference transactions as tick marks on paper to using Reference Analytics from Springshare. We collect data on reference transactions at every service point, as well as data on reference appointments and email questions. We used to track AskAway transactions, but stopped that practice in 2016 – this data is now tracked separately. It’s unclear how this data is used internally; externally, the total number of transactions is reported. Reference Analytics, as the name would imply, provides a detailed analysis of reference transactions.

LibQUAL+ Surveys

The LibQUAL+ survey measures users’ minimum, desired, and perceived service levels across three dimensions: Affect of Service (e.g., customer service), Information Control (e.g., collections), and Library as Place (e.g., library spaces).
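Analyses of LibQUAL+ results typically centre on the gaps between these three levels. As a minimal illustration only (not the official LibQUAL+ scoring procedure), assuming a handful of hypothetical ratings on the survey’s nine-point scale:

import pandas as pd

# Hypothetical ratings for a single survey item: each row is one respondent's
# minimum, desired, and perceived service levels (1-9).
responses = pd.DataFrame({
    "minimum":   [5, 6, 4, 7],
    "desired":   [8, 9, 7, 9],
    "perceived": [6, 7, 5, 6],
})

means = responses.mean()
adequacy_gap = means["perceived"] - means["minimum"]     # positive: minimum expectations met
superiority_gap = means["perceived"] - means["desired"]  # usually negative: short of desired level
print(f"Adequacy gap: {adequacy_gap:.2f}; superiority gap: {superiority_gap:.2f}")

Slicing these same gaps by campus, user group, or program is what produces the more nuanced views of the results described below.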
The library administered the survey independently in 2005 and 2006, then joined a consortium of Canadian libraries in 2007; the consortium administers the survey every three years, and the UFV Library participated in 2007, 2010, and 2013. Following the administration of the survey in 2005, the library worked extensively with the results, holding a staff retreat to develop an action plan and striking several committees to develop recommendations around the data provided. In 2006, the library published a response to the survey results on the library’s web site (http://www.ufv.ca/library/old_news/latestnewslibqual-results-2006/). A summary of the 2010 results was drafted, but never finalized and shared. And at a 2014 staff retreat, I prepared a summary of results from the 2013 survey. In it, I sliced the data in several ways (by campus library, by user group, by program) to try to present a more nuanced view of the data. I also combined it with data from other sources, such as library instruction statistics, to look at correlations. Many libraries, including UBC, are choosing to make both the results book and a cleaned version of the raw data available publicly.

Value of the UFV Library

In 2011, Margaret Friesen, Assessment Librarian at UBC, gave a presentation entitled “Good Assessment Starts Before You Begin” at the Academic Librarians in Public Service (ALPS) December meeting. She concluded her presentation by issuing a challenge to attendees to go back to their libraries and undertake some kind of assessment activity. She had shared the return on investment (ROI) calculation that she had completed for the UBC Library, and I was inspired to do the same. The results are shown in Appendix C: Value of the UFV Library, 2010-2011.

Space Use Study

The idea for the space use study came about unexpectedly in late summer of 2014. The anticipated opening of the Student Union Building in 2015 and an anticipated renovation of the Abbotsford Library’s first floor presented an opportunity to conduct an assessment of library use. We formed a team, developed a protocol, and started collecting data in the Fall 2014 semester. Data were also collected in the Winter 2015 and Fall 2015 semesters.

We had hoped to use Suma, web-based software for collecting space use data developed at North Carolina State University, but I was unable to get it working. In the end, a paper instrument served us quite well. The data from the study was analyzed using Tableau, and the results were shared with the library renovations and study teams. This was a purely observational study – no interaction with our users was undertaken. While the data we did gather was informative and sometimes surprising (for example, our users preferred group study spaces in Abbotsford, but in Chilliwack the study carrels were more popular; and Sundays in Abbotsford are among our busier days), it would have been even richer had we also interviewed students. Once the renovations are complete, I’m hoping to carry out a mixed methods post-occupancy study (an expanded version of the original study).

Library 2025

In 2015, the university launched UFV 2025, a process to envision UFV 10 years down the road. The library adopted this timeframe for its own visioning and strategic planning process, calling it Library 2025. In addition to a day-long library retreat and visioning exercise, we gathered input from students and faculty.
To gather student input, we set up a table in the Student Union Building on the Abbotsford campus and in the atrium space in Chilliwack’s A building and invited students to “imagine the ideal library of the future.” We ran a PowerPoint presentation with photos of different types of spaces on a continuous loop, and we encouraged students to write their thoughts and ideas on post-it notes. In the PowerPoint, we asked a variety of questions to inspire a vision of the future library:

- How are you using it?
- How does it make you feel?
- What are you doing while you’re there?
- What can you get from it?
- What kinds of spaces are you using?
- Where are you using it?

We received 105 notes containing 192 separate ideas. Some were obviously intended to be humorous (at least we hope – “I would use the library to pick up girls”), but most were thoughtful. Students commented on collections, food, hours, technology, services, and amenities (comfort was important – “big comfy couches,” “yoga/meditation room”).

In the 2016 winter and fall semesters, we gathered faculty input from each academic department. Each visit involved the University Librarian, the liaison librarian, and (usually) a library technician, whose role was to take notes. Visits were scheduled for about 30 minutes during the department’s regular monthly meeting. Prior to the visit, faculty were given a one-page (front and back) background document on the library’s services, collections, and facilities, and invited to respond to the question, “How do you expect to engage with the UFV Library in ten years?” We provided the following prompts:

Library Services
- How many of these services have you used?
- What services could we offer you to help you do your job, now and in the future?

Library Collections
- How has your use of the UFV Library’s collection changed in the past 10 years?
- How are you accessing the literature of your discipline today?
- What kinds of information resources would you expect the UFV Library to be providing in 10 years?
- How often do you sign out library materials? How often do you access online resources licensed by the library? How about your students?

Library Spaces
- How often do you visit our campus libraries?
- What would bring you into the library ten years from now?

The data were analyzed using Dedoose. Not surprisingly, faculty comments were most often related to collections (particularly print materials and databases). Other comments covered topics such as information competency and library instruction, and student engagement.

Abbotsford Library Renovations

Over the summer of 2017, the first floor of the Abbotsford Library is undergoing a transformation. The Library has been working with UFV’s Project Office, as well as an architectural firm, Urban Arts (http://www.urban-arts.ca/).

In December 2016, Urban Arts facilitated a half-day design workshop. Participants included faculty, students, and library employees. We worked in small groups to address a series of questions posed by the architects. The architects provided each group with a stack of photos from libraries, and asked each group to select the photos that represented their vision for the transformed space. This data was used to inform the initial plans for the renovation. In February 2017, the architects revealed their plans and invited public comments, seeking feedback from a broad range of library visitors.
The input gathered during the process further informed the design of the new space.

One faculty participant in the design workshop led by Urban Arts commented afterward that he now had a really good idea of what librarians wanted for the renovation. The implication, as I understood it, was that the library employees attending the workshop had dominated the discussion. That our library employees have strong feelings about the renovations is understandable. The library is, after all, our workplace. But I think it’s important to make a distinction between the space we occupy in the course of our work (e.g., service desks, offices, and our work room) and the spaces that are primarily occupied by our users in the course of their work.

Final Thoughts

Connaway and Radford (2014) observe that informal assessment, based largely on anecdotes and casual observation, used to be the norm in libraries – but is no longer. The focus now is on more robust and formal assessment practices: data-driven, evidence-based, and using rigorous and accepted methods. As library employees, we hold a great deal of knowledge (both formal and informal) about how our resources, services, and spaces are being used. This knowledge is important in decision-making, but we also need to make room for the experiences and perspectives of our users, because they have an important voice, too. This is going to require a shift in our culture around decision-making, as well as dedicated resources.

THE DATA AUDIT

Academic libraries and the systems they use generate a considerable amount of data. From the ILS creating a record every time an item is checked out, to counting every reference transaction, to keeping track of the collections budget, many of a library’s activities and transactions are recorded as a statistic. The purposes for doing so can be varied, from calculating a library’s value (Oakleaf, 2010) to making informed decisions about improvements to library services and operations (Zaugg, McKeen, Hill, & Black, 2017). However, as Durrant and Chase (2016) observe, there are a number of obstacles to managing all of this data:

Data collection responsibilities are often decentralized in libraries, making it difficult to know exactly what data are collected, who is responsible for collecting them, and how they are used. Further, data are stored in multiple places, including library systems, vendor administrative portals, shared storage spaces, and on individual staff computers. This can make locating data a complicated and time-consuming process. Lastly, making use of data requires specialized knowledge of data management, data analysis, and data visualization techniques and best practices, which are not skills librarians typically receive training in. These challenges are particularly acute for small academic libraries that often do not have a full-time position devoted to assessment activities. (p. 558)

From my own experience, I know that these barriers also exist in the UFV Library. In the regular course of my work, I both generate and seek data. Most of the time, I know where to find the data I generate, but I’m not always seeking data that I have generated, and sometimes I’m not even sure that the data I’m seeking exists. And if it does, is it available on our wiki or shared network drive, where I can access it? If so, where is it likely to be? If not, who is likely to have it, and can I get access to it?
Is it in a form that I can use for my particular purpose?

A data audit or inventory can identify what data already exists, in what form it is available, where it can be found, how it is collected, and how it is commonly used (Zaugg et al., 2017). It ensures the ongoing “accessibility, usability, and manageability of data and reports in an effective and efficient manner” (Ogier, Hall, Bailey, & Stovall, 2014, p. 106). A data audit, then, seems like a natural activity to undertake as part of my sabbatical project. The following sections outline the process I followed and my findings from the audit.

Process

A comprehensive data audit can be time-consuming and involve surveying librarians and library technicians, as well as looking for evidence of data in all of the library’s systems. At this stage in the library’s assessment “program,” and given the limited time frame of this project, I decided to limit my explorations to data that could be found in shared digital spaces – primarily our wiki, which for several years has functioned as our intranet, or internal communication tool, and our shared network drive (“G: Drive”). I hoped that this inventory could inform a process for a more comprehensive data audit at a later date.

Conducting the data inventory involved a very deliberate and systematic process of crawling through the various shared digital spaces in use by the library15. I examined every page on our wiki (http://ucfvlibrary.pbworks.com/) and every file on our shared network drive to identify documents that focused on numeric data that had been compiled, analyzed, or reported in some way, or that identified outcomes that, if operationalized, would generate data (e.g., the library’s response to the 2016-2020 Education Plan goals). For each document that met the criteria above, I recorded the filename or page title, its location (URL or folder path), the date it was created and/or last modified, ownership details, and descriptive notes (a minimal sketch of this kind of crawl follows the findings below). The results of this inventory can be seen in Appendix D: Data Inventory.

15 The audit undertaken for my project did not include an examination of the reports provided by systems such as Sirsi and Relais – these are extensive, and not all of them are used. Such an audit would likely entail conducting interviews with librarians and staff to find out which reports they run and for what purpose. Data from these systems, however, often form the basis of the files and pages in our shared digital spaces, which could be an indication of the data within those systems that is considered most useful or valuable.

Findings

The data inventory revealed a number of findings about our data practices.

1) We like to count. Much of our data reflects counts of things, people, and transactions. This is not unusual, as most of our systems are designed to count, and to do so very effectively. But the emphasis on quantitative data, and the lack of more qualitative data, is striking.

2) Much of the data lacks context. There’s not always a clear relationship between the data itself and the purpose for which it was originally collected or compiled. Much of the data on the shared drive is spreadsheet data, and it does not always include explanatory text. Similarly, if we start from reports based on data, it’s not always possible to trace backward to find the underlying data.

3) Data is often aggregated. Some of our data is readily available in disaggregated format, because we all have access to the systems that generate it. However, other data is much harder to come by in disaggregated format – COUNTER data is a good example of this. There are a number of reasons why one might aggregate this data in some way, but we don’t all necessarily want the same aggregated view. There needs to be a mechanism for storing the data in its original format, as well as in an analyzed or aggregate form.
Findings

The data inventory revealed a number of things about our data practices.

1) We like to count. Much of our data reflects counts of things, people, and transactions. This is not unusual, as most of our systems are designed to count, and to do so very effectively. But the emphasis on quantitative data, and the lack of more qualitative data, is striking.

2) Much of the data lacks context. There's not always a clear relationship between the data itself and the purpose for which it was originally collected or compiled. Much of the data on the shared drive is spreadsheet data, and it does not always include explanatory text. Similarly, if we start from reports based on data, it's not always possible to trace backward to the underlying data.

3) Data is often aggregated. Some of our data is readily available in disaggregated format, because we all have access to the systems that generate it. However, other data is much harder to come by in disaggregated format – COUNTER data is a good example. There are a number of reasons why one might aggregate this data in some way, but we don't all necessarily want the same aggregated view. There needs to be a mechanism for storing the data in its original format, as well as in an analyzed or aggregate form.

4) We don't often ask our stakeholders for their input. Given the number of decisions we make that have a direct impact on our users, it's surprising (and perhaps disturbing) that we don't ask for their input more often. Aside from the LibQUAL+ surveys and the very occasional user survey on library hours or e-book use, the evidence points to a paucity of user-generated data. While we have made greater efforts in the past year to change this (e.g., web site usability surveys, student and faculty focus groups on the library's strategic planning process), collecting and analyzing this data is resource-intensive – without dedicated resources, it's unlikely we'll step up our efforts in this area.

5) There seems to be a strong relationship between the location of the data files and the person compiling them. Our shared network drive, for example, is organized around the library's functional areas (e.g., administration, circulation, cataloguing, reference, systems), fostering a sense of ownership of those spaces by those who work in those areas. This has led to a tendency to store the files we are responsible for creating in the folders we "own" on the shared drive. As a result, we find our annual public services statistics stored inside the Circulation folder, even though the data relates to all of public services, including instruction, reference, interlibrary loans, and more.

6) Some of the data is very resource-intensive to compile. Some of the data is touched by many hands as it is compiled (e.g., monthly public services statistics), while other data is generated by our library systems (e.g., Sirsi, Relais) but requires extensive formatting to make sense of it (e.g., annual technical services statistics).

7) We use the same reports over and over again. Our annual reporting processes (e.g., to COPPUL and CPSLD or to the UFV Factbook) use data that is updated each year. Librarians run the same reports in Sirsi for the collection analyses that we prepare for program reviews. Each time the reports are run, we are looking for updated data, but it's a very inefficient use of resources. If it were easier to get the data in the format we need, would it be easier to make operational decisions?

8) We avoid data with personally identifiable information. Libraries have a long history of respect for patron privacy, and we must operate within the bounds of BC's Personal Information Protection Act, but there are good reasons for collecting data that can be tied to an individual. It's probably a good thing that I didn't find this type of information in our shared digital spaces, but we should have a discussion about collecting personally identifiable data – when it's appropriate to do so, and how we should manage the data.
LIBRARY METRICS: INPUTS, OUTPUTS, AND OUTCOMES

There is no standard definition of the term "library metric," but it is commonly used to identify those aspects of a library that can be and are measured; in other words, quantitative measures of a library's processes and programs. Traditionally, library metrics have focused on quantitative input and output measures. Dugan, Hernon, and Nitecki (2009) define inputs and outputs as follows:

Inputs are the resources used to support the library's infrastructure: collections, staffing, the physical facility, and installed information technologies. …Outputs are the direct products of program activities. Outputs identify how much work is performed and/or how many units of service are provided (e.g., the number of books circulated or the number of reference questions answered). (p. 15)

Input and output metrics provide libraries with the ability to identify trends over time. By themselves, they are capable of telling a certain story. For example, in academic libraries they are often used to tell a narrative of transformation: from print to digital (in terms of expenditures, collection sizes, and collection use), and from a user focus on services to a user focus on facilities.

Dugan, Hernon and Nitecki (2009) list more than 1,000 individual library metrics as examples of the types of metrics of relevance to academic libraries; they themselves state that the list is not intended to be exhaustive. The metrics, reported as either numbers or ratios, measure a range of activities and aspects of the library's processes and programs, including:

- Collections
- Communities Served
- Expenditures
- Facilities
- Information and Learning Commons
- Interlibrary Loan and Document Delivery
- Partnerships
- Process
- Quality
- Satisfaction
- Services
- Staffing
- Stakeholders
- Technology
- Time
- Use

Input and output measures, however, often fail to capture the value or impact of the activities they count on a library's users or stakeholders. In addressing the value and impact of our processes and programs, we often need to turn to qualitative measures. One such type of measure is the outcome: metrics that, in academic libraries, "emphasize results-oriented goals directly related to education, research, and service processes" (Dugan, Hernon, & Nitecki, 2009, p. 15). Library data may address some outcomes, but deriving such measures often involves a range of methodologies, from asking our users and stakeholders directly to using surrogate measures from sources outside the library, such as student information systems or institutional research data.
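As a toy illustration of what correlating library data with a surrogate measure might look like, the sketch below computes a simple Pearson correlation between per-student checkouts and term GPA. The figures are entirely invented, and in practice the matching would have to be done on anonymized student identifiers under appropriate privacy controls:

    from statistics import correlation  # Python 3.10+

    # Entirely hypothetical figures: checkouts per student and term GPA,
    # as might be assembled by matching anonymized student IDs across
    # the ILS and the student information system.
    checkouts = [0, 2, 5, 1, 8, 3, 0, 6, 4, 7]
    gpas      = [2.1, 2.8, 3.4, 2.5, 3.7, 3.0, 2.3, 3.5, 3.1, 3.6]

    r = correlation(checkouts, gpas)  # Pearson's r
    print(f"r = {r:.2f}")  # strength of the relationship only, not causation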
Oakleaf (2010) identifies a number of categories of surrogate measures that can be correlated with library data in order to measure the library's impact on its stakeholders, among them:

- Student retention (e.g., fall-to-fall) and graduation rates
- Student achievement (e.g., GPA; test scores)
- Student learning (e.g., learning assessments; faculty judgments)
- Student experience and perception of quality (e.g., self-report engagement studies; senior/alumni studies; help surveys; alumni donors)
- Faculty research productivity (e.g., number of publications; promotion/tenure decisions)
- Faculty teaching (e.g., integration of library resources and services into syllabi, course web sites, lectures, labs, texts, reserve readings; faculty/librarian collaborations; cooperative course, assignment or assessment design)

Unfortunately, using surrogate measures means that we can't determine causality; we can only determine that a relationship between library use and, say, student achievement exists, and that it is of sufficient strength as to be statistically significant.

Connaway and Radford (2014) offer several guidelines for assessing outcomes. The focus of the assessment should be on services and resources that will improve the experience for users, relate to inputs, and seek to identify best practices. Additionally, assessment should make use of a variety of methods to triangulate and substantiate conclusions; it should focus on a small number of outcomes – it need not address every aspect of the service or resource; and it should be a continuous process.

The focus of the UFV Library has been mostly on inputs and outputs, but there are a few examples of outcomes assessment. Appendix E: UFV Library Metrics offers a list of more than 260 input, output, and outcome metrics identified from the documents discovered in the data inventory (see Appendix D: Data Inventory).

EXPLORING ASSESSMENT TOOLS AND SYSTEMS

At least 50% of my time was spent exploring several different software environments for data analysis, visualization, and management. Each system required an investment of time to learn how to use it effectively (or at least to make the attempt), as well as time to work with real data and determine the best uses of the system or software. The systems I chose to explore met one or more of the following criteria:[16]

1) We already license them, but they are underutilized (BlueCloud Analytics, Dedoose).
2) We license other products from the same vendor (BlueCloud Analytics, LibInsight).
3) They are commonly in use by other libraries and have a good support community (BlueCloud Analytics, Tableau, LibInsight).
4) They are free or inexpensive (Tableau, Dedoose).
5) They fill a niche not filled by other software already available in the library (Dedoose).

[16] I had also intended to explore Google Analytics (GA) and its recently released Data Studio (both free). We currently use Google Analytics to track usage on our web site and four hosted systems, but there are features we are not yet using that could be useful. And with the redevelopment of our web site and the retirement of two of the systems we track with GA, it's an opportune time to explore how we could make better use of this tool. But there was only so much time available, and four months goes by quickly.

The discussion below describes my experiences with these environments.

BlueCloud Analytics

One of the first tools I started playing with was BlueCloud Analytics (BCA). We've had access to BCA for a little over two years, but we have not made much use of it. BCA is a web-based software environment that analyzes and visualizes data from our ILS. Unlike the other ILS reporting tools we have access to (Workflows reports, Director's Station, the API), BCA does not limit which data in our system we can access – it provides access to every data point. The data reference guide is extensive (98 pages long), and it's not yet complete.
One of the drawbacks to BCA at this point in time is that it doesn't yet include acquisitions data, although this has been promised for quite some time.[17] Another drawback is its interface – compared to Tableau (see below), it seems prehistoric and impenetrable.

[17] The June 30, 2017 update to the product roadmap indicates a September 2017 release of three new data sets: acquisitions (vendors, orders, funds, and invoicing data), serials, and collection development (selection, acquisition, and allocation of items). The July 2017 release will include several enhancements and upgrades, including HTML5 dashboards, new visualizations, and the ability to share and import dashboards, which should improve the interface and the look of the reports and dashboards.

But one of BCA's advantages is that it reflects almost real-time data – every time you refresh a report or dashboard, you are working with the most current data available.[18] There is no need to download the data from one system and store it in spreadsheets or text files. BCA also claims to be able to analyze and visualize datasets uploaded to the system, but our instance of BCA does not yet support this.

[18] Data is refreshed daily, via an overnight process.

Delivered Reports

BlueCloud Analytics currently delivers almost 50 reports, located in the Shared Reports folder. These reports are available to all users and require no specialized knowledge about data analysis – just open the report, choose from among the available options, and run it. Figure 4 lists several of the reports located in the Catalog-Item folder. The reports are pre-built and selected to address a wide range of needs. But even if a report is not exactly what you need, it's easy to pop into the report builder, modify the report, and save it under a new name – in addition to the Shared Reports folder, each user has a folder called My Reports where they can save any report they modify or create from scratch.

Figure 4. BlueCloud Analytics: Delivered reports from the Catalog-Item folder

Report Builder

BCA also provides the ability to build reports from scratch. The basic building block of reports in BCA is the data cube, a collection of related data points. The item cube, for example, contains all the data points associated with a specific item, including: barcode; call number; circulation notes; collection (i.e., location); created date; last activity, check-in, checkout, in-house use and inventory dates; last checkout user; library; lifetime checkouts, in-house uses, and renewals; notes; reserve status; and type. Additionally, dates are available in multiple permutations: date, datetime, day (of month), day of week, hour, month, quarter, and year. Reports are presented in a tabular format: the most basic report places attributes on rows and columns and introduces one or more measures.
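This rows/columns/measures pattern is the same one a pivot table implements. For readers without access to BCA, here is a minimal pandas sketch of the report described below (item type on rows, library on columns, with a count measure and a percentage-of-collection measure); the data frame is invented for illustration:

    import pandas as pd

    # Hypothetical item-cube extract: one row per item.
    items = pd.DataFrame({
        "library":   ["Abbotsford", "Abbotsford", "Chilliwack", "Abbotsford", "Chilliwack"],
        "item_type": ["Book", "DVD", "Book", "Book", "DVD"],
    })

    # Row attribute = item type, column attribute = library, measure = item count.
    counts = pd.pivot_table(items, index="item_type", columns="library",
                            aggfunc="size", fill_value=0)

    # Second measure: the percentage of the whole collection each cell represents.
    percentages = counts / counts.to_numpy().sum() * 100

    print(counts)
    print(percentages.round(1))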
I took part in the training webinar on BCA in the spring of 2015, but that was too long ago, and I needed a refresher. I started by going through the self-paced online tutorial provided by SirsiDynix, and downloaded the training guide and data reference guide to use as references. Then I built a report. Figure 5 shows the report builder. Data attributes are simply dragged into the rows and columns sections of the report template. In this report, item type is the row attribute, and library is the column attribute. There are two metrics: the number of items (by library and item type) and the percentage of the collection that number represents. The second metric, percentage of items, is created by modifying the number of items metric. Figure 6 displays the resulting report.

Figure 5. BlueCloud Analytics report builder

Figure 6. Simple report in BlueCloud Analytics

Filters provide a means of limiting the data that is displayed for a particular attribute. Prompts offer the person running the report the opportunity to choose from among the values specified by the prompt. Figure 7 shows a more complex report in the report builder. In the Report Filter section, there are two filters: library and item type. In the metrics, there are two calculated measures: turnover rate and Bonn's Use Factor.[19]

[19] This particular report was inspired by similar reports identified in the data audit.

Figure 7. Using filters and calculated metrics in the BlueCloud Analytics report builder

This particular report applies the two filters as prompts prior to running the report; in this way, the user can choose which libraries and which item types to display in the report (see Figure 8). Figure 9 displays the resulting report.

Figure 8. Report prompts in BlueCloud Analytics

Figure 9. BlueCloud Analytics report with prompted filters applied

Users can also interact with reports after they have been run. Reports can be sorted on any column, and the user can filter the report based on any attribute or metric. The user can also turn a report into a chart, although the charting tools are quite rudimentary.

Dashboards

A dashboard brings together a collection of resources into a single visual display; the goal is to be able to see all the relevant data at a glance. Dashboards often provide an overview of the data. Dashboards in BCA can incorporate reports, as well as visualizations that are built from scratch. Like the reports, BCA delivers a number of pre-designed dashboards; these are located in the Analysis Docs folders inside the various reports folders. Unfortunately, I didn't have the time to fully explore dashboards beyond the basic tutorial. Figure 10 shows one of the delivered dashboards, this one showing turnover of the collection by date range.

Figure 10. BlueCloud Analytics delivered dashboard: Turnover by date range
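For readers unfamiliar with the two calculated measures used in Figure 7 and in the dashboard above, the sketch below shows how they are commonly defined in the collection-evaluation literature: turnover rate is circulations per item held, and Bonn's Use Factor compares a class's share of circulation with its share of the collection. The holdings and circulation figures here are invented:

    # Hypothetical per-class holdings and annual circulation counts.
    holdings     = {"QP (Physiology)": 1200, "GV (Recreation)": 800, "ML (Music)": 400}
    circulations = {"QP (Physiology)": 900,  "GV (Recreation)": 950, "ML (Music)": 100}

    total_held = sum(holdings.values())
    total_circ = sum(circulations.values())

    for lc_class in holdings:
        # Turnover rate: circulations per item held.
        turnover = circulations[lc_class] / holdings[lc_class]
        # Bonn's Use Factor: share of circulation relative to share of holdings;
        # values above 1 suggest a class is used more heavily than its size predicts.
        use_factor = (circulations[lc_class] / total_circ) / (holdings[lc_class] / total_held)
        print(f"{lc_class}: turnover {turnover:.2f}, use factor {use_factor:.2f}")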
Conclusions

BlueCloud Analytics finally offers us the ability to analyze all of the data in our ILS, not just those data points that Sirsi has decided to make available in the other systems we have. Once the remaining data (acquisitions and serials) has been added, I suspect that we'll find many uses for BCA's reports and dashboards. While the delivered reports are useful, the interface is clunky, and since it's not possible to easily share the data outside the system, it's unlikely that most librarians will use it. I'm hoping that the anticipated updates to the system will address some of these concerns, making it a more attractive option for librarians; it could eliminate a lot of the duplication of effort that goes into preparing collection analyses. I don't expect that most of our librarians will invest the time needed to learn to build reports and dashboards, at least not with the current interface. But with dedicated human resources for assessment activities, the creation of custom reports and dashboards could be centralized.

Tableau

Tableau is data visualization software that is widely used by organizations of all types, including academic libraries. At the Library Assessment Conference I attended in November 2016, there were numerous posters and papers referencing Tableau as a data visualization tool. Because of its widespread use, there is a large community of users, many of whom are willing to share solutions, ideas, and inspiration. Tableau maintains galleries on both its main web site (https://www.tableau.com/solutions/gallery) and its public server (https://public.tableau.com/en-us/s/gallery), and the workbooks, including the data sources, are usually available to download so that you can deconstruct the visualizations – an important tool for learning how to use Tableau. Tableau also maintains the Tableau Community forum, where you can ask for help with a specific problem, and offers training videos on many of its features.

Tableau is available for free, but it has paid versions as well. When I first started using Tableau, it was with the free version, called Tableau Public. Tableau Public has all of the features of the paid versions, except that you can only publish your visualizations ("vizzes") to the Tableau Public server, and it limits the kinds of data sources you can use (Excel is one of them). Additionally, you must upload your data, disconnecting it from the original data source; if you make changes to the underlying data, you must re-load it into Tableau. For my sabbatical, the library purchased a license for Tableau Desktop, which allows me to share vizzes via the Tableau Public server or as workbooks that can be viewed using the free Tableau Reader (https://www.tableau.com/products/reader). I can also create persistent connections to a wider range of data sources, which means I don't lose the connection to the original data source. While I used only Excel and text files, I have the option of connecting to a wider range of data sources, including web-based services such as Dropbox and Google Drive. However, in order to connect to databases (offering real-time data updates), we would need to upgrade to the professional edition.

Tableau is a complex tool; getting started is not very difficult, but it requires persistence and time to realize its full potential. All of the visualizations referenced below are available on the Tableau Public server (see https://public.tableau.com/profile/colleen.bell.ufv#!/).

Space Use Study

In the Fall 2014, Winter 2015 and Fall 2015 semesters, we conducted an observational study of seat occupancy in the Abbotsford and Chilliwack libraries.
The purpose was twofold for the Abbotsford campus: 1) to look at differences in usage patterns prior to the opening of the Student Union Building (anticipated mid-semester in Winter 2015), and 2) to look at patterns of seating choice on the first floor of the library in preparation for a renovation. The Chilliwack library chose to participate out of interest.

In 2016, I used Tableau to analyze the space use study data, but my knowledge of Tableau was very rudimentary. Since I now had the time to spend learning Tableau, I started with the space use study data – because I had already created the basic visualizations, I could work on introducing more elegant elements into them.

In Tableau, once you have connected to the data source, you start with a worksheet. In my previous stab at using Tableau, I created several worksheets for each visualization, each representing a particular slice of the data (e.g., by library or spatial view). The first visualization I created then was a heat map, so I began there again, but this time I wanted to present a single heat map that allowed the user to adjust the view using filters. This was a big step in the learning curve. Figure 11 illustrates the use of a powerful tool in Tableau: parameters. A parameter adds versatility and flexibility to your data visualizations by allowing the user to choose how she wants to slice the data. This particular visualization allows the user to choose the campus, the spatial view (location or type of seating), and the time period (month, weekday, or hour).

This breakthrough took quite a bit of research and persistence. While information on the web about how to use Tableau effectively is plentiful, if you don't know what to call a feature, you may not find it. Google became my salvation, as did learning to Google less like a librarian and more like a student: typing in full questions rather than just keywords, hoping I would land on a combination of words to unlock the information vault.

Figure 11. Heat map in Tableau, showing use of parameters

One problem I ran into was that in the original space use data, months and hours were coded numerically (e.g., January = 1; 1 PM = 13). I was able to use the date format to display the months and hours in the format I wanted (e.g., January and 1 PM), but the weekday wasn't part of the original data – I had to derive the weekday from the date, then format it as text (rather than a number). But when I added the parameter controls, I could no longer get the text representations of the month, weekday, and hour.

I had come across the Tableau Community forums in my Google quests for answers related to Tableau, and it seemed that now was a good time to make use of them. I had never asked a question in an online forum, but I took a deep breath, signed in, and posted my first question (after carefully reading the guidelines, of course – see Figure 12):

Figure 12. Asking a question in the Tableau Community forums

It worked: I had a solution in my email within just a few hours. I used the forums two more times while working on this data set; see Thread 1: Displaying Date Parts as Text Names in Appendix F: Tableau Community Forums for the complete thread.
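Outside Tableau, the same date-part derivation is straightforward. The following is a minimal pandas sketch, using an invented slice of the study data, that derives weekday names from dates and converts the numeric month and hour codes to text labels:

    import pandas as pd

    # Hypothetical slice of the space use study: month and hour coded numerically.
    obs = pd.DataFrame({
        "date":  pd.to_datetime(["2015-01-12", "2015-01-13", "2015-03-09"]),
        "month": [1, 1, 3],
        "hour":  [13, 9, 13],
        "seats_occupied": [42, 17, 55],
    })

    obs["month_name"] = obs["date"].dt.month_name()  # e.g., "January"
    obs["weekday"]    = obs["date"].dt.day_name()    # derived from the date, as text
    # Convert a 24-hour code to a "1 PM"-style label.
    obs["hour_label"] = obs["hour"].map(
        lambda h: f"{(h - 1) % 12 + 1} {'AM' if h < 12 else 'PM'}")
    print(obs)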
Following my previous analysis of the space use data, I had been in contact with a Tableau representative, who suggested I might also want to try mapping the space use data on a floor plan of the library.[20] I was intrigued by the idea but didn't really have the time then – now I did (see Figure 13). Accomplishing this involved several steps:

[20] I attended the Library Assessment Conference in November 2016, and there were two posters during the poster sessions that also displayed heat map data in this way. The idea isn't original, but seeing the posters inspired me to persist.

1) Find digital floor plans of the libraries. Fortunately, one of our former on-call librarians had worked on a project involving a virtual tour of the library, and we had floor plans from that time. While they weren't entirely accurate, they worked.
2) Use image editing software to create a single file with all three floor plans.
3) Find the coordinates for the different seating areas on the maps that correspond to the locations and seating types from the study data. Mapping the coordinates of a square or rectangle is easy, but the Abbotsford library is not square – it's pie-shaped – and many of our seating areas follow the shape of the building or are irregular. Using Google and the Tableau forums, I found a web browser-based tool developed by Bryant Howell, an engineer who works for Tableau, that helped me with mapping the complex polygons needed; it's discussed in a blog post here: https://tableauandbehold.com/2015/04/13/creating-custom-polygons-on-a-background-image/
4) The mouse slips while you're drawing one of the polygons. Repeat the last step. And again. And again.
5) Follow the instructions for mapping the polygon shapes onto the background image using data from Tableau. Find out that only some of the data is actually being mapped.
6) Contact the Tableau forums to find out why data is not mapping properly (see Thread 3: No heatmap data showing on polygons/background image in Appendix F: Tableau Community Forums).
7) Implement the solution.
8) Success!

As with all of my work in Tableau, each visualization represents moving farther along the learning curve.

Figure 13. Tableau: Heat map data mapped onto library floor plans

Tableau offers a wide range of chart options but also makes recommendations on which charts might be most appropriate for the data you're viewing. Changing the visualization is easy and inspires experimentation. One such experiment resulted in the bubble chart in Figure 14, which I find to be one of the more interesting visualizations. This particular chart shows which times and locations (or seating types) are most popular in each campus library. Each bubble corresponds to a location/seating type and hour in the day. The colours represent the various locations/seating types (as noted on the legend), and the size of each bubble relates to its popularity – or, rather, the average occupancy.

Figure 14. Bubble chart in Tableau

Initially, I tried to display bubble charts for each library side by side (Figure 15), but the scale in the second chart, as compared with the first, was not consistent. After several attempts to force the data into compliance, I did some reading about bubble charts. It turns out that bubble charts are internally consistent – the size of each bubble is relative to the other bubbles in the same chart – but they are not necessarily to scale.
This means it's not possible to compare the magnitude of a measure across charts. For this reason, bubble charts, while beautiful, have limited value in comparing data, such as comparing two different libraries.

Figure 15. Tableau: Comparing data using bubble charts does not work

As a data visualization tool, Tableau is extremely robust. However, it can't do everything, as I discovered when exploring seating capacity and peak occupancy in the library. I had previously created the visualization shown in Figure 16, but now I wanted to flex my newly developed Tableau muscles and introduce some elegance to the dashboard. The particular complexity in this dashboard is that I needed to explore the data by cross-tabulating the time periods (e.g., by month and weekday, month and hour, or weekday and hour). I started out by creating a parameter, but quickly discovered that it wasn't going to do what I needed it to. I turned once again to the Tableau Community forum (see Thread 2: Using a parameter to select multiple dimensions in Appendix F: Tableau Community Forums), where I learned that this was actually not possible in Tableau. I would have to use two parameters: one for the first time period, and a second for the cross-tabulated time period. I then looked into the ability to make the second parameter dependent on the first (i.e., if the first parameter selected was month, then the choices in the second parameter would be weekday or hour) and learned that I'm not the first one to be interested in this capability; Tableau is working on developing dynamic parameters (see https://community.tableau.com/thread/159001), but the feature is not yet available.

Figure 16. Tableau failure: using a single parameter to control two values

So this visualization has a slight (but not fatal) flaw: if the viewer selects the same time period for both parameters, they will see not the peaks and valleys of the intended view, but a single value (Figure 17).

Figure 17. Results from a Tableau failure

Finally, not every visualization can tell the real story – that is up to the data analyst. One of the goals of the space use study was to get a sense of whether the opening of the Student Union Building (SUB) would have an impact on use of the library's facilities. The first time I analyzed the data in response to this question was during my sabbatical. Figure 18 compares use of the space in the Abbotsford library during the Fall 2014 semester (prior to the SUB opening) with the Fall 2015 semester (the SUB's official opening was September 2015). Looking at the data, one can see a marked decline in use of the library spaces following the opening of the SUB in all but one category: study carrels. Interesting – but is that the real story?

Figure 18. Tableau can't always tell the real story

The SUB contains a lot of casual and group seating, but not a lot of quieter individual study space. And anyone who has opened a new building, or deployed new furniture in a building, knows that "new" is highly attractive; I have witnessed this many times. So it might be reasonable to assume that these two factors drew some students away from the library to the new Student Union Building. And that might be true. But our study had a number of limitations. We didn't count how many people were using the SUB.
We didn't interview students in the SUB to find out if they had previously used the library but now preferred the SUB (and why, which might have provided insight for our renovation project). We didn't gather data beyond the initial semester when the SUB was open, which might have told us whether it was just the attraction of the new, or whether the changes in usage were longer term. And we didn't look at other factors that might have affected use of our spaces, such as:

- Were there fewer sections of courses offered that traditionally drive students to the library for its group study spaces?
- Were there changes to assignments that usually brought students into the library?
- Was there an overall decline in the amount of time students were spending on campus altogether (as one faculty member commented to me anecdotally)?

Becoming proficient with Tableau is only partly about learning the software; it also requires becoming knowledgeable about data visualization. Tableau is a powerful tool, but it can't serve as a substitute for deep knowledge about effective data analysis and communication of your findings. Ultimately, the power of Tableau is as a visual communication tool. In the short amount of time I had for my sabbatical, I didn't get around to playing with the Stories feature in Tableau, which allows you to create data stories integrating text, data visualizations, and other visual effects. I think that could comprise another four-month sabbatical in itself.

LibQUAL+ Analysis

The UFV Library administered the LibQUAL+ survey five times between 2005 and 2013, but the data has never been fully analyzed in a way that allows results to be compared from survey to survey.[21] I thought it might be helpful to analyze the data using Tableau, with a view to eventually making the resulting visualization available publicly on our web site.

[21] This is one of the reasons we declined to participate in the survey in 2016.

In addition to the results report for each administration of the survey, LibQUAL+ provides access to the raw data and an accompanying codebook. Each row in the raw data file (see Figure 19) represents one survey response and has 218 columns of data, including:

- Demographics (e.g., discipline, user group, age)
- Technical information (e.g., browser used, start and end date and time)
- Information about validity (e.g., completed/incomplete, number of unanswered questions)
- Minimum, desired and perceived scores for each question (there are 22 core questions, plus up to five additional questions)
- Gap scores for each question (there are two: adequacy gap and superiority gap)
- General satisfaction scores
- Frequency of use for the library and the Internet

Figure 19. Excerpt from LibQUAL+ survey raw data file – 218 columns of data

This format makes it very easy to import the data file into software such as SPSS for statistical analysis[22], but it is wholly unsuitable for visualizing the data in Tableau. The data for Tableau needs to be reorganized so that each row in the file represents the demographic profile of the respondent as well as the values assigned to a single question. Although I developed a method for doing this fairly quickly, in hindsight I wish I'd been aware of the Tableau add-in for Excel (https://community.tableau.com/docs/DOC-10394), which might have reduced the amount of time I spent on this step even further.

[22] LibQUAL+ provides an SPSS syntax file for download with your raw data, as well as instructions for preparing your data file for SPSS.
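The wide-to-long reshaping described above can also be scripted. Here is a minimal pandas sketch on an invented miniature of the raw file (two respondents, two questions, with invented column names); it also drops the survey's sentinel codes for unanswered or unasked questions, which are discussed below:

    import pandas as pd

    # Hypothetical miniature of the 218-column raw file: one row per respondent,
    # with minimum/desired/perceived scores spread across columns per question.
    raw = pd.DataFrame({
        "respondent": [1, 2],
        "user_group": ["Undergraduate", "Faculty"],
        "AS-1_min": [6, 5], "AS-1_des": [8, 9], "AS-1_per": [7, 6],
        "IC-2_min": [5, 7], "IC-2_des": [7, 9], "IC-2_per": [-99, 8],
    })

    # Reshape so each row carries the respondent's demographics plus one score.
    long = raw.melt(id_vars=["respondent", "user_group"],
                    var_name="question_score", value_name="score")
    long[["question", "score_type"]] = long["question_score"].str.split("_", expand=True)
    long = long.drop(columns="question_score")

    # LibQUAL+ codes unanswered/unasked questions as -99 or -1; drop them
    # so they are not included when averages are calculated.
    long = long[~long["score"].isin([-99, -1])]
    print(long)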
Because the raw data file includes data from incomplete surveys and assigns numeric values (-99 or -1) to questions not answered and questions not asked (there is a "lite" version consisting of only nine randomly selected core questions), the data needs to be cleaned before it can be analyzed, to ensure that it is valid and that unanswered or unasked questions are not included in calculations of average values. Finally, much of the demographic data is coded, and the codes need to be translated. There are also concerns about maintaining anonymity, since some of the discipline data, when combined with user group or age, could lead to identifiable information. That data needed to be aggregated (for example, the category "Library Staff" was rolled into the "Staff" category, and the data for department was aggregated by faculty instead).

Figure 20. LibQUAL+ data cleaned and ready for Tableau – only 16 columns of data

Once the data had been cleaned and formatted for Tableau (see Figure 20), it was time to start analyzing it. There are two primary types of charts (beyond the typical bar or column chart) used by LibQUAL+ in its data reports: radar charts and zone of tolerance charts. Neither is available in Tableau, but the zone of tolerance chart can be engineered from a column chart and a line chart, while the radar chart requires math.[23] I went with the zone of tolerance chart (Figure 21).

[23] And probably a little bit of voodoo; there are instructions available on the web (no chicken sacrifices required): https://www.tableau.com/about/blog/2015/7/use-radar-charts-compare-dimensions-over-several-metrics-41592

Figure 21. Fooling Tableau into creating a zone of tolerance chart

A zone of tolerance in LibQUAL+ has three values: the minimum value (bottom of the "box"), the desired value (top of the "box"), and the perceived value (the "dot"). Zones of tolerance are calculated for each question, as well as for each of the three dimensions measured by the survey: Affect of Service (e.g., customer service), Information Control (e.g., collections), and Library as Place (e.g., library spaces). I had figured out how to do this once before in Excel, so with just a little bit of work, I was able to figure it out in Tableau. There are three tricks:

1) layering the two columns (representing the minimum and desired scores) on top of each other, and making sure they're in the right order (minimum, which should be smaller than desired, needs to be on top);
2) changing the colour of the bar for the minimum score to match the background (in this case, white) – I didn't find this particularly intuitive; and
3) placing the perceived score on a secondary axis, using a line chart.

The only downside to the "cheater" zone of tolerance chart is that I was unable to remove the outline around the rectangles in the zone of tolerance; it appears automatically when you hover over the minimum or desired scores.

The next thing I needed to figure out was how to customize the tooltip, which appears when you hover your mouse over a data point on the visualization. Tableau creates an automatic tooltip for each data point that includes values for all of the individual measures that make up that specific data point.
However, I needed the tooltip to show values that weren't part of the automatic data. For example, I wanted the tooltip to show the minimum, desired, and perceived scores in the zone of tolerance, as well as the two calculated gap values (adequacy and superiority) that are part of the LibQUAL+ analysis.

The LibQUAL+ dataset also gave me the opportunity to experiment with dashboards. The dashboard in Figure 22 provides a high-level overview of the survey results for each year of the survey, and includes different data views.

Figure 22. LibQUAL+ survey dashboard in Tableau

There are two notable features on this dashboard:

1) Each chart was developed separately in its own worksheet, with its own filter controls (in this case, the year the survey was administered).
2) A single slider control on the dashboard controls the view of every chart. Change the survey year in the slider control, and the data changes for the entire view.

This was another new piece of the puzzle for me.

Departmental Allocations Dashboard

Several years ago, a faculty member in one of my liaison areas expressed interest in seeing whether the materials she and her colleagues were requesting were actually being used. I discovered that Sirsi is not capable of providing that kind of report – not through the reports in Workflows, not through Director's Station, and not through the API. And while BlueCloud Analytics may eventually offer this capacity, it does not do so now, because it contains no acquisitions data. At the time, I created a manual report, which took me about 20 hours: I ran a report of titles purchased over five years on the department fund, then manually looked up circulation figures.[24] This report has become a standard by which I measure the reporting capabilities of a system – is it able to create a report that merges two different sets of data? And the answer should be yes, as long as there is a unique data point (such as an item ID) that can be matched in each set.

[24] I could probably have delegated this task, but I wanted to see how much work would be involved, and it was a good project to have on hand for those slow times at the reference desk.

While exploring acquisitions data one day in Director's Station, I realized that the data includes the call number, and further that call numbers are unique – no two items in a library collection have the same call number.[25] Since the circulation data also includes the call number, I now had a way to match the data and analyze the use of items purchased on any of our monographic funds. I downloaded two sets of data from Director's Station (one a list of titles by fund, and the other a list of titles with circulation data), saved them as Excel files, and imported them into Tableau, matching them on the call number field. The result is the dashboard shown in Figure 23.

[25] This is not strictly true. When we buy a copy for both the Abbotsford and Chilliwack libraries, the call numbers will be the same in both libraries. However, this is relatively rare.
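The same call-number match can be scripted outside Tableau. Below is a minimal pandas sketch of the join; the file names and column headings are invented stand-ins for the two Director's Station exports, and reading Excel files this way assumes the openpyxl package is installed:

    import pandas as pd

    # Hypothetical exports from Director's Station: titles charged to a fund,
    # and circulation counts by title. Call number is the shared key.
    by_fund = pd.read_excel("kin_fund_titles.xlsx")     # call_number, title, fund, year_purchased
    circs   = pd.read_excel("titles_circulation.xlsx")  # call_number, total_charges

    merged = by_fund.merge(circs, on="call_number", how="left")
    merged["total_charges"] = merged["total_charges"].fillna(0)

    # The equivalent of the Tableau calculated field: has the item ever circulated?
    merged["circulated"] = merged["total_charges"] > 0
    print(merged["circulated"].value_counts(normalize=True))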
To visually depict which items had circulated (orange) and which had not (blue), I created a calculated field that looked at the total number of circulations for each item. In the visualization for Kinesiology, we can see that most of the items purchased and catalogued since 2011-12 (the past five years) have circulated, meaning that the library is making good selection decisions (presumably with faculty input), and that faculty are probably promoting use of the print collections (or using them themselves).

Figure 23. Departmental allocations dashboard in Tableau

I also chose to analyze items by LC class; this was again achieved through a calculated field on ranges of call numbers. I found it surprising and interesting to note that items from a number of LC classes were being purchased from a single departmental allocation. The example above, for Kinesiology, shows that selected items fall into 12 different classes, including music and art.

Using this dashboard as a model, it would be possible to depict other aspects of the budget, looking at, for example, the electronic resources budget against use[26], or at the overall performance of the collections budget or the library budget as a whole. And, because Tableau dashboards can be made public, we can easily share this kind of data with our stakeholders, particularly those who make funding decisions.

[26] For e-books, for example, we can match on either call number or ISBN. E-journal use would be measured a little differently – we'd have to find a way to put together COUNTER data and database costs.

EZproxy Data

Every time a user accesses one of our licensed e-resources, they pass through EZproxy, our proxy server. If they are off campus, they need to authenticate through the proxy server to gain access to the resource. This activity can be recorded through the use of logs, and while we haven't used this feature in the past, it can be a rich source of data to supplement that supplied by vendors.

Most of our vendors provide statistics on resource use through COUNTER reports. COUNTER (https://www.projectcounter.org/) is a non-profit organization that has developed a code of practice for measuring the usage of electronic resources. Vendors who adopt the COUNTER code of practice in their statistical reports enable libraries to compare usage of e-resources across different databases and vendors. COUNTER reports, while incredibly useful, are limited to numbers, including full-text downloads (articles, books, book chapters), searches, turnaways, record and abstract views, and pages viewed or printed (e-books). EZproxy logs, on the other hand, include data about the context of the request, such as the IP address of the user's computer, the username, the session ID (useful for tracking the path of a single session), the URI (the address of the resource requested), the referrer URL (the web site referring the user to a proxied resource), and the date and time of the request.

Figure 24 is based on just four days of data from the EZproxy logs, between March 26 and March 29, although only a single day is displayed. What surprised me most was that I couldn't fit even these four days of data into an Excel spreadsheet, which is one of the easiest ways for Tableau to connect to data; for this particular dataset, I had to use a text file instead. It turns out that Excel is limited to 1,000,000 rows of data in a single worksheet, and I could only fit one day of data (about 550,000 rows) into a worksheet. I didn't expect this much data for a single day.
However, when you consider that an access request is generated each time a user clicks on a link on the screen (including paging through results), it makes sense that there would be a huge volume of data in the logs.

This visualization displays a tree map: each "block" of data represents one IP address, and each rectangle within the block represents one hour in the day. The size of the rectangle and the intensity of the colour represent how many accesses were made in that hour. The darkest red block in the tree map comes from the IP address 108.172.8.52, which is registered to Telus and located in Abbotsford. The size of the rectangle tells us that this is the highest use in a single hour (from 8 PM to 8:59 PM), and represents 1,385 access requests. The much narrower rectangle next to it represents 68 access requests for the hour beginning at 9 PM. We can also see that our users are coming from a large number of different IP addresses. What we can't tell from this map is whether the more common IP addresses represent a single user or session, or multiple users or sessions.[27]

Figure 24. Tableau: EZproxy use by IP range and hour of day for March 29, 2017

[27] The limited dataset I was working with did not include session or user data – I was still working on configuring the log files to capture that information.

This was as far as I got in analyzing proxy server data. There are a number of issues that arise from this data, however. The data in and of itself is not always meaningful. Collecting usernames will tell us whether it's a student or an employee accessing the data, but it would be more meaningful to know whether it's an undergraduate or a graduate student and what program they're in, or whether it's staff or faculty, and what department they are connected to. This entails connecting our data to other data systems within the university, and we would need to be clear about our reasons for doing so. Connecting library use to student achievement or retention is useful for communicating the value of library services. Connecting library use to information about departments or programs can help us determine where we need to be more effective at communicating the value of the library's collections – an indication of where we should be directing our marketing efforts or strengthening faculty connections.

Collecting IP addresses is only useful if we can connect them to an owner and/or geolocation. Fortunately, there are APIs (such as FreeGeoIP: https://freegeoip.net/?q=184.65.200.195) that can automate this process.

One of the most important aspects of logging proxy server data is knowing where your users are going – what resources they are using, and how often. There are different permutations of this data available. When I set up the log files, I included the URI, which is the full address for each resource or page accessed. This seemed like it would be useful, but in retrospect, it would have been better to also log just the domain name of the host server. With both pieces of information, it would be possible to more easily count access requests by vendor or database, as well as determine what type of resource was being requested. Nonetheless, I believe it is worthwhile to continue collecting and analyzing EZproxy data, even if we only collect it periodically; the proxy server logs can provide a more contextualized view of e-resource usage.
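As a sketch of what that analysis can look like in code, the following counts access requests by IP address and hour, the same view as the tree map above. EZproxy's log format is configurable, so this assumes an Apache/NCSA-style LogFormat and an invented file name; the pattern would need to be adjusted to match the local LogFormat directive:

    import re
    from collections import Counter

    # Assumes a log line beginning: host ident user [dd/Mon/yyyy:HH:MM:SS zone] "request"
    LINE = re.compile(
        r'(?P<ip>\S+) \S+ (?P<user>\S+) '
        r'\[(?P<day>[^:]+):(?P<hour>\d{2})[^\]]*\] "(?P<request>[^"]*)"')

    requests_by_ip_hour = Counter()
    with open("ezproxy.log", encoding="utf-8") as log:  # hypothetical file name
        for line in log:
            m = LINE.match(line)
            if m:
                requests_by_ip_hour[(m["ip"], m["hour"])] += 1

    # The ten busiest IP/hour combinations.
    for (ip, hour), n in requests_by_ip_hour.most_common(10):
        print(f"{ip} at {hour}:00 -- {n} requests")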
Conclusions

Tableau is an incredibly powerful tool for both analyzing and communicating data about our library's services, operations and facilities. It offers a wider range of chart types than the other tools, and it is very flexible. Although the learning curve is steep, I believe it can help us in our efforts to make our data more visible to both internal and external stakeholders. Because of that learning curve, I expect that the development of visualizations and dashboards in Tableau will be centralized, perhaps within a data visualization team.

Dedoose

Dedoose is inexpensive, web-based software for managing, excerpting, coding, and analyzing data from qualitative and mixed methods projects. It is also available in a desktop version. The Library initiated a subscription in 2016. Dedoose provides training videos, but the interface is fairly easy to figure out, so I didn't feel the need to view them before getting started. I did refer to them as I worked through the data, but only to learn about certain features, such as descriptors.

LibQUAL+ Survey Comments

For my sabbatical, I chose to analyze comments from the five years of LibQUAL+ surveys. Comments from most of the earlier surveys had been coded in Excel, but the data from the 2013 survey had yet to be analyzed. Additionally, the analysis in Excel was very simple and limited; Dedoose offers the ability to create a more comprehensive codebook and a more in-depth analysis. As with the quantitative survey data, the comments can be downloaded and imported into Excel, which is the ideal format for importing data into Dedoose. The raw data includes some of the same demographic data as the quantitative data: user group, discipline, library (branch), age group, and sex. Comments can be focused on one aspect of the library, or on multiple aspects. The coding of this data is still in progress (there are more than 2,000 comments), but I managed to code enough of it to be able to use all of the analysis tools Dedoose offers.

The first step in the process is to create a project, then to load the data. Dedoose can accept several different forms of data, including spreadsheets, text and Word files, images, audio, PDFs, and videos. Dedoose can also import data from other analysis software, including SPSS and NVivo. Finally, you can start with a blank document and paste or type text into it (Figure 25).

Figure 25. Dedoose can accept data from a variety of sources

When importing from a spreadsheet, Dedoose offers suggestions on how to prepare your Excel file, including making sure that there are column headings and that the data is formatted appropriately:

Change all demographic and other survey data column headers to appropriate labels that make sense for a descriptor variable – ex. for a question like 'Please indicate your age group,' you'll want to change the column header to 'Age Group.' Then if the data are numeric, ex. 1=10-25 years, 2=26-40 years, 3=41-50 years, you'd want to change the 1s to '10-25 years,' the 2s to '25-40 years,' and so on – this is pretty easy with a search and replace in Excel. The maximum length is 120 characters, but for readability we suggest less than 30 characters.
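For larger files, this recoding can be scripted rather than done with search and replace. Here is a minimal pandas sketch of the recoding Dedoose describes, using an invented comment export:

    import pandas as pd

    # Hypothetical comment export with numerically coded demographics.
    comments = pd.DataFrame({
        "age_code": [1, 2, 3, 1],
        "comment":  ["Love the quiet floor", "More e-books please",
                     "Staff are wonderful", "Too cold in winter"],
    })

    # Replace codes with labels, as Dedoose's import guidance recommends.
    age_labels = {1: "10-25 years", 2: "26-40 years", 3: "41-50 years"}
    comments["Age Group"] = comments["age_code"].map(age_labels)
    comments = comments.drop(columns="age_code")
    print(comments)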
Before uploading the data, I modified it so that the demographic information (descriptors) would correspond to the categories and values I used in the quantitative data file.

Once the data is imported, the home screen (Figure 26) starts to take shape. Dedoose automatically detects descriptor fields and assigns initial codes based on column headings in the spreadsheet, but these can be easily modified on the Descriptors and Codes screens, respectively.

Figure 26. Dedoose project home screen

Next, I started coding the data. I began by creating three codes representing the dimensions measured by the survey (Affect of Service, Information Control, and Library as Place), as well as a code ("Attitude") reflecting the general affect of the comment (e.g., positive, negative, mixed). I added further codes, creating a hierarchical structure, as I worked through the data (Figure 27):

- Affect of Service: Librarians; Named Individual; Staff (General); Technicians; Unnamed Individual
- Attitude: Mixed; Need/Request; Not satisfied/Negative; Satisfied/Positive
- Campus: Abbotsford; Chilliwack
- Frequency of Use
- Information Literacy
- Information Control: Access to Resources/Services; Books/Stacks; Borrowing Periods; Collections (General); Collections (Specific); Course Reserves; E-books; Fines; Interlibrary Loans; Journals; Research Databases; Textbooks; Videos/Media
- Services: AskAway/Electronic Reference
- Library as Place: Artwork; Availability of Equipment/Space; Computers/Computer Lab; Furniture Style/Type; Group Study; Group Study Rooms; Hours; Noise; Quiet Study; Smell/Odour; Temperature; Use of Space; Whiteboards
- Survey Method/Questions

Figure 27. Code structure for LibQUAL+ comments in Dedoose

Once the data has been coded, it's time to analyze it. Dedoose provides a powerful suite of analysis tools. Not all of them will be relevant to every project, but several are useful for this one; examples follow.

Descriptor x Descriptor x Code

This bar chart resembles a cross-tabulation, displaying the frequency of excerpts within the descriptor fields that are tagged with the selected code. In Figure 28, the two descriptor fields are User Group and Survey Year, and the code is Satisfied/Positive (a child code of Attitude). It allows us to compare the frequency of comments displaying a satisfied or positive attitude by respondents in each user group (e.g., lower-level undergraduates) across each year the survey was administered. By flipping the order of the two descriptors, we can easily change the view to show the attitudes of each user group in a particular survey year. In this chart we find, for example, that lower-level undergraduate students tended to indicate satisfaction or positive affect in their comments more frequently in 2013 than in previous years the survey was administered.

Figure 28. Descriptor x descriptor x code chart in Dedoose

Code Frequency x Descriptor Bubble Chart

This bubble chart displays the frequency with which codes are applied to comment excerpts. There are three code slots: Y axis, X axis, and bubble size; the colour of the bubbles is determined by the descriptor. Figure 29 applies Affect of Service to the Y axis, Information Control to the X axis, and Satisfied/Positive (Attitude) to the bubble size. These codes are then mapped to user group.
From this we can see that staff comment more frequently on their positive affect toward the library than do faculty, but that all of their comments are related to customer service, while faculty tend to comment more often on collections than on service.

Figure 29. Code frequency x descriptor bubble chart in Dedoose

Descriptor x Code Count Table

Figure 30 is actually a heat map displaying the number of excerpts that intersect each code and descriptor. For example, upper-level undergraduate students cite affect of service and satisfaction or positive affect in their comments more frequently than they do other aspects of the library, and more frequently than other user groups do.

Figure 30. Descriptor x code count table in Dedoose

Packed Code Cloud

Figure 31 illustrates the packed code cloud, also referred to as a word cloud. While word clouds have been heavily overused as a visual communication tool, they still manage to convey a great deal of information with little effort. The larger the text, the more often that aspect of the library is mentioned in comments.

Figure 31. Packed code cloud in Dedoose

Code Co-Occurrence Table

Figure 32 shows the frequency of comments where both of the intersecting codes are applied. In this example, affect of service occurs in 17 comments or excerpts where the writer has also indicated satisfaction or a positive affect.

Figure 32. Code co-occurrence table in Dedoose

Conclusions

Dedoose offers a powerful set of analytics tools for qualitative data. While the visualizations are not as pretty or polished as those produced by Tableau, it is much easier to use than the other systems discussed, and it fills a gap not filled by the other software examined in this report. While the UFV Library does not currently collect much qualitative data, many of the presenters at the most recent Library Assessment Conference (November 2016) favoured a mixed methods or qualitative approach, using qualitative data to enrich and explain results from quantitative data. Dedoose is relatively inexpensive, especially as compared to other software products in the same category (e.g., NVivo, Atlas.ti), and its web-based interface makes it possible to use it anywhere there's an Internet connection.

LibInsight

LibInsight[28] is one of a suite of web-hosted software products from Springshare, and is designed specifically for libraries – it can organize and consolidate most library statistics in one place, allowing libraries to analyze across datasets. LibInsight supports a wide range of datasets common to libraries, including:

[28] For my sabbatical, I arranged a free, one-month trial of LibInsight.

- E-book, database, and journal usage. Harvest COUNTER data directly from vendors supporting the SUSHI protocol or upload COUNTER reports yourself; analyze cost and usage data; run duplicates lists; and find out which resources are most and least used
- Circulation and acquisitions analytics. View usage and purchase trends, create custom report filters (e.g., patron type, item location or type), view circulation and purchases by classification, analyze year-over-year trends, and identify the most circulated titles
Track expenditures and revenues, view multi-year trends, create custom reports, and compare financial and circulation datasets to see how effective your selection decisions are Public services analytics. Import instruction, reference and other public services data to view trends and create custom reports Web site analytics. Harvest data directly from Google Analytics and LibGuides In short, anything a library counts can be uploaded (or manually entered) and analyzed. Data can also be analyzed across datasets, and you can create interactive, real-time dashboards that can be shared externally using Springshare’s widgets. The default home page (Figure 33) is immediately familiar to those using other Springshare products. From here you can select or manage a dataset, add or analyze data, create, edit or view a dashboard, and create widgets. Figure 33. LibInsight home page Sabbatical Report: An Assessment Strategy for the UFV Library 81 Public Services Statistics For more than 15 years, we have been compiling monthly statistics on our public services. Most of the data passes through at least 3 people in the library. It usually starts with a library technician, who tracks the data for the month, then sends it to the relevant technician-in charge, who compiles the data in a Word form and sends it along to the library’s administrative assistant, who finally transcribes the data into an annual spreadsheet. Over this time, very little has changed about the data we track. At one point, we did stop counting the number of government documents catalogued, but that was only because we were no longer receiving these documents through the depository service, so there was nothing to count. It’s not clear, however, exactly how some of this data is used. Every year we report financial, technical services and public services data to two organizations: the Council of Post-Secondary Library Directors (CPSLD), which compiles data for academic libraries in B.C., and the Council of Prairie and Pacific University Libraries (COPPUL), which compiles data for member libraries. The data reported to each organization is similar, and much of the service data comes from these statistics. But it’s not clear how some of the data, such as course reserves processed or media bookings requested, are actually used, if at all. Nonetheless, I wanted to see if LibInsight could help at least streamline the workflow, so that we didn’t need to have so many hands involved in recording the data. The first step was to prepare the data file for uploading. The public services statistics are stored in a spreadsheet, but it is not in a format that LibInsight can easily handle (Figure 34). Figure 34. Public services statistics in its original format LibInsight expects a more traditional format, with column headings and one “transaction” (e.g., monthly count) per line. This was a laborious process that took many hours. LibInsight also Sabbatical Report: An Assessment Strategy for the UFV Library 82 requires three extra fields: date (which must contain a value), “entered by” (which can be blank), and “internal notes” (which can also be blank) – see Figure 35. Figure 35. 
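To give a sense of what that reformatting involves, here is a minimal sketch using Python's pandas library. The file and column names are hypothetical – the actual spreadsheet uses a different layout – but the wide-to-long reshaping and the three extra fields are the essence of the task:

    import pandas as pd

    # Hypothetical wide layout: one row per statistic, one column per month,
    # e.g., columns: Statistic, 2016-04, 2016-05, ..., 2017-03
    wide = pd.read_excel("public_services_stats.xlsx")

    # Reshape to one "transaction" (a monthly count) per line, as LibInsight expects.
    long = wide.melt(id_vars=["Statistic"], var_name="Month", value_name="Count")

    # The three extra fields LibInsight requires: a date (must contain a value),
    # plus "entered by" and "internal notes" (both may be blank).
    long["Date"] = pd.to_datetime(long["Month"])
    long["Entered By"] = ""
    long["Internal Notes"] = ""

    long.to_csv("libinsight_upload.csv", index=False)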
The next steps involved creating the dataset, uploading the file, and letting LibInsight analyze it for the fields. I then reviewed the data structure, ensuring that the fields were correctly identified and formatted (Figure 36), and set up pre-defined entries (Figure 37), which pre-select certain values and reduce the number of fields that actually need to be filled in, reducing the opportunity for errors.

Figure 36. Editing the dataset fields in LibInsight

Figure 37. Pre-defined entries in LibInsight make data entry easier

Analyzing the data was the next step. Generating a report involves choosing a date range and applying any filters; from there, LibInsight provides a number of ways to analyze the data. Unfortunately, this particular dataset proved difficult to analyze. Ideally, each service area (e.g., instruction, circulation, media bookings, gate count) would have its own dataset, and a dashboard could pull them all together into one view. In spite of this setback, LibInsight would streamline the collection of our public services statistics and provide more detailed analysis than we are currently able to do with the data as it exists.

E-journal Usage

One of the types of data I find hardest to access when I need it, such as when completing a collection analysis for a program review, is data on our e-resource usage, both e-journals and e-books. There is data available on our shared network drive, but it's not always in the format I want and it's rarely comprehensive. LibInsight offers a potential solution, because it can both store and analyze the data, and dashboards can be created and made available in an easy-to-find, easy-to-access location, such as a LibGuide.

For this trial, I selected four vendors; for each, I needed cost data and the information required to set up a SUSHI record. I chose to import the JR1 COUNTER report data (number of full-text downloads) from January 2015 forward.

The default report summarizes cost and usage data by platform (Figure 38 and Figure 39). Not surprisingly, of the four vendors (American Chemical Society, Ebscohost, Oxford Journals, and SpringerLink), Ebscohost figures prominently: about 55% of the costs but 97% of the downloads, leading to a very low cost-per-use. SpringerLink, by contrast, comprises about 29% of the cost but less than 2% of the downloads (a cost-per-use of $7.61).

Figure 38. Analyzing database and journal cost and use data in LibInsight

Figure 39. Comparing journal and database data by platform in LibInsight

Monthly e-journal use over a two-year period (Figure 40) reveals peaks consistently appearing in March and November – not at all unexpected, given the rhythms of the academic year.

Figure 40. Monthly journal and database use over time in LibInsight

When we view the same data on an annual basis, we can see year-over-year differences (Figure 41); overall use for these four vendor platforms dropped slightly in 2016 from 2015 figures.

Figure 41. Yearly journal and database use over time in LibInsight

We can view trends across all platforms or by single platform (Figure 42).
Here we see that the number of downloads for Oxford Journals increased slightly, but the cost in 2016 increased more dramatically, raising the overall cost per download.

Figure 42. Year-over-year trends on database and journal use in LibInsight

One of the most valuable tools in this particular dataset is the ability to identify journal titles duplicated across platforms and associate this with use (Figure 43). LibInsight tells us that "there are 16,514 journals in 4 platforms. 14,996 are unique and 759 appear in more than one platform."

Figure 43. Summary of duplicate journal titles across platforms in LibInsight

We can also generate a list of the duplicate titles showing use in each of the platforms (Figure 44), or a list of journals from a single platform with total downloads for the period indicated and a "D" indicating duplication (Figure 45).

Figure 44. List of duplicate titles (all platforms) in LibInsight

Figure 45. LibInsight: List of SpringerLink journals with number of downloads

We can also view lists of journals with the highest (Figure 46) and lowest (Figure 47) use (e.g., number of downloads), across all platforms or within a single platform. Fifteen of the 60 journals (25%) from the American Chemical Society (ACS) have zero use (conversely, 75% of ACS journals have had at least one download). None of these journals is duplicated in other platforms.

Figure 46. LibInsight: Journals with the highest number of downloads (all platforms)

Figure 47. LibInsight: ACS journals with zero use in a two-year period

If data is available for individual databases in a platform, we can also view usage data by database (Figure 48) [29]. Not unexpectedly, Business Source Complete, promoted heavily to business students through instruction and our LibGuides, sits at the top of the list. What is most interesting about this list is that it reveals searches in databases to which we do not explicitly subscribe, such as Credo Reference and NewsBank.

[29] If we add cost data, we can also see cost-per-use data for each database. Since we subscribe to so many databases from Ebsco, it would be worth the effort to add this data.

Figure 48. LibInsight: Database usage for Ebscohost

Dashboards in LibInsight allow you to share data publicly (Figure 49). When compared to the other systems explored in this report, however, the dashboards function is quite limited. It offers only the most basic chart types (bar, column, line, pie); dashboards must be created row by row, using pre-determined configurations (1 wide chart, 2 medium charts, 3 small charts, or chart plus text); and each chart can measure only one variable.

Figure 49. Journal usage dashboard in LibInsight [30]

[30] This dashboard can be viewed online: https://ufv-demo.libinsight.com/public.php?id=2

With the current capabilities of LibInsight, we would need to export the data to something like Tableau to get more sophisticated analysis of our usage and cost data. For example, comparing database cost to cost-per-use can give us a visual basis for decisions about cutting resources – items that are both high-cost and high cost-per-use would be natural targets to look at first.
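As a rough illustration of the kind of analysis we would otherwise hand off to Tableau, the sketch below uses pandas to reproduce three of the measures discussed above – duplicate titles, zero-use titles, and cost-per-use – from hypothetical CSV exports (the file and column names are assumptions, not LibInsight's actual export format):

    import pandas as pd

    # Hypothetical exports: per-title downloads and per-platform subscription costs.
    use = pd.read_csv("jr1_downloads.csv")    # columns: Journal, Platform, Downloads
    cost = pd.read_csv("platform_costs.csv")  # columns: Platform, Cost

    # Titles appearing on more than one platform (duplicates).
    platforms_per_title = use.groupby("Journal")["Platform"].nunique()
    print("Duplicated titles:", (platforms_per_title > 1).sum())

    # Zero-use titles, counted per platform.
    zero_use = use[use["Downloads"] == 0].groupby("Platform").size()
    print(zero_use)

    # Cost per use by platform; high cost plus high cost-per-use marks
    # the natural first targets for review.
    summary = use.groupby("Platform", as_index=False)["Downloads"].sum()
    summary = summary.merge(cost, on="Platform")
    summary["CostPerUse"] = summary["Cost"] / summary["Downloads"]
    print(summary.sort_values(["Cost", "CostPerUse"], ascending=False))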
Conclusions

LibInsight has several advantages in its support of assessment activities:

- it is designed specifically to organize and consolidate library data;
- it uses the same interface as other Springshare products, making it immediately familiar, if not entirely intuitive;
- it offers automated harvesting of several different types of data, and the ability to upload data in Excel or CSV format; and
- the data analysis allows us to evaluate cost, trend, and use data (depending on the dataset), as well as to compare data across datasets.

LibInsight, however, is limited in its ability to visualize data in meaningful ways, and its dashboards offer limited versatility. The annual cost is quite reasonable, but the system does require a large investment of time in developing each dataset, particularly since existing data is not always available in the format needed for uploading.

LibInsight needs further study in order to fully evaluate its potential. For this reason, the library will be initiating a one-year subscription this fall as a focused pilot project. The details of the pilot have yet to be worked out, but the goals would be to identify one or two datasets that would be of widespread value to librarians and library staff, and to use as many features of LibInsight as are relevant to the selected dataset(s).

Final Thoughts

No one tool or system is going to be able to meet all of our needs when it comes to organizing, analyzing, visualizing and communicating library data. Each of the systems examined here has both advantages and limitations, and it is evident that we will have to continue to choose the tools that best meet our needs at any given time. This lends weight to the conclusion that we need to invest dedicated resources in assessment – it's unlikely that most of our librarians and technicians are going to be able to (or want to) invest the time needed to become competent in whichever systems we choose to use. Rather, it would make sense to concentrate the expertise in a small team that could harvest the data and build the dashboards and visualizations [31] to help them analyze the data however they need it.

[31] That's not to say, however, that others would not be welcome or encouraged to analyse data on their own, but hopefully it would be in consultation with the "data team."

A PROPOSED ASSESSMENT STRATEGY

"A goal without a plan is just a wish." (attributed to Antoine de Saint-Exupéry)

This strategy is not an assessment plan – it would be premature to offer a plan in the absence of a strategic plan. What it is, rather, is a plan of action for establishing a library assessment program and beginning to create a culture of assessment within the UFV Library. It builds upon work that has already been done (or at least initiated), and suggests steps we can take to aid in the strategic and assessment planning to come.

1. First and foremost, we need to create the organizational capacity to engage in assessment. We can't sustain a model where we carry out assessment "off the sides of our desks." A dedicated assessment librarian, with library technician support, will greatly aid in the development of this capacity.

2. Develop a plan for managing our data; this involves several pieces:
   a. Complete the data audit, including interviews with librarians and staff about their data practices and needs.
   b. Develop a comprehensive list of library metrics already in use.
   c. Consolidate our library data.
   d. Develop and/or implement a system or index to our data and metrics.
   e. Explore free and/or commercial systems for harvesting and storing datasets.

3. Create a centralized location for library dashboards that could provide easy access to data that is commonly requested or used.

4. Establish a schedule for harvesting the library data used in the dashboards.

5. Identify outcomes for library services. This would enable us to identify existing services, identify potential gaps, and identify useful metrics (existing or new).

6. Complete the analysis of LibQUAL+ data and comments to provide a longitudinal view of user perceptions of the library. Include the data in the library dashboards.

7. Identify and implement new assessment projects; two come immediately to mind:
   a. A post-occupancy study of the renovations in the Abbotsford library. In addition to gathering feedback on the renovated spaces, it would double as a follow-up to the original space use study and explore the question, "At what point do we consider the library full?" [32] It would use a mixed-methods approach.
   b. Usability testing on the new web site. While the overall design of the site will not change, it's likely that we may identify a need to re-organize content or the placement of elements. [33]

8. Develop an annual report for the Library. [34]

9. Integrate assessment planning into the strategic planning process. Begin by identifying exemplary library assessment plans from comparator institutions. Ensure that assessment is a consideration in any discussion around planning, projects, and services.

[32] Chilliwack Library would also be invited to participate in this aspect of the study.
[33] This would be considered an agile design process – gathering input from live users and making incremental changes based on the findings of those tests.
[34] An annual report could also serve as a source of data for institutional accountability documents as well, including the annual UFV Factbook and accountability report.

This is an ambitious agenda, but it can lay some important groundwork for future assessment activities. As I said at the beginning, my underlying premise is that assessment is both good and necessary. I hope that I have managed to present a compelling case for assessment in the UFV Library.
APPENDICES

Appendix A: Library Strategies for the Education Plan

Note: Many of the UFV Library's current, established activities support the Education Plan goals. The statements and strategies noted here describe activities that are new or will be significantly advanced. Each entry below gives the Education Plan goal, what we will do (or currently do), the strategies ("By…"), and the expected evidence ("Evidenced by…").

Prioritize Learning Everywhere

1.1 We will ensure that all UFV learners can access information competency support at point of need (time and place).
- By providing accessible and relevant online tutorials. Evidenced by: increased number and use of online tutorials; assessments of the use and effectiveness of tutorials.
- By integrating library instruction, resources and services into the Blackboard environment. Evidenced by: all Blackboard courses have a library presence, including a librarian contact.
- By revising the current library instruction program to create capacity to support targeted programs and courses. Evidenced by: library instruction supports information competency development within programs.

Commit to Flexibility and Responsiveness

2.1 Currently, we are responsive to emerging issues, needs and opportunities, able to address them in a timely manner.
- By establishing clear, relevant roles and scope of responsibilities for all library employees. Evidenced by: revised and up-to-date position descriptions.
- By employing a flexible model of decision-making. Evidenced by: library staff have increased autonomy in making decisions, appropriate to their level of authority.
- By making timely decisions on the basis of evidence. Evidenced by: decisions are timely; decisions are supported by evidence.

2.2 We will offer services and resources designed for inclusivity and responsiveness.
- By implementing responsive website design. Evidenced by: library website and other in-house online tools are device-neutral.
- By applying universal design principles to any re-designed/renovated library spaces. Evidenced by: library spaces are accessible to all.

2.3 We will support the learning needs of all members of the UFV community.
- By providing workshops/sessions on relevant and timely topics (such as open educational resources, research data management, altmetrics, copyright, scholarly publishing options, etc.). Evidenced by: increased number of workshops/sessions directed to faculty and staff.

2.4 We will make library spaces accessible.
- By increasing open hours. Evidenced by: library spaces accessible longer hours during periods of student need.
- By offering flexible learning spaces. Evidenced by: increased number of bookable group study rooms in both campus libraries; "makerspace" established in Abbotsford Library; spaces available where faculty can meet informally with students.

Collaborate across Boundaries

3.1 We will provide physical and virtual spaces where all members of the UFV community come together.
- By establishing an institutional repository. Evidenced by: HarvestIR is populated with faculty and student work, digitized publications and documents, and is recognized and well used across UFV programs.
- By including programming space in library renovations/re-designs. Evidenced by: Abbotsford and Chilliwack libraries both have appropriate space that can be used to host events.

3.2 We will bring the UFV community together in support of UFV goals.
- By collaborating with other UFV departments to host/sponsor events. Evidenced by: increased number of programs held in the library; increased number of program attendees.
- By collaborating with other UFV departments to create and publicize displays. Evidenced by: increased number of collaborative library displays.

Develop Local and Global Citizenship

4.1 We will support the Indigenization of UFV.
- By ensuring the library collection continues to be strong in Indigenous content. Evidenced by: increased number of resources from Indigenous authors.
- By ensuring the library collection continues to be strong in resources relating to Indigenizing the Academy. Evidenced by: increased number of resources on topic.
- By supporting program and course developers in Indigenizing their curriculum. Evidenced by: promotion of library guides and resources on topic.
- By implementing applicable recommendations from the Truth and Reconciliation Commission. Evidenced by: in process – the UFV Library is a member of a number of Canadian academic library consortia currently exploring this.

4.2 We will ensure that library spaces and services are welcoming to Indigenous students, employees and community members.
- By working with Indigenous Affairs and the Indigenous Student Centre to make UFV campus libraries welcoming and accessible to Indigenous students. Evidenced by: local Stó:lō culture reflected in art and images in campus libraries.

4.3 We will support UFV's international students to be successful.
- By collaborating with UFV International to develop targeted programs and services. Evidenced by: continued or expanded participation in international student orientation events; workshops for library staff on working with international students.

4.4 We will promote UFV's role in the development of the communities of the Fraser Valley.
- By establishing a UFV Archives (physical and virtual) that preserves the history and legacy of UFV as well as our communities. Evidenced by: a physical UFV Archives which is managed by a professional archivist; archival institutional (UFV) records are accessioned and preserved; agreements in place with community archives/memory institutions ensure complementary mandates and collections.

Integrate Experiential Learning

5.1 We will provide experiential learning opportunities in UFV libraries.
- By collaborating with the LibIT program to provide increased opportunities for program students in the library. Evidenced by: increased numbers of practicum students; positive feedback from students on the practicum experience.
- By offering the Library as client for course projects (e.g., Communications). Evidenced by: increased number of completed projects.
- By providing opportunities for students to gain co-curricular credits. Evidenced by: increased number of library opportunities for students to use on their co-curricular records.

Appendix B: Site Visit Interview Questions

1) What is your role at the library? (In other words, what are the major activities that define your job?)
2) Describe the assessment piece of your job. What percentage of your job is focused on assessment activities? How does it relate to your other responsibilities (if at all)?
3) Does your library have a strategic and/or assessment plan? Would you be willing to share it?
4) Do you feel that there is a culture of assessment in your library? Why or why not? If yes, how did your library get there?
5) Describe the types of assessment that your library engages in. Who is involved? How often do these activities occur?
6) Are there assessment activities for which you are not directly responsible? If so, what are they, and who takes the lead?
7) Do you have any internal (to the library) committees, teams, or ad hoc groups whose purpose centres on assessment? How often do they meet?
8) Does your library produce regular reports of assessment activities? In other words, do you communicate with stakeholders about your assessment activities? If so, what are they, and who are they for? Are they available online, and if so, where would I find them? If not, would you be willing to share one or two examples?
9) How do you manage data used in your assessment activities (e.g., inventory, storage, use)?
10) What systems (i.e., software, technology, management) do you use in your assessment practices? Can I see examples of what you've done?
11) What institutional resources do you use/have available to you in your assessment activities?
12) Do you partner with any departments outside the library in your assessment activities? What is the nature of the relationship?
13) Does your library/institution participate in regional or national surveys that include library assessment (e.g., LibQUAL+, NSSE)? If so, which ones, and how often? How has the data informed your library's operations?
14) What is the most exciting/intriguing/affirming assessment opportunity you have engaged in to date? What makes it so?

Appendix C: Value of the UFV Library, 2010-2011

Total cost of library operations: $2,842,246.00
Total library value [35]: $13,894,831.50
Return on Investment [36]: $4.89 ($13,894,831.50 ÷ $2,842,246.00)
Expenditures per student/faculty: $380.27
Value per student/faculty [37]: $1,859.01

Library Value Calculations (units; unit value; total value)

Physical space (study space, computers) [38]: 425,712; $25.00; $2,660,700.00

Use of physical volumes [39]
… UFV owned: 113,245; $26.12; $2,957,959.40
… Borrowed for UFV users from other libraries: 1,005; $26.12; $26,250.60

Articles accessed online [40]
… UFV subscriptions: 495,566; $31.50; $7,805,164.50
… Borrowed for UFV users from other libraries: 1,596; $31.50; $25,137.00

Reference and research consultations [41]
… Point-of-need interactions: 24,770; $15.00; $371,550.00
… UFV students using AskAway: 1,133; $15.00; $16,995.00
… In-depth consultations (hours) [42]: 21; $75.00; $1,537.50

Instruction [43]
… Classroom (hours): 311; $75.00; $11,662.50
… Classroom prep & follow-up (hours): 446; $75.00; $16,725.00
… Non-classroom (hours) [44]: 46; $25.00; $1,150.00

Total: $13,894,831.50

[35] See "Library Value Calculations" for details on how this was calculated.
[36] Amount of value returned to UFV for every $1.00 invested.
[37] The UFV Library considers students and faculty its primary community; in 2010-11 UFV had 7,059 FTE students and 415.3 FTE faculty (total = 7,474.3 FTE).
[38] This represents the total number of visitors to the UFV Library in 2010-11, for a variety of purposes: using computers, printing and photocopying, group meeting and study space, individual study space, use of services, use of collections. The unit value estimates the hourly cost of renting space that would provide some or all of these services/facilities, and the total is calculated based on an average visit of 15 minutes (which may actually be quite low).
[39] Assumes that access to a volume by borrowing it from the library is worth to a user 50% of the cost of purchasing the book; the calculation uses 50% of the average Amazon.com unit order cost (price + shipping) for library-like content (but it's important to note that many of the items we buy for the library are not available from Amazon.com, and many are more costly than this price would reflect).
[40] Assumes that a commercial pay-per-view charge fairly describes the value of accessing a scholarly article; the unit cost is based on the pay-per-view charge for ScienceDirect in the absence of a license ($31.50); the calculation uses 50% of the ScienceDirect charge to account for price differences among a wide range of disciplines and pay-per-view sources; data is not available for all of our full-text article sources, so the actual numbers are higher.
[41] Reference interactions build research skills and contribute to faculty research results; these figures represent the value of reference questions answered for our users in person, as well as by phone and email; the unit value is widely used and adapted in libraries, and was developed by the Massachusetts Library Association.
[42] Number of hours of individual research consultations provided by appointment; assumes a fair representation of the value of a research consultation, based on charges levied by other libraries for requests coming from outside users (i.e., those not affiliated with the institution).
[43] Assumes that the value received by students in classroom instruction is comparable to that received in individual consultations; total value is calculated at 50% of the unit value to account for multiple students per interaction.
[44] Assumes that non-classroom instruction (i.e., tours) is of much less value than instruction that enables students to successfully complete their assignments.

Appendix D: Data Inventory

Note: Some of the data collected in the data inventory has been omitted here for space purposes. I have kept a copy of the original spreadsheet with the data.

Wiki (page title; last update; description; linked documents)

- Abby head count reading week; 2009-02-19; Feb 09
- ADMIN Library Statistics; 2008-06-13; linked documents: TSstats Annual (2003+)- PDF; Database Use Stats 2007/08 PDF; Collection Item Holdings Stats Annual (from 2003+); Public Services 2007-08.xls; Circulation_5_years.xls; PSstats_cumulative_overview.xls; Psstats_janapril07_abb.xls; Psstats_janapril07_chill.xls; Psstats_septdec07_abb_rev.xls; Psstats_septdec07_chill.xls
- ADMIN UFV Library Retention Document Fall 2008; 2008-12-12; Instruction; Reference; Facilities; Customer Service; Policies; Collections; ILL
- Annual Statistics; 2015-04-15; Data collection procedures
- Chwk head count reading week; 2009-03-04; Feb 09
- CM Collection Analysis Reports; 2009-05-06; Data collection procedures
- IT Sirsi Scheduled Reports; 2008-03-20; List of reports to March 2007
- Library Staff Retreat 2014: Service and Space Planning; 2014-05-06; Agenda for the retreat; linked documents: LibQUAL 2013: UFV results; UFV Library usage & activity trends; Pre-retreat survey results
- Reference statistics definitions; 2011-11-29
- TS Reports; 2013-10-10; Data collection procedures

Shared Network Drive (file name; type; last saved; description)

- 2012 2013 COPPUL Statistics; PDF; 2015-Mar-18; COPPUL Statistics 2012-13 (expenditures, establishment and collections; emerging trends; use, facilities, and services); COPPUL Salaries 2013-14
- 2012 Usage reports; Folder; Database usage statistics by vendor
- 2013 Usage reports; Folder; Database usage statistics by vendor
- 2013-2014 COPPUL Statistics; PDF; 2016-Sep-02; COPPUL Statistics 2013-14 (expenditures, establishment and collections; emerging services; use, facilities, and services); COPPUL Salaries 2014-15
- 2013-2014 COPPUL Statistics; Excel; 2016-Aug-01; Data for COPPUL Statistics 2013-14, Salaries 2014-15 (includes definitions)
- 2013-2014 COPPUL Statistics - revised; Excel; 2016-Sep-02; Data for COPPUL Statistics 2013-14, Salaries 2014-15 (includes definitions)
- 2014_15 Usage reports; Folder; Database usage statistics by vendor
- 2014FallTerm-1; Excel; 2015-Jan-19; Data for AskAway service by institution (hours per week; number of sessions: handled by us, handled by others, via qwidget, sessions with other institutions' learners, total sessions handled per institution, sessions handled per hour)
- 2015_16 Usage reports; Folder; Database usage statistics by vendor
- 2016-20 Library Ed Plan Strategies; PDF; 2016-Aug-26; 2016-20 Education Plan outcomes for the library (includes specific metrics)
- Abby Reading Week Head count Feb 09; Excel; 2012-Feb-03; Hourly headcount (19:00 through 21:00) in Abbotsford library for the period February 16-19, 2009
- ABBY STATS; Folder; Monthly public services stats (circulation, holds, reserves, ILL, and media) for Abbotsford library, for fiscal years 2009-10 through 2016-17
- Annual Statistics; Folder; Annual public services stats (circulation, holds, reserves, ILL, and media) for Abbotsford and Chilliwack libraries, for fiscal years 2002-03 through 2016-17
- AskAway Question Types Spring 2014; PDF; 2014-Jun-26; Pie chart showing percentage of questions by type of question
- AskAway Statistics 2013 Summer; Excel; 2013-Sep-13; AskAway statistics by institution for summer 2013 semester, broken down by month
- AskAway Statistics 2013 Winter; Excel; 2013-Sep-13; AskAway statistics by institution for winter 2013 semester, broken down by month
- CARL ILL Stats; Excel; 2013-Oct-16; Lending and borrowing data for the period April 1, 2012 to March 31, 2013
- CARL-ABRC_Stats_2012-13_Final_NoSans_Comment_17Feb-Fev2015; PDF; 2015-Feb-18; CARL Statistics 2012-2013 (expenditures and collections; emerging services; use, faculties and services) (Note: UFV is not part of CARL, but this could be useful for benchmarking and peer comparisons)
- CEP Library facts; Folder; Space (square metres); gate count; number of seats; data on design, construction and move; collections; personnel as of official opening of Chilliwack library on September 7, 2012
- CEP Planning; Folder; 2016-Jul-28; CEP space requirements, inventory, and shelving requirements
- CHILLIWACK STATS; Folder; Monthly public services stats (circulation, reserves, government publications processed, entrance counts, etc.) for Chilliwack library, for years 2007 through 2017
- Circulation by Department 2011_2012; Excel; 2012-May-17; Call number ranges for circulation counts by department; circulation (total minus library use transactions)
- Circulation by Department 2011_2012_mastercopy; Excel; 2012-Apr-30; Call number ranges for circulation counts by department; circulation (total minus library use transactions)
- Circulation by Item Home Location 2015_16; Excel; 2016-Apr-06; Item count, circulation count and turnover rate by item home location
- Circulation by Item Type 2015_16; Excel; 2016-Apr-06; Item count, circulation count, turnover rate, % of collection, % of circulation, and Bonn's use factor by item type
- CollectionItemStats 2003-2009; Excel; 2011-Jun-09; Number of items by library (ABB, CHILL, WEBSITE) and item type (data available by fiscal year for 2003-04 through 2009-10)
- CollectionItemStats Annual; Excel; 2015-Mar-27; Number of items by library (ABB, CHILL, WEBSITE) and item type (data available by fiscal year for 2003-04 through 2013-14)
- CollectionItemStats Annual2013_2014; PDF; 2014-Apr-23; Number of items by library (ABB, CHILL, WEBSITE) and item type for 2013-14 fiscal year
- CollectionItemStats Annual2014_15; Excel; 2015-Apr-10; Number of items by library (ABB, CHILL, WEBSITE) and item type (data available by fiscal year for 2003-04 through 2014-15)
- CollectionItemStats Annual2014_15; PDF; 2015-Apr-10; Number of items by library (ABB, CHILL, WEBSITE) and item type for 2014-15 fiscal year
- CollectionItemStats Annual2015_16; Excel; 2016-Apr-06; Number of items by library (ABB, CHILL, WEBSITE) and item type
- CollectionItemStats resorted for Kim 2012; Excel; 2013-Nov-25; Number of items by library (ABB, CHILL, WEBSITE) and item type for 2011-12 fiscal year (data retotaled/sorted for reporting?)
- CollectionItemStats resorted for Patti 2013; Excel; 2013-Nov-25; Number of items by library (ABB, CHILL, WEBSITE) and item type for 2012-13 fiscal year (data retotaled/sorted for reporting?)
COPPUL Billing Folder coppullans Excel 2016-Dec-13 Statistics on lending/document delivery to COPPUL member libraries for the period April 2015 to March 2016 coppulrecd Excel 2016-Dec-13 Statistics on books/articles received from COPPUL member libraries for the period October 2014 to March 2016 Copyright Questions Folder Record of copyright questions from 2014 and 2016 - some files are password-protected CPSLD Final 2014-2015 PDF 2016-Jul-22 CPSLD Statistics Report 2015 - 2016 PDF 2016-Nov-14 CPSLD Statistics Report 2015-16 CPSLD Stats 2012-13 FINAL Excel 2014-May-26 Data for CPSLD Statistics Report 2012-13 CPSLD Stats 2012-13 final Digital PDF 2014-May-26 CPSLD Statistics Report 2012-13 CPSLD Stats 2013-14 Final PDF 2014-Nov-17 CPSLD Statistics Report 2013-14 CPSLDStats2008-2009FINAL Excel 2014-May-26 Data for CPSLD Statistics Report 2013-14 Database_Usage 2011-2012_enhanced Excel 2012-Jun-22 Database_Usage 2012 Excel 2013-Aug-28 Database search and full-text statistics for 2012 Database_Usage 2013_2014 Excel 2015-May-20 License end date, pricing, cost per use, full-text views (JR1 and JR1a), record views, abstract views, visits, page views, DB1 searches, DB1 sessions, playbacks, BR2 section requests for 2013-14 fiscal year COPPUL billing reconciliation for the period October 2014 to September 2015 Sabbatical Report: An Assessment Strategy for the UFV Library CPSLD Statistics Report 2014-15 (institutional data; Database search and full-text statistics for fiscal years 2008-09 through 2011-12 113 File Name Type Last Saved Description Database_Usage 2014_2015 Excel 2016-Jun-08 License end date, pricing, cost per use, full-text views (JR1 and JR1a), record views, abstract views, visits, page views, DB1 searches, DB1 sessions, playbacks, BR2 section requests for 2014-15 fiscal year Database_Usage 2015_2016 Excel 2016-Aug-11 License end date, pricing, cost per use, full-text views (JR1 and JR1a), record views, abstract views, visits, page views, DB1 searches, DB1 sessions, playbacks, BR2 section requests for 2015-16 fiscal year Database_Usage_2009_10_enhanced Excel 2012-Apr-10 Database search and full-text statistics for fiscal years 2007-08 through 2009-10 DatabaseUse Stats07_08 PDF 2008-May-07 Database search and full-text view statistics for fiscal year 2007-08 donations2006-07 Word 2007-Mar-27 List of items donated by year E-BOOK Item Statistics minus EBL_DDA Word 2015-Apr-08 Sirsi Report on Item Statistics where category 1 is not EBL_DDA and item type is E-BOOK Education Program Folder Slides and presentations from copyright workshops ER databases A-Z (2012-2013) with Faculty Structure and Subjects Diane's version March 2013 Excel 2013-Aug-28 Database price, search and full-text views by database, consortium, faculty, discipline, and type of database (EDS-compatible, abstract & index, aggregator, publisher package, primary source) for fiscal year 2012-13 ereserve lists Folder 2017-Feb-10 Reserve readings lists by course 2012-2017 Faculty department visits Folder Final 2014-2015 CPSLD Stats Excel 2016-Jul-22 Data for CPSLD Statistics Report 2014-15 Final 2014-2015 CPSLD Stats with UFV added Excel 2016-Jul-29 Revised data for CPSLD Statistics Report 2014-15 FTE Loads Excel 2016-May-24 Data on FTE instruction loads 2011/12 to 2015/16 Letters to Retired Adjunct Faculty and Visiting Scholar Folder LibQual Folder 2012-Dec-03 Results from LibQUAL surveys 2005, 2006, 2007, 2010, 2013 Library Web Site Inventory Excel Data for Library 2025 strategic plan collected during faculty department visits 
Letters to adjunct faculty, emeritus faculty, honorary faculty, visiting scholars (possible metrics?) 2011-Sep-09 Inventory of library web site Sabbatical Report: An Assessment Strategy for the UFV Library 114 File Name Type Last Saved Description MISSION STATS Folder Number of holdings by item type and location Excel 2016-Jan-28 Number of items (total titles, shadowed titles, total call numbers, shadowed call numbers, total copies, shadowed copies) by home location and item type Number of holdings by item type and location Text 2016-Jan-28 Sirsi report on item statistics by home location Number of Titles Catalogued Word 2014-Dec-05 Number of titles catalogued for period January to June and July to December for years 2011 through 2014 Open Access Serials in CUFTS Excel Orientations [term] Excel Semester by semester data on scheduled instruction Orientations Data Excel 2016-May-13 Instruction data 2002/03 to 2015/16; includes analysis of data on students, sections, and hours, including top 5 courses Orientations Data Tableau Excel 2016-Jun-29 Instruction data formatted for use in Tableau 2002/03 to 2015/16 Part time estimate Librarians Fall 2013 Excel 2013-Oct-22 Budget estimates for part-time librarian coverage for fall 2013 semester Part time estimate Librarians Summer 2013 Excel 2013-Aug-26 Budget estimates for part-time librarian coverage for summer 2013 semester Part time estimate Librarians Winter 2013 Excel 2013-Apr-24 Budget estimates for part-time librarian coverage for winter 2013 semester part time librarians budget May 2010 - March 2011 Excel 2013-Jun-17 Patrontypestats Annual Excel 2010-May-04 Data on patrons by type (data for fiscal years 2006-07 through 200809) PD Reports Folder Reports from PD opportunities (possible metric?) Presentations Folder Slides and presentations from open access workshops Monthy public services stats (circulation, reserves, media, reference questions, entrance counts, repairs, Midwest selection cards, location transfer, community users, group study room use) for Mission library, for the period February 2012 to March 2013 2014-Apr-15 Data on open access collections in CUFTS for fiscal year 2010-11 Sabbatical Report: An Assessment Strategy for the UFV Library Budget estimates for part-time librarian coverage May 2010 - March 2011 115 File Name Type Last Saved Description Program Reviews and New Program Proposals Folder Ref Desk Schedules Folder 2017-Mar-04 Reference desk staffing schedules Summer 2012 to Winter 2014 (summary of hours by librarian) Reference Desk Statistics Folder Repair Stats -Abby Excel Saturday Entrance Count Folder Saturday entrance counts for the period Winter 2013 through Winter 2017 Schedules Folder Service desk schedules for PS and casual staff back to 2012 - there's potential for operational metrics here about staffing levels, but not sure what Semester End Reports Folder 2017-Jan-25 Circulation statistics for reserve items by semester 2011 to 2016 (Sirsi report) Statistics Folder 2013-Dec-19 Data on number of instruction classes by semester, 2005-2014 (for use in Public Services Statistics) Student input Folder Data for Library 2025 strategic plan collected from student events TSstats Annual 2003-2007 Excel 2008-May-05 Data on collections budget, acquisitions and cataloguing activity for fiscal years 2003-04 through 2008-09 TSstats Annual 2008 PDF 2009-Sep-17 Data on collections budget, acquisitions and cataloguing activity for fiscal years 2003-04 through 2008-09 TSstats Annual 2009 PDF 2010-May-03 Data on collections 
budget, acquisitions and cataloguing activity for fiscal years 2003-04 through 2009-10 TSstats Annual 2010_2011 PDF 2011-Apr-18 Data on collections budget, acquisitions and cataloguing activity for fiscal years 2003-04 through 2010-11 Collection analyses for program reviews, accreditations, new programs, and more Monthly reference service statistics (number of transactions) by type (directional, basic, in-depth) and mode (in person, AskAway, email, appointment) for the period January 2005 through April 2013 2016-Dec-21 Number of items repaired for calendar years 2013 through 2016 (note: data missing for period July 2013 through August 2014) Sabbatical Report: An Assessment Strategy for the UFV Library 116 File Name Type Last Saved TSstats Annual2011_2012 PDF 2012-Apr-02 Data on collections budget, acquisitions and cataloguing activity for fiscal years 2004-05 through 2011-12 TSstats Annual2011_2012 (2) PDF 2012-Apr-10 Data on collections budget, acquisitions and cataloguing activity for fiscal years 2004-05 through 2011-12 TSstats Annual2011.2012 Excel 2012-Jul-12 TSstats Annual2012_2013 Excel 2014-Apr-11 Data on collections budget, acquisitions and cataloguing activity for fiscal year 2012-13 TSstats Annual2013_2014 PDF 2014-Apr-24 Data on collections budget, acquisitions and cataloguing activity for fiscal years 2005-06 through 2013-14 TSstats Annual2013_2014 Patti's version to test Excel 2014-Apr-24 Data on collections budget, acquisitions and cataloguing activity for fiscal years 2005-06 through 2013-14 TSstats Annual2014_2015 PDF 2015-Apr-13 Data on collections budget, acquisitions and cataloguing activity for fiscal years 2005-06 through 2014-15 TSstats Annual2014_2015 Excel 2016-Mar-23 Data on collections budget, acquisitions and cataloguing activity for fiscal years 2005-06 through 2014-15 TSstats Annual2015_2016 Excel 2016-Apr-07 Data on collections budget, acquisitions and cataloguing activity for fiscal years 2005-06 through 2015-16 Usage Statistics Folder Sabbatical Report: An Assessment Strategy for the UFV Library Description Collection item statistics by item type for fiscal years 2003-04 through 2011-12 Monthly statistics on JSTOR DDA usage from December 1, 2015 forward 117 Appendix E: UFV Library Metrics This list was developed out of the documents identified in Appendix D: Data Inventory. While I can’t attest to the comprehensiveness of the list, it provides a starting point for exploring metrics as they relate to assessment activities. Category Library Metric Collections Average price paid per item (departments and LibGen) Collections Average price paid per item (print reference, including standing orders) Collections Bonn's use factor of library materials by item type Collections Cost of database by budget code and faculty Collections Cost of database by discipline Collections Cost of database by faculty Collections Cost of database per student FTE Collections Cost of database searches per student FTE Collections Cost of full-text views per student FTE Collections Cost per use by database Brief Description/Definition Reporting Bonn's Use Factor is the % of circulation divided by the % of collection, showing relative over and under use (Index is 1.00). For example, books are 53% of the collection, but 70% of the circulation, so the score is 1.39 (showing above average use). Item type includes types that have circulated or are eligible to circulate. 
• Cost per use by database: calculated as price of database / use; "use" is defined in different ways (full-text view, abstract, visit, search, etc.)
• Estimated value of items donated and added to collection
• Number of active periodical subscriptions (direct)
• Number of active periodical subscriptions (Ebsco)
• Number of authority records added
• Number of copies/items ordered
• Number of copies/items paid
• Number of donated items added to collection
• Number of electronic documents (free, including government publications) catalogued
• Number of full-text views by database
• Number of government publication volumes catalogued
• Number of invoices created
• Number of items catalogued, by item type
• Number of items catalogued, by library
• Number of items donated to library
• Number of items in Abbotsford library by item type
• Number of items in Chilliwack library by item type
• Number of items labeled WEBSITE by item type
• Number of library items by item home location: home locations are ABCIRCDESK, CHCIRCDESK, DICTIONARY, CURRICCULUM, ELL, VIDEO, PBSPINNER, STACKS, FOREIGN, REFERENCE, PERIODICAL, TAPE-CBNET
• Number of library items by item type: item type includes types that have circulated or are eligible to circulate
• Number of loans of library materials by program or department: total minus in-house uses
• Number of print serial issues checked in
• Number of received items in backlog on the last day in March
• Number of searches by database
• Number of serial claims processed in Ebsconet
• Number of standing order subscriptions (copies)
• Number of times library items have circulated by home location: circulation count includes charge item, charge reserve, renew item, renew reserve, and use item; home locations as listed above
• Number of times library items have circulated by item type: circulation count includes charge item, charge reserve, renew item, renew reserve, and use item; item type includes types that have circulated or are eligible to circulate
• Number of titles in open access collections in CUFTS
• Number of vendors
• Number of print serial issues claimed
• Percentage of circulation count by item type: item type includes types that have circulated or are eligible to circulate
• Percentage of collection by item type: item type includes types that have circulated or are eligible to circulate
• Total $ paid in fiscal year for collections
• Total encumbrances rolled over at end of fiscal year
• Total free balance rolled over at end of fiscal year
• Turnover rate of library items by home location: turnover rate = circulation count / item count; home locations as listed above
• Turnover rate of library items by item type: turnover rate = circulation count / item count; item type includes types that have circulated or are eligible to circulate
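Several of the derived metrics above are simple ratios that can be implemented directly as calculated fields in Tableau, the tool explored in Appendix F. A minimal sketch, assuming hypothetical field names [Circ Count] and [Item Count] rather than fields from our actual exports; each calculation would be its own calculated field:

// Bonn's use factor by item type: share of circulation / share of collection.
// TOTAL() aggregates across the whole partition (e.g., all item types),
// so each item type is compared against the collection; 1.00 = average use.
(SUM([Circ Count]) / TOTAL(SUM([Circ Count])))
/
(SUM([Item Count]) / TOTAL(SUM([Item Count])))

// Turnover rate: circulation count / item count
SUM([Circ Count]) / SUM([Item Count])

Cost per use works the same way: the price of the database divided by whichever definition of "use" (full-text views, searches, visits) is chosen.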
Instruction
• Number of classes taught: by librarian, year, semester, course
• Number of hours spent teaching: by librarian, year, semester, course
• Number of students in library instruction sessions: by librarian, year, semester, course

Interlibrary Loan
• Number of books/articles received from COPPUL member libraries
• Number of borrowing requests received
• Number of borrowing requests sent
• Number of copies received
• Number of copies supplied
• Number of items supplied to COPPUL member libraries
• Number of lending requests filled
• Number of lending requests received
• Number of originals received
• Number of originals supplied
• Total number of videos/DVDs borrowed
• Total number of videos/DVDs loaned

Professional Development
• Number &/or % of library staff attending PD events
• Number &/or % of library staff sharing notes from PD events

Public Services
• Entrance count (reported by campus)
• Number of media bookings supplied (reported by campus)
• Number of basic reference transactions
• Number of bills sent
• Number of CILS items (requests?) processed
• Number of community computer users (reported by campus)
• Number of directional questions asked
• Number of group study room bookings (reported by campus)
• Number of holds placed (reported by campus)
• Number of in-depth reference transactions in person
• Number of individual reference appointments provided
• Number of instruction classes taught
• Number of interlibrary loan items received
• Number of interlibrary loan items requested
• Number of interlibrary loan items sent
• Number of interlibrary loan materials circulated
• Number of items repaired
• Number of items weeded
• Number of library materials circulated (Sirsi report)
• Number of media bookings requested
• Number of overdue notices sent
• Number of reference transactions by email (reported by campus)
• Number of reference transactions on AskAway
• Number of reserves processed
• Number of users in library (reported by campus)
• Number of library materials used in-house (in-library use items)
• Number of library materials used in-house (use items)

Retired Faculty and Staff
• Number of retired staff with honorary status using the library
• Number of retired staff with library privileges (honorary)
• Number of retired faculty with library privileges (honorary or emeritus status)
• Number of retired faculty with honorary or emeritus status using the library

Users
• Number of patrons by type: patron types as defined in Sirsi

Archives and Special Collections (COPPUL)
• Are archives and special collections managed by the library? (Y/N)
• Are university records included in the collections recorded at 1.2? (Y/N)
• Expenditures related to these collections ($)
• Manuscripts and archives (m)

Collections Use (COPPUL)
• Number of initial loans

Digital Collections (COPPUL)
• # of articles
• # of other items
• # of theses and dissertations
• Size of digital files (TB)
• Total # of items in institutional repository
• Total number of digital objects

Document Delivery Traffic (COPPUL)
• Borrowing: total number of requests sent (filled and unfilled)
• Lending: total number of requests received (filled and unfilled)
• Number of copies received
• Number of copies sent
• Number of originals received
• Number of originals sent
• Total number of requests filled by other institutions
• Total number of requests received from other institutions and filled

E-Publishing (COPPUL)
• # of grants awarded
• % of funds dedicated to open access publishing
• Does your library manage an Author's Fees fund? (Y/N)
• Does your library store faculty research data? (Y/N)
• Total amount of funds awarded

Electronic Resources (COPPUL)
• Number of searches (queries) in databases or services
• Number of sessions (logins) to databases or services
• Number of successful full-text article requests

General (COPPUL)
• Titles held (all formats)

Library Instruction and Facilities (COPPUL)
• Number of library presentations to groups
• Number of seats
• Number of total participants in group sessions reported in 3.1
• Total number of reference transactions
• Turnstile count

Library Materials Expenditures (COPPUL)
• Collection support
• One-time resource purchases
• Ongoing resource purchases
• Total library materials

Local Characteristics (COPPUL)
• Benefits are included in expenditures for salaries and wages (Y/N)
• Law library statistics are included (Y/N)
• List of all libraries included
• Medical library statistics are included (Y/N)
• Number of graduate FTE
• Number of undergraduate FTE
• Total enrolment FTE

Other Expenditures (COPPUL)
• Fringe benefits
• Other operating expenditures
• Total library expenditures

Parts of the COPPUL survey
• Emerging Trends in Research Services
• Salaries: report data for each employee (salary on July 1; category of the position; years of professional experience; years of professional experience in the reporting institution)
• Use, Facilities and Service

Personnel (COPPUL)
• Casual staff FTE
• Librarians FTE
• Other professionals FTE
• Support staff FTE
• Total professionals FTE
• Total staff FTE

Salaries and Wages Expenditures (COPPUL)
• Casual staff
• Professional staff
• Support staff
• Total staffing expenditures

Collections, Electronic (CPSLD)
• Electronic monographs: count the titles that would be monographs if issued in print format, i.e., non-serial publications of any length issued in electronic format instead of, or in addition to, print format. Include books owned or leased by the library. Government publications are included, as are free monographs on the Web catalogued in the OPAC or specifically linked to the library's web site.
• Electronic serial titles: report the total number of unique electronic serial titles that you currently acquire and to which you provide access; include both purchased and non-purchased titles; do not include duplicate counts of serial titles; report each title once, regardless of how many subscriptions or means of access you provide for that title (i.e., if a title is accessible through multiple databases, count it only once); include titles from aggregated packages; electronic serials acquired as part of a bundle or an aggregated package should be counted at the title level, even if they are not catalogued, as long as the title is made accessible directly by the library (e.g., through a finding aid). (See the sketch following this category's list.)
• Streaming media: count streaming videos or other media listed in the library catalogue or linked to the library's web site, whether purchased, leased or free on the Web
• Total electronic titles in collection
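A side note on the CPSLD de-duplication rule above: counting each serial title once, however many packages provide access to it, maps naturally onto a distinct count. A minimal sketch, assuming a hypothetical title list with one row per title per package and an assumed [Serial Title] field:

// Unique electronic serial titles, counted once regardless of how many
// packages or databases provide access to each title
COUNTD([Serial Title])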
Collections, Physical (CPSLD)
• Back issues periodicals: include journals, magazines, and newspapers received in print, microform, or CD formats; count volumes if they are known, otherwise 1 year = 1 volume; include annual index volumes
• Monographs: a volume is a physical unit of any printed or processed work contained in one binding, encasement or other clear distinction, which has been catalogued as part of the collection and given an individual barcode; include titles in microform or CD (not individual cards of fiche except when 1 card = 1 title); include annuals; exclude periodicals; use explanatory notes for any unusual inclusions (e.g., documents, technical reports, individually catalogued maps)
• Other audio formats: count all physical sound recordings (e.g., LP records, cassette tapes, compact disks); count items intended to be used together as one unit (e.g., opera on 2 CDs = 1 unit); if two or more media are included (e.g., print and cassette tape), count as a single unit all items to be used in conjunction with each other
• Other video formats: count all physical visual formats (e.g., slides, snapshots); do not count individual slides unless they do not form part of a set (i.e., 1 slide set = 1 unit); if two or more media are included (e.g., print & slides), count as a single unit all items meant to be used in conjunction with each other
• Total current print subscriptions: count titles of journals, magazines, and newspapers currently received in print, microform, or CD formats (e.g., Canadian Newsdisc = 8 titles); exclude annuals (counted in (a) above); include gift subscriptions and those received on exchange; include departmental subscriptions only if they are accessible to the college community (i.e., listed in the library catalogue and available for use); count duplicate subscriptions (i.e., if the library subscribes to two copies of a title, count 2); include Statistics Canada periodicals if they are treated like a periodical; exclude subscriptions to electronic periodical indexes and abstracts
• Total volumes in library collection
• Videos & films: count all physical video & film formats; count physical items (e.g., 2 film reels = 2 units, a series of 24 videos = 24 units)
Computing Infrastructure (CPSLD)
• Number of public workstations: for each campus library, count all public workstations that have Internet access (these are also counted in 9b above as part of the seat count)

Facilities & Hours (CPSLD)
• Total hours open per week (September - April): for each campus with library personnel, count total operating hours per week; add figures for each library to provide the total for (g)
• Total library area in square metres: for each campus with library personnel, provide library area in square metres (to convert from square feet to square metres, multiply sq. ft. by 0.0929, e.g., 30,000 sq. ft. x 0.0929 = 2,787 sq. metres); include space for books and other library materials, space for storage of AV equipment if control of this equipment is the library's responsibility, library classrooms, study stations, seminar and study rooms, and workspace for library personnel; exclude areas used solely for janitorial, custodial and mechanical storage or services, lobbies, vestibules, building corridors, and other general access areas; add figures for each library to provide the total for (e)
• Total number of seats: for each campus with library personnel, count all study spaces for library users; include seats at computer workstations, equipment carrels, etc.; exclude seats in staff areas, offices, meeting rooms, and other areas not normally occupied by users of library materials; add figures for each library to provide the total for (f)
• Total reference hours per week (September - April): for each campus with library personnel, count total hours of reference service provided; add figures for each library to provide the total for (h)

Institutional Budget (CPSLD)
• Base ministry grant: as outlined in the grant allocation sent by the ministry to each institution in March (usually) for the next fiscal year
• Other operating revenues as reported from the institution's Audited Financial Statement: include all items that are considered in calculating your institution's operating budget; exclude institutional capital (i.e., new building funds, upgrading of present buildings, roads, etc.); use explanatory notes to provide details on any exceptions or variations from the norm
• Total institutional budget

Library Automated Systems (CPSLD)
• Acquisitions
• Cataloguing (note: Discovery service, ERM, and Link resolver were added in 2011-12)
• Discovery service (added 2011-12)
• ERM: electronic resource management (added 2011-12)
• Interlibrary loans
• Link resolver (added 2011-12)
• OPAC
• Periodicals check-in
• Primary source of bibliographic records
• Secondary source of bibliographic records

Library Expenses, capital & operating (CPSLD)
• Audio-visual: count all expenditures on items listed in 5.1b + 5.1c + 5.1d above
• Current print periodicals: count all expenditures on current subscriptions to print periodicals and indexes as included in 5.1g above
• Electronic monographs: count all expenditures on items listed in 5.2a above
• Electronic serial titles: count all expenditures on items listed in 5.2c above
• Monographs: count all expenditures on items listed as monographs in 5.1a above
• Other: count all other library expenditures (i.e., those not related to personnel and collections); include other costs (e.g., printing, postage, interlibrary loans, mileage, conferences, supplies, etc.) only if they are paid by the library's budget; use explanatory notes to indicate any variations from the norm
• Other electronic resources: electronic resources not captured elsewhere (e.g., statistics, A&I, databases of art works, etc.); added 2012-13
• Personnel (salaries & benefits): count all salary & benefit expenditures for library personnel as listed in 4 above; exclude personnel not covered by this survey (e.g., AV equipment distribution, Media Production or IMS); include personnel working under special grant funding
• Special funding envelope for collections: include all expenditures from special funding sources (e.g., one-time capital grants)
• Streaming media: count all expenditures on items listed in 5.2b above
• Subtotal, electronic collections
• Subtotal, physical collections
• Total collections expenditures
• Total library expenditures

Library Personnel (CPSLD)
• FTE librarians: include all part-time and contract librarian hours, the Library Director (whether an administrative position or not), and positions funded by special grants
• FTE library staff: exclude personnel who are entirely devoted to AV equipment and media production/IMS activities
• FTE other professionals: include staff members who are not librarians in the strict sense of the term, such as computer experts, systems analysts or budget officers
• FTE student aides: include student aides and work-study employees
• Library personnel: all personnel in FTE (not headcounts)
• Sub-total FTE library personnel: provide the subtotal of personnel before counting student aides and work-study employees
• Total FTE library personnel

Number of Campuses (CPSLD)
• Number of campuses with library staff: campuses or centres with library staffing
• Number of campuses without library staff: campuses or centres with facilities characterized as reading rooms and/or learning centres which have no staff, or may have staff paid for by other parts of the institution

Type of Library (CPSLD)
• Type of library: B.C. government categorization (e.g., college, institute, university)

Use (CPSLD)
• Direct circulation: count all items which are charged out for use, whether the use is inside (e.g., reserve) or outside the library; include self/online renewals; exclude items charged out to other libraries on interlibrary loan (included in 6g below)
• Directional questions
• Gate count: count all traffic upon exit from the library (usually provided via an electric eye on the library security system)
• In-library use: count those items being used in the library and re-shelved by library employees, but that have not been charged out in a direct circulation transaction
• Interlibrary loans received (include all formats): count all items actually received via interlibrary loan; include items received via any delivery method and from or to any type of library (including agencies with library holdings) in any part of the world; all NET, MEC, OJAC, and fileserver project figures are supplied by the ELN office; exclude intercampus loans within your institution
• Interlibrary loans sent (include all formats): count all items actually sent via interlibrary loan; include items sent via any delivery method and from or to any type of library (including agencies with library holdings) in any part of the world; all NET, MEC, OJAC, and fileserver project figures are supplied by the ELN office; exclude intercampus loans within your institution
• Number of participants at group presentations: report the total number of participants in the presentations; count all students receiving bibliographic instruction, whether in tutorial groups, tours, or library skills classes; personal one-on-one instruction in the use of sources should be counted as a reference transaction
• Number of presentations to groups: report the total number of library instruction sessions during the year. Count sessions presented as part of formal bibliographic instruction programs, including class presentations, orientation sessions and tours. If the library sponsors multi-session credit courses that meet several times over the course of a semester, each session should be counted. Presentations both on and off the premises should be included when they are sponsored by the library. If you are using sampling, please include a footnote; include internet research or database searching classes; include classes taught to faculty and other employee groups
• Reference questions: an information contact that involves the knowledge, use, recommendation, interpretation or instruction in the use of one or more information sources by a member of the library staff. Information sources include printed and non-printed materials, machine-readable databases (including computer-assisted instruction), catalogues and other holdings, records and, through communication or referral, other libraries and institutions, and persons both inside and outside the library. Include information and referral services. If a contact includes both reference and directional services, it should be reported as one reference transaction. When a staff member uses information gained from a previous use of information sources to answer a question, report it as a reference transaction, even if the source is not consulted again during this transaction. Duration should not be an element in determining whether a transaction is a reference transaction. Sampling of a typical week may be used to extrapolate for a full year (CARL definition). Count all questions handled (i.e., regular and extended reference questions); include questions associated with electronic searching; include electronic reference questions; include real-time reference interactions (i.e., virtual reference, AskAway); exclude questions associated specifically with a bibliographic instruction class; exclude directional questions, as these are counted separately
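The sampling provision in the reference questions definition implies a simple extrapolation. A minimal sketch, assuming a hypothetical [Sample Week Transactions] measure and an assumed integer parameter [Weeks Open] (e.g., 48):

// Estimated annual reference transactions from a typical-week sample
SUM([Sample Week Transactions]) * [Weeks Open]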
• Total reference transactions

Users (CPSLD)
• Fee amount ($)
• FTE faculty: provide FTE (not head count) of faculty employees (include librarians if applicable); approximate if the actual number is not available
• FTE students: based on data supplied by the Ministry of Advanced Education
• Is there a community borrowers fee?

Collaborate across Boundaries (Education Plan 2016-20)
• Amount of space available in the library to host events (square metres)
• Number of bookable group study rooms
• Number of items in HarvestIR (institutional repository): also types of items? Faculty vs student items? Library items?
• Number of library exhibits produced in collaboration with UFV programs and groups
• Number of makerspace events in Abbotsford library
• Number of participants at makerspace events in Abbotsford library
• Number of participants in programs hosted in library spaces
• Number of programs hosted in library spaces
• Number of programs with items in HarvestIR (institutional repository)
• Number of spaces available for faculty to meet with students: use metrics as well?

Commit to Flexibility and Responsiveness (Education Plan 2016-20)
• "Decisions are timely": need to establish a benchmark; what is the ideal time frame for making decisions?
• Number &/or % of decisions supported by evidence (data)
• Number &/or % of spaces that are accessible
• Number &/or % of systems (web site, other in-house online tools) that are responsive and device-neutral
• Number of hours open: access to library spaces; during September - April? The entire year?
• Number of library staff with increased autonomy to make decisions: appropriate to level of authority; how do we measure autonomy? And increased from what?
• Number of position descriptions that have been revised/updated
• Number of workshops directed to faculty and staff

Develop Local and Global Citizenship (Education Plan 2016-20)
• "Make UFV campus libraries welcoming and accessible to indigenous students": how do we measure this? By collection? "Assessment measures developed in collaboration with partners (Indigenous Affairs, Indigenous Student Centre)"
• Extent of UFV archives (m)
• Number &/or % of library staff participating in events for international students
• Number &/or % of library staff participating in workshops for international students
• Number of art pieces and images in library spaces reflecting local Stó:lō culture
• Number of digital objects in UFV archives
• Number of events hosted for international students
• Number of resources in library collections from indigenous authors: need some way to identify them in the catalogue, HarvestIR, etc.
• Number of resources in library collections on the topic of Indigenizing the Academy: need some way to identify them in the catalogue, HarvestIR, etc.
• Number of workshops (PD events) for library staff on working with international students
• TRC guidelines: what metric could we use?
• Use of library guides and resources on Indigenizing the Academy: what constitutes "use"?
Integrate Experiential Learning (Education Plan 2016-20)
• Number of CCR projects developed by the library
• Number of library projects completed through collaboration with students in UFV courses
• Number of library projects initiated in collaboration with students in UFV courses
• Number of positive experiences expressed by LIBIT students who've had a practicum in the UFV library
• Number of practicum students from the LIBIT program
• Number of students with CCR credit from the library

Prioritize Learning Everywhere (Education Plan 2016-20)
• % of UFV courses with a library presence in Blackboard
• Number of online tutorials: need to define what constitutes an online tutorial
• Number of UFV programs with an information competency map: a map refers to the path within a program demonstrating where students have explicit opportunities to develop information competency
• Use of online tutorials: need to specify what constitutes a use

Appendix F: Tableau Community Forums

Thread 1: Displaying Date Parts as Text Names
https://community.tableau.com/thread/231453

Colleen Bell, 24-Mar-2017 01:51
Displaying Date Parts as Text Names
This question has been Answered.

I'm using a parameter and calculated field to change the display based on different dimensions. I've created dimensions - Month, Weekday, and Hour - based on date parts of two fields: Date and Start Time. I formatted them to display the full name or 12-hour representations, rather than the integer value. I've created a parameter and a calculated field as follows:

CASE [Time Parameter]
WHEN "Month" THEN [Month]
WHEN "Day" THEN [Weekday]
WHEN "Hour" THEN [Hour]
END

When I place it on the row shelf, it returns integer values, which aren't nearly as understandable as the month names, weekdays, or 12-hour clock. I've also tried using the DATEPART function, but get the same results. I've tried googling a solution, but if one has been published, it is eluding me. How can I get a textual instead of numeric format?
Thanks,
Colleen
[workbook attached]

JIM DEHNER, 24-Mar-2017 06:26 (in response to Colleen Bell) [Correct Answer]
1. Re: Displaying Date Parts as Text Names
Hi Colleen, see if this works for you - DATENAME returns the literal:

CASE [Time Parameter]
WHEN "Month" THEN DATENAME('month',[Date])
WHEN "Day" THEN DATENAME('weekday',[Date])
WHEN "Hour" THEN DATENAME('hour',[Start Time])
END

Jim
[screenshot]

COLLEEN BELL, 24-Mar-2017 09:13 (in response to Jim Dehner)
2. Re: Displaying Date Parts as Text Names
Thanks, Jim. I thought I had tried DATENAME, but maybe I was dreaming it. It worked great for months and weekdays. For hours, I still get the hour (on the 24-hour clock). I would love to see hour displayed on the 12-hour clock (e.g., 9 AM, 6 PM), as it is in the rest of my viz (but if I can't, I'm not going to worry about it). Any thoughts?
JIM DEHNER, 24-Mar-2017 09:21 (in response to Colleen Bell)
3. Re: Displaying Date Parts as Text Names
Yes, I know how to make that change, but I am out till later today. I will send it over when I get back. Glad to help. Please mark the response correct to close the thread.
Thanks,
Jim

JIM DEHNER, 24-Mar-2017 11:54 (in response to Jim Dehner)
4. Re: Displaying Date Parts as Text Names
Hi - just got back. The way your data is structured, this took a little more than I thought, and I had to go with the brute force method, but your Spacial Time calculation becomes:

CASE [Time Parameter]
WHEN "Month" THEN DATENAME('month',[Date])
WHEN "Day" THEN DATENAME('weekday',[Date])
WHEN "Hour" THEN (IF [Hour] > 12 THEN STR([Hour]) + ' PM' ELSE STR([Hour]) + ' AM' END)
END

It should look like this when done: [screenshot]
Hope it meets your needs.
Jim

JIM DEHNER, 24-Mar-2017 11:56 (in response to Jim Dehner)
5. Re: Displaying Date Parts as Text Names
Oops - obviously wrong. It is:

CASE [Time Parameter]
WHEN "Month" THEN DATENAME('month',[Date])
WHEN "Day" THEN DATENAME('weekday',[Date])
WHEN "Hour" THEN (IF [Hour] > 12 THEN STR([Hour]-12) + ' PM' ELSE STR([Hour]) + ' AM' END)
END

[screenshot]

COLLEEN BELL, 25-Mar-2017 00:59 (in response to Jim Dehner)
6. Re: Displaying Date Parts as Text Names
Thanks! Worked like a charm. The only change I made was to add a condition for 12 PM, as follows:

CASE [Time Parameter]
WHEN "Month" THEN DATENAME('month',[Date])
WHEN "Weekday" THEN DATENAME('weekday',[Date])
WHEN "Hour" THEN (IF [Hour] = 12 THEN STR([Hour]) + ' PM'
    ELSEIF [Hour] > 12 THEN STR([Hour]-12) + ' PM'
    ELSE STR([Hour]) + ' AM' END)
END

JIM DEHNER, 25-Mar-2017 07:36 (in response to Colleen Bell)
7. Re: Displaying Date Parts as Text Names
Glad that worked out - it is kind of a brute force way to do it.
Jim
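A footnote to this thread: the final calculation still renders midnight (hour 0) as "0 AM". A minimal sketch that handles both midnight and noon, assuming the same integer [Hour] field (values 0-23):

// 12-hour clock label from a 0-23 integer hour
// (hour 0 -> "12 AM", hour 12 -> "12 PM", hour 13 -> "1 PM")
(IF [Hour] % 12 = 0 THEN '12' ELSE STR([Hour] % 12) END)
+ (IF [Hour] < 12 THEN ' AM' ELSE ' PM' END)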
Thread 2: Using a parameter to select multiple dimensions
https://community.tableau.com/thread/231454

Colleen Bell, 24-Mar-2017 02:01
Using a parameter to select multiple dimensions
This question has been Answered.

I'm working with a viz that uses 2 dimensions in the columns and 1 dimension in the rows. I've created parameters for each shelf, and the associated calculated fields. But I'm having difficulty getting 2 dimensions on the columns shelf. I suspect it's just incomplete knowledge of Tableau (I've only been working intensively with it for a few weeks). I've tried googling the problem, but haven't yet found a solution. Here's what I have as my calculation:

CASE [Timeframe Parameter]
WHEN "Month/Weekday" THEN [Month], [Weekday]
WHEN "Month/Hour" THEN [Month], [Hour]
WHEN "Weekday/Hour" THEN [Weekday], [Hour]
END

I know the problem is with the stuff following THEN, and that it has something to do with data types, but I have no idea how to fix it. Any assistance would be appreciated.
Thanks,
Colleen
PS I'm using version 10.2
[workbook attached]

JIM DEHNER, 24-Mar-2017 12:16 (in response to Colleen Bell)
1. Re: Using a parameter to select multiple dimensions
Hi Colleen - saw you had this question also. What you want to do is not possible: the WHEN-THEN part of a CASE statement is a one-to-one relation, and if you try to nest or use multiple statements with the same WHEN, you will only get the result for the first statement. Think you will need to break it into separate parameters.
Jim
Multiple values in a case statement [link]

COLLEEN BELL, 25-Mar-2017 00:59 (in response to Jim Dehner)
2. Re: Using a parameter to select multiple dimensions
That's disappointing, but I've managed to make it work with two parameters. Not as elegant, but it does the job. Thanks.

JIM DEHNER, 25-Mar-2017 07:35 (in response to Colleen Bell)
3. Re: Using a parameter to select multiple dimensions
Sorry you had to use 2 parameters. Thanks for the badge - always appreciated.
Jim
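A footnote to this thread: a CASE statement can't return two fields, but one workaround (not raised in the thread) is to concatenate the two date-part names into a single string dimension; the trade-off is one combined header rather than two nested ones. A sketch, assuming the string-valued [Month], [Weekday], and [Hour] fields from Thread 1:

// One combined dimension per parameter choice
CASE [Timeframe Parameter]
WHEN "Month/Weekday" THEN [Month] + " / " + [Weekday]
WHEN "Month/Hour" THEN [Month] + " / " + [Hour]
WHEN "Weekday/Hour" THEN [Weekday] + " / " + [Hour]
END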
Thread 3: No heatmap data showing on polygons/background image
https://community.tableau.com/thread/231458

Colleen Bell, 23-Mar-2017 17:24
No heatmap data showing on polygons/background image
This question has been Answered.

I'm using Tableau 10.2, and I want to overlay data on a background image (floor plan). I actually managed to do it in another file, but in doing so I somehow ended up with corrupted data. So I'm recreating it step by step. I've managed to create the polygons, but the data won't show. Oddly enough, if I change the marks to a filled map, the data shows up - just not when it's displaying the polygons. Any suggestions? This is my first time working with a background image and polygons. I've found a lot of help so far on the web, but not for this particular problem.
Thanks,
Colleen
[workbook attached]

LEI CHEN (Ambassador), 23-Mar-2017 19:23 (in response to Colleen Bell)
1. Re: No heatmap data showing on polygons/background image
Hello Colleen,
Fantastic viz! Please try modifying [Average Occupancy (%)] into:

{ FIXED [Location] : AVG([Count]/[Capacity])*100 }

And re-place it on the color mark.
Regards,
Lei

COLLEEN BELL, 23-Mar-2017 19:37 (in response to Lei Chen)
2. Re: No heatmap data showing on polygons/background image
Thanks, Lei, for such a quick response. That fixed the heatmap issue, but the filters aren't working (not sure if they were working before, because I didn't have data). So it has pretty colours, but it doesn't tell much of a story. I'm attaching the revised workbook (it has the whole file, not just the one worksheet).
[workbook attached]

LEI CHEN (Ambassador), 23-Mar-2017 19:59 (in response to Colleen Bell)
3. Re: No heatmap data showing on polygons/background image
Hello Colleen,
Thanks for sharing the workbook! For the Month/Day/Hour filters in the Map worksheet, I tested and found that, after changing them to "Add to Context", they begin to work. I don't quite understand the reason why; anyway, for your reference.
Regards,
Lei

COLLEEN BELL, 23-Mar-2017 22:32 (in response to Lei Chen)
4. Re: No heatmap data showing on polygons/background image
Thanks so much, Lei. It worked perfectly! I've been puzzling over this for 2 days.
Colleen

LEI CHEN (Ambassador), 23-Mar-2017 23:28 (in response to Colleen Bell)
5. Re: No heatmap data showing on polygons/background image
So happy to hear that! And I'm really impressed by your beautiful and creative viz!!
Regards,
Lei

SHINICHIRO MURAKAMI (Ambassador), 23-Mar-2017 23:50 (in response to Lei Chen)
6. Re: No heatmap data showing on polygons/background image
FYI, here is a link where you can find the LOD and filtering layer explanation: Filters and Level of Detail Expressions.
Thanks,
Shin
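A footnote to this thread: the "Add to Context" fix follows from Tableau's order of operations. Context filters are applied before FIXED level-of-detail expressions are computed, while ordinary dimension filters are applied afterward, so a FIXED calculation ignores them. Restating Lei's calculation with that behaviour noted (field names as in the thread):

// Computed after context filters but before regular dimension filters;
// the Month/Day/Hour filters only affect this average once added to context
{ FIXED [Location] : AVG([Count] / [Capacity]) } * 100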
Appendix G: BC Library Conference Panel Details

3x3 in Search of an Assessment Plan
Date: Thu Apr 20 2017, 9:00am–10:15am, Salon E

Description
Three librarians at three different libraries in search of an assessment plan. How do you do it when assessment is but one of several responsibilities, and when your library's assessment practices are at different levels of development? Hear three different stories about the challenges, successes, and process of assessment planning at some of BC's newer universities. Participants will have the opportunity to consider a planning strategy for their own library's assessment efforts.

Speakers
Colleen Bell, University of the Fraser Valley
Amy R M Paterson, Thompson Rivers University
Laura Thorne, University of British Columbia, Okanagan Campus

My Slides and Notes

6556 FTE students, 724 faculty, 678 staff
Pretty flat structure
Librarians:
• Service providers (reference, instruction)
• Liaisons/selectors (multiple liaison areas)
• Functional specialists/managers (multiple areas)
• UL designate (1 or more committees)
Library technicians:
• Service providers (circulation, reference)
• Functional specialists (1 area)
• Lead technicians (3 FTE)

No assessment librarian:
• UL + me (off the side of the desk)
• Trying to build a culture of assessment - very slow going
Strategic plan in progress (for several years)
Some assessment activities:
• LibQUAL+ (2005, 2006, 2007, 2010, 2013)
• Occasional surveys (e.g., hours)
• Space Use Study ("fish or cut bait")
• Strategic plan consultations

Stated expectations for assessment position (also tech interest)
4-month sabbatical:
• Data audit
• Systems for managing data/assessment
• Metrics
• Methodologies
• Tools (e.g., data analysis)
• Site visits
• Literature/strategic documents review
• Scholarly communication
• Draft plan/framework (tied to library/institutional strategic initiatives)

Wiki, shared network drive
Mostly inputs/outputs, not a lot of outcomes or impact metrics
Data is touched by a lot of hands as it is compiled:
• Workflow for monthly PS Stats
• Annual reporting to consortia, institutional documents (e.g., factbook)
• Collection/service decisions, some operational decisions?
Considerations:
• Personally identifiable information -> privacy, FIPPA
• Research ethics review (especially if you want to share assessment results)
• Aggregate vs individual data?

Who is responsible for it? What is it? When is it collected? Where is it archived? Why is it collected? How is it used?

Excel:
• Used by most to store/analyze data
• Basic pivot tables, formulas
LibInsight:
• Variety of predefined data types + custom
• Upload other datasets
• Interactive dashboards
BlueCloud Analytics:
• Automated harvesting of data
• Tied directly to ILS (SirsiDynix)
• Requires login (no public view?)
• Incomplete data
Dedoose:
• Qualitative data analysis
• Quantitative analysis of qualitative data
• Requires login (no public view)
Tableau:
• Data visualization
• Interactive dashboards
• Share publicly
• Quantitative analysis of qualitative data

It's a lot - how do you figure out which ones are important?

Documents (all available on the UFV web site):
• Strategic Directions
• Strategic Enrolment Management (SEM) Plan
• Education Plan
• Institutional Learning Outcomes (ILOs)
• Strategic Research Plan
• Institutional Accountability Report
Strategic Goals:
• Provide the best undergraduate education in Canada
• Be a leader of social, cultural, economic, and environmentally responsible development in the Fraser Valley
• Be innovative, entrepreneurial, and accountable in achieving our goals
SEM Plan:
• Recruitment: domestic, international, aboriginal, transfer, graduate
• Student success: access, transition, retention, graduation
Thematic Areas:
• Community, Justice, and Cultural Engagement
• Environment and Sustainable Development
• Human Development, Health and Well-Being
• Teaching, Learning, and Cognition
• Technology, Modelling and Applications
Education Plan (shifting the locus of control for learning needs to the learner):
• Prioritize learning everywhere
• Commit to flexibility and responsiveness
• Collaborate across boundaries
• Develop local and global citizenship
• Integrate experiential learning

Library services include reference, instruction, interlibrary loans, and course reserves. Library operations refer to specific processes or functions performed within the library, such as acquisitions, cataloguing, systems, communications, and marketing & promotion. Collection assessment activities within the plan will be informed largely by Patti Wilson's sabbatical leave report from 2014, which details a number of methods and tools for evaluating and assessing the UFV Library's collections.

Handout

SELECTED RESOURCES FOR ASSESSMENT LIBRARIANS
3x3 in Search of an Assessment Plan :: BC Library Conference :: April 20, 2017

Amy Paterson, Thompson Rivers University, apaterson@tru.ca
Colleen Bell, University of the Fraser Valley, colleen.bell@ufv.ca
Laura Thorne, UBC Okanagan, laura.thorne@ubc.ca

Websites & Blogs

Value of Academic Libraries (VAL)
ACRL web site on assessing the value and impact of academic libraries; in addition to the VAL report, it offers a blog and an extensive bibliography on topics such as student retention, achievement and engagement, faculty research productivity and teaching, and communicating value.
http://www.acrl.ala.org/value/

LibValue Toolkit
Describes a range of methodologies and projects to assess teaching and learning, scholarly reading, comprehensive value, information or learning commons, digitized special collections, and e-books. Also provides a bibliographic database of more than 1,000 citations. Culmination of a 3-year study led by Carol Tenopir and Paula Kaufman.
http://www.libvalue.org/about/toolkit

Information is Beautiful
Web site and blog by David McCandless; the focus is on data visualization. Truly beautiful.
http://www.informationisbeautiful.net/

Everyday Analytics
Blog by Myles Harrison; the focus is on using tools and techniques appropriately.
http://www.everydayanalytics.ca/

Listservs

ARL-ASSESS
Open to anyone interested in library assessment. The focus is on academic libraries, but it's a good way to find out about professional development opportunities, tools, and methods.
https://groups.google.com/a/arl.org/forum/#!forum/arl-assess

Conferences & Workshops

Library Assessment Conference
Biennial (even years) international conference on assessment in libraries of all types. Held in the US. Next: Fall 2018 in Houston, TX.
http://libraryassessment.org/

Canadian Library Assessment Workshop
Biennial (odd years) workshop with a focus on developing skills and strategies for library assessment. Next: October 26-27, 2017 in Victoria, BC.
http://www.carl-abrc.ca/strengtheningcapacity/workshops-and-training/canadian-library-assessment-workshop/

International Conference on Performance Measurement in Libraries and Information Services
Biennial (odd years) international conference, also known as "Northumbria." Held in the UK. Next: July 31-August 2, 2017 in Oxford, UK.
https://libraryperformance.org/

International Evidence Based Library and Information Practice Conference
Biennial (odd years), held in various countries around the world. Next: June 18-21, 2017 in Philadelphia, PA.
http://library.usask.ca/ceblip/eblip/eblip-conferences1.php

Qualitative and Quantitative Methods in Libraries Conference (QQML)
Annual, usually held in Europe. Next: May 23-26, 2017 in Limerick, Ireland.
http://www.isast.org/

Southeastern Library Assessment Conference
Biennial (odd years), held in the southeastern US. Next: November 13-14, 2017 in Atlanta, GA.
http://www.southeasternlac.info/

Association of Institutional Research (AIR) Forum
Annual conference, held in the US. Not specifically focused on libraries, but it covers many of the same tools and issues as academic libraries. Next: May 30-June 2, 2017 in Washington, DC.
http://forum.airweb.org/2017/pages/home

Data Analysis & Visualization Software

Microsoft Excel
For basic statistical analysis and charts, Excel is a powerful tool that most of us already have in our toolkit. Take the time to learn how to use pivot tables and functions.
https://voyanttools.org/ TAPoR (http://tapor.ca/home) is the updated and expanded version, but requires login. Books & Articles Baird, B. J. (2004). Library collection assessment through statistical sampling. Lanham, MD: Scarecrow Press. Barbrow, S., & Hartline, M. (2015). Process mapping as organizational assessment in academic libraries. Performance Measurement and Metrics, 16, 34-47. https://doi.org/10.1108/PMM11-2014-0040 Broady-Preston, J., & Lobo, A. (2011). Measuring the quality, value and impact of academic libraries: The role of external standards. Performance Measurement and Metrics, 12, 122135. https://doi.org/10.1108/14678041111149327 Sabbatical Report: An Assessment Strategy for the UFV Library 164 Carlsson, H. (2016). Library assessment and quality assurance - creating a staff-driven and userfocused development process. Evidence Based Library and Information Practice, 11(2), 2833. https://doi.org/10.18438/B81W5X Crump, M. J., Freund, L., & Carrico, S. (2012). Meeting the needs of student users in academic libraries: Reaching across the great divide. Oxford, UK: Chandos Publishing. Dugan, R. E., Hernon, P., & Nitecki, D. A. (2009). Viewing library metrics from different perspectives: Inputs, outputs, and outcomes. Santa Barbara, CA: Libraries Unlimited. Durante, K., & Wang, Z. (2012). Creating an actionable assessment framework for discovery services in academic libraries. College & Undergraduate Libraries, 19, 215-228. https://doi.org/10.1080/10691316.2012.693358 Few, S. (2012). Show me the numbers: Designing tables and graphs to enlighten (2d ed.). Burlingame, CA: Analytics Press. Fox, R., Doshi, A., & Association of Research Libraries. (2011). Library user experience. Washington, DC: Association of Research Libraries. Hernon, P., Dugan, R. E., & Matthews, J. R. (2014). Getting started with evaluation. Chicago, IL: ALA Editions. Matthews, J. R. (2007). The evaluation and measurement of library services. Westport, Conn: Libraries Unlimited. Mengel, E., & Lewis, V. (2012). Collaborative assessment: North American academic libraries' experiences using the balanced scorecard to measure performance and show value. Library Management, 33, 357-364. https://doi.org/10.1108/01435121211266131 Miller, J. (2014). A method for evaluating library liaison activities in small academic libraries. Journal of Library Administration, 54, 483-500. https://doi.org/10.1080/01930826.2014.953387 Munde, G., & Marks, K. E. (2009). Surviving the future: Academic libraries, quality, and assessment. London: Chando Publishing. Oakleaf, M. J. (2012). Academic library value: The impact starter kit. Syracuse, NY: Dellas Graphics. Pedramnia, S., Modiramani, P., & Ghavami Ghanbarabadi, V. (2012). An analysis of service quality in academic libraries using LibQUAL scale. Library Management, 33, 159-167. https://doi.org/10.1108/01435121211217144 Secolsky, C., Ed, & Denison, D. B. (Eds.). (2011). Handbook on measurement, assessment, and evaluation in higher education. New York: Routledge. Sabbatical Report: An Assessment Strategy for the UFV Library 165 White, A. C., & Kamal, E. D. (2006). E-metrics for library and information professionals: How to use data for managing and evaluating electronic resource collections. New York: NealSchuman Publishers. Wright, S., & White, L. S. (2007). SPEC kit 303: Library assessment. Washington, DC: Association of Research Libraries. Yau, N. (2011). Visualize this: The flowing data guide to design, visualization, and statistics. Indianapolis, IN: Wiley. 
Journals
● College & Research Libraries
● Evidence Based Library and Information Practice
● Journal of Academic Librarianship
● Journal of Library Administration
● Library Quarterly
● Performance Measurement & Metrics
● portal: Libraries & the Academy
● Reference & User Services Quarterly