Conference Program Abstracts for Empirical Librarians 2016
Sarah Arnold, University of North Carolina at Chapel Hill
At the University of North Carolina at Chapel Hill (UNC), subject guides are created using third-party software called LibGuides. A subject guide is a website that gives students an overview of library resources related to a specific topic; examples include "Resources on Endangered Species" and "How to Find Book Reviews." A common problem at universities, including UNC, is the underutilization of subject guides during the research process. The UNC Libraries' User Experience department has begun developing a usability test that will involve surveying and then interviewing users, with the goal of improving the overall user experience of our subject guides and thus increasing their use over time.
Our survey was created in Qualtrics through UNC's campus-wide license. In addition to the typical demographic and library-use questions, we will use Qualtrics's heatmap feature to perform a basic A/B test of the tabbed versus side-navigation layouts in LibGuides. The survey will provide baseline data on how our users think about subject guides and will also serve as a recruiting tool for our follow-up usability testing. We plan to have the survey portion of our study complete prior to the conference. For the usability testing, we will schedule three rounds with five users each. Each round will focus on a small portion of the subject guide layout: tabbed versus side navigation, number of columns per page, and number of items in a content box.
For our presentation, we will discuss our planning process, which includes applying for IRB exemption (study #15-3000), developing a survey in Qualtrics, and laying out usability tasks for users to complete. Related topics that we will discuss are other online usability tools that assist in user testing, best practices for planning a usability study of this scope and size, and collaboration between librarians and LIS students for usability testing. We believe conference attendees will benefit from our experiences because we will provide insight on how to develop a usability study and demonstrate how doable it is even if you have no experience with user testing.
Terry W. Brandsma, UNC Greensboro
User-centered design for academic library websites must take the needs, wants, and limitations of users into account. The UNCG University Libraries conducted a comprehensive, year-long usability study that examined the design, content, and functionality of the Libraries' primary website. Formal usability testing methods, including online card sorting, logfile analysis, task-based usability sessions, focus groups, and surveys, were employed with the goal of informing our user-centered redesign. Testing methods and best practices will be discussed, as well as obstacles that can arise and ways to overcome them. Images of the website before and after the redesign will be presented, along with implications for future usability testing.
Kim Copenhaver & Alyssa Koclanes, Eckerd College
The purpose of this research study was to examine shifts in the volume and complexity of reference questions received at a small liberal arts college library following the implementation of a web-scale discovery service. Using the Warner model of reference classification, reference questions were reviewed and classified from one academic year prior to the implementation of EBSCO Discovery Service (EDS) and one academic year following implementation, to evaluate the percent change in question volume and the degree of question complexity as defined by Warner. The findings were significant: overall reference activity declined 34% following the integration of a web-scale discovery service into the research process. This presentation will include a discussion of the literature review process, the data-gathering techniques, the question-coding methodology, and the data analysis undertaken to draw meaningful conclusions that may guide reference service redesign in libraries.
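The percent-change comparison described above can be sketched in a few lines. The category labels follow the Warner classification, but the counts below are invented for illustration and are not the study's data.

```python
# Illustrative sketch only: question counts per Warner category for the
# year before and the year after discovery-service implementation.
# These numbers are made up; only the method mirrors the abstract.
before = {"non-resource": 120, "skill-based": 300, "strategy-based": 80, "consultation": 40}
after = {"non-resource": 110, "skill-based": 150, "strategy-based": 70, "consultation": 26}

def percent_change(old: int, new: int) -> float:
    """Percent change from old to new, rounded to one decimal place."""
    return round((new - old) / old * 100, 1)

# Overall shift in reference activity, plus the shift per category.
overall = percent_change(sum(before.values()), sum(after.values()))
by_category = {c: percent_change(before[c], after[c]) for c in before}
print(overall, by_category)
```

With these invented counts the overall figure happens to land near the 34% decline the study reports, which is the kind of headline number this method produces.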
Nina Exner, North Carolina Agricultural and Technical State University
Academic librarians – and all librarians – often want to perform research. But with one class (or fewer!) in research methods in library school, we also find it a challenge. This presentation will offer a pilot model based on inductive analysis of interviews and observations from a group of librarians who attended an Institute to improve their research skills. What challenges did they face, and how are they working through them? How does their library help or hinder them? What other people and organizations do they turn to for support in learning to be researchers? The presentation will look at these and other issues and share the preliminary version of the research competency development model. We will then discuss the model together, asking for audience input and thoughts on how the model does or doesn’t reflect their experiences as librarians who are interested in Empirical Librarianship.
Scott Goldstein, Appalachian State University
Electronic surveys are a crucial tool for gathering data from a large number of students, faculty, and staff. The most common method of recruiting participants, emailing a hyperlink to a listserv, has some methodological limitations and, at many universities, requires inconvenient approval and acceptance of constraints if it is permitted at all. An alternative approach to recruitment was tried at Appalachian State University Libraries in which a random sample of email addresses was loaded directly into the Qualtrics survey software. The email addresses were "scraped" from the university's online public directory. This presentation will discuss the advantages of the individualized email functionality in modern survey software, how the email addresses were collected from the public directory, and future projects in which this recruitment technique will prove uniquely expedient.
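The recruitment technique described above can be sketched roughly as follows. The directory markup, addresses, and function names below are invented for illustration; a real directory page would differ, and the resulting list would be loaded into the survey tool's contact panel rather than printed.

```python
import random
import re

# Hypothetical example: extract addresses from a saved public-directory
# page, then draw a simple random sample for survey invitations.
# The HTML and addresses are invented for illustration.
directory_html = """
<ul class="directory">
  <li>Ada Lovelace &lt;alovelace@example.edu&gt;</li>
  <li>Grace Hopper &lt;ghopper@example.edu&gt;</li>
  <li>Alan Turing &lt;aturing@example.edu&gt;</li>
</ul>
"""

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def scrape_addresses(html: str) -> list[str]:
    """Collect the unique email addresses found anywhere in the page."""
    return sorted(set(EMAIL_RE.findall(html)))

def draw_sample(addresses: list[str], n: int, seed: int = 42) -> list[str]:
    """Simple random sample without replacement, reproducible via the seed."""
    rng = random.Random(seed)
    return rng.sample(addresses, min(n, len(addresses)))

addresses = scrape_addresses(directory_html)
sample = draw_sample(addresses, n=2)
print(sample)
```

Seeding the sampler keeps the draw reproducible, which matters if the sampling procedure has to be documented for an IRB protocol.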
Karen Stanley Grigg, University of North Carolina at Greensboro
In 2014, members of the UNCG Transfer Student Research Project submitted a proposal for further research on incoming transfer students, their information needs, and their information literacy skills to the Association of College & Research Libraries (ACRL)'s Assessment in Action: Academic Libraries and Student Success program. This 14-month program, funded by a National Leadership Demonstration Grant, "supports the design, implementation and evaluation of a program to strengthen the competencies of librarians in campus leadership and data-informed advocacy." The primary investigator was required to assemble a team of key stakeholders, both internal and from across campus, to participate in this research project. Additionally, those who were accepted into the program work in cohort teams, attend all ALA conferences, participate in online forums and programming, and give a presentation at ALA at the end of the program. UNCG's proposal was accepted, and two research projects are underway. The first project is a pre-test, intervention, and post-test assessment in Foundations for Learning (FFL) 250, a course taken by incoming adult students. The second project is a survey of second-year transfer students that will assess information literacy skills and compare those students who have had librarian interventions with those who have not. It will also compare the skills of students from a variety of transfer institutions, different majors, age ranges, and lengths of time elapsed between their last institution and UNCG. This presentation will describe the Assessment in Action program and how research and assessment skills are imparted, as well as the specific studies done at UNCG for these research projects.
Assessing the value of library instruction in an intro to the design process course: An Assessment in Action project
Nastasha Johnson, Purdue University Libraries
Purdue University Libraries assessed the impact of face-to-face and online library instruction, and of library collaboration, on technology students in an introductory design-process course. The purpose of the project was to ascertain whether information skills had improved over the course of the semester. Three bibliographic assignments were analyzed, along with three student reflections. A random selection of 160 students' Information Literacy (IL) skill development and performance was rated in three successive, bibliography-rich assignments over the semester, thereby charting student growth. These written assignments were also coupled with students' successive perceptions of their IL skills over the course. NVivo software was used for qualitative analysis of responses. Triangulation among themes in students' IL skill perception, demonstrated citation quality, and graded performance will be discussed. A comparison of the impact of online versus face-to-face IL instruction will also be examined, as the students viewed both asynchronous and synchronous library instruction sessions. Lessons learned regarding campus-wide research project coordination, development, and execution will be discussed, along with the results of the project and its future directions.
Nastasha Johnson, Purdue University
Purdue University librarians have created an undergraduate data literacy skills assessment tool. The purpose of the tool is to identify pre-existing data management skills in informal and formal settings alike, such as labs and classrooms. This knowledge will lead to instructional interventions customized to the needs of the students and faculty. The tool was created in response to disciplinary faculty requests for instructional modules on data management with content relevant to their students; the developers realized that little was known about the students' existing skills. The tool was developed by mapping data information literacy competencies to concrete tasks that students may perform in and out of school. The survey, built in Qualtrics, may be used as a pre- or post-assessment, or as a diagnostic assessment of students' skills. In this lightning talk we will share the process of creating the tool and its first iteration.
Teresa W. LePors, Elon University
The department of Library Research and Scholarly Services at Elon University's Belk Library exists to strengthen liaison librarians' interactions with faculty. Liaison librarians have shared current literature in the field and discussed best practices as a means of establishing benchmarks for effective liaison activities. To better inform liaison strategies, I analyzed emails received from, and sent to, faculty over the course of a year and then used concept mapping to identify patterns of interaction. I will discuss the methodology I used to analyze these emails and show how concept mapping helped me visualize faculty interactions. The results of this analysis will highlight the time and effort involved in liaison activities, identify activities that were particularly successful, and reveal opportunities to develop new methods of interaction.
Dawn Lowe-Wincentsen & Aja Bettencourt-McCarthy, Oregon Institute of Technology
How do undergraduates do research? How do we best support student research with our website, on-campus services, and subject guides? These were the questions driving a multilevel usability study. The first step was focus groups held at two university campuses. This was followed by writing up the research, making observations, and A/B testing comparing the old, midterm, and proposed layouts. Using this information, student demographics, and institutional knowledge, a small committee developed a persona to guide the development of new web pages and subject guides. This presentation will demonstrate the methods used in all three stages of the usability study: recruitment of students, development of the A/B formats, and pieces of the persona. We will also discuss the findings of the research, new subject guide development, website redesign, and how these could be applied in other settings.
Emma Oxford, Rollins College
Research in the sciences often requires copious citations. One way librarians can support this type of scientific inquiry is through instruction in the use of citation management systems (CMSes). Shockingly, many students and faculty still use unwieldy Excel documents or a teetering stack of PDFs to keep track of what they have read. There is a better way! Over the past year I have had great success providing instruction in how to use Mendeley, one of the many CMSes available and one that focuses heavily on the sciences. Students and faculty alike appreciate how much easier the writing process is with Mendeley, and the use of a robust CMS also makes them more aware of some key elements of information literacy. I will give a short presentation on how I integrate Mendeley instruction into my library information sessions and some of the aspects of it that students and faculty have found most helpful.
Nancy Poole, UNCG SILS
Come, join, and discuss! This round table will explore the current status of libraries and under-served communities within their service areas. With the increasing need to obtain grant funding, libraries often find themselves creating programs for traditionally under-served communities and groups. Successful programs depend on identifying these communities, getting to know them, determining what they really need (as opposed to having needs convenient for libraries assigned to them), and meeting them on their own turf - particularly in rural communities, and those lacking public or other transportation.
My focus is identifying and getting to know communities. I would like to present some preliminary findings from a project in which I used a variety of methods (interviews and questionnaires), and to gather research strategies others have used given funding and labor limitations and the limited centralized information available to libraries. Although my research focused on public libraries, I encourage community college, distance learning, and academic librarians interested in needs assessment with under-served groups to join and discuss how to identify and research these communities.
Investigating academic libraries via a cross-disciplinary survey research team: Multiple data sources, multiple impacts
Glenn Ellen Starr Stilling, Appalachian State University
This presentation will explore the methodology of building and deploying a large national survey of academic library directors and will present some of the survey’s preliminary findings. The main emphasis, however, will be lessons learned from working collaboratively with a cross-disciplinary research team that includes a faculty member as well as graduate and undergraduate students. It is possible that the topic of librarians working on research teams where the research topic is library-related, but the collaborators are from another discipline, is underrepresented in the practice as well as the literature of librarianship.
John Wiswell, Appalachian State University
I want to be able to gather data or do survey research on a representative sample of articles published by faculty at my mid-sized university and at a few similar institutions. In order to create a good sample, it is necessary to create a sampling frame of all, or almost all, published articles in a year. We are not an R1 institution, but this is still a large set of articles and a lot of work, especially if I want to include a few variables for creating subsets. This is even more problematic if I want to compare with one or more peer universities. This presentation will examine efficient ways to approach this problem and alternative sampling methods.
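Two sampling approaches of the kind mentioned above can be sketched as follows. The sampling frame, field names, and stratum key are invented for illustration; a real frame would be exported from bibliographic databases and deduplicated first.

```python
import random

# Hypothetical sampling frame of faculty articles; the records and the
# "dept" field are invented to illustrate the two sampling methods.
frame = [
    {"title": "Article A", "dept": "Biology"},
    {"title": "Article B", "dept": "Biology"},
    {"title": "Article C", "dept": "History"},
    {"title": "Article D", "dept": "History"},
    {"title": "Article E", "dept": "Nursing"},
]

def simple_random_sample(frame: list[dict], n: int, seed: int = 1) -> list[dict]:
    """Draw n records uniformly at random, without replacement."""
    return random.Random(seed).sample(frame, n)

def stratified_sample(frame: list[dict], per_stratum: int,
                      key: str = "dept", seed: int = 1) -> list[dict]:
    """Draw up to per_stratum records from each stratum, so that small
    departments are still represented in the sample."""
    rng = random.Random(seed)
    strata: dict[str, list[dict]] = {}
    for record in frame:
        strata.setdefault(record[key], []).append(record)
    sample = []
    for records in strata.values():
        sample.extend(rng.sample(records, min(per_stratum, len(records))))
    return sample
```

Stratifying by department (or by peer institution) is one way to guarantee coverage of the subsets of interest without enumerating every article by hand.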
Shenmeng Xu, UNC Chapel Hill School of Information and Library Science
As academics, students, science practitioners, and librarians become more involved in the digital scholarly communication context, more opportunities emerge to delve into their behavioral traces. Views, downloads, and social mentions of scholarly works have been considered proxies for their visibility, attention, and impact. Since the term "altmetrics" was coined, research has been conducted on new metrics quantifying traces of these online acts; however, it is insufficient to look only at numbers when interpreting impact. For instance, tweets about a research article could be recommending it, criticizing it, asking questions about it, or just perfunctorily retweeting it. Just like citation context analysis, which analyzes how and why scholarly works are cited, we also need to take a closer look at the context of altmetrics data sources. Focusing on interpreting the underlying meaning of altmetrics data, this talk reviews some of the earlier research and introduces the speaker's initial empirical explorations of context analysis of altmetrics data. Numbers are not everything. Altmetrics should measure what is sensible, as opposed to what is technically feasible.
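A toy version of the context coding described above might look like this. The cue words and example tweets are invented, and real context analysis would rely on hand coding or a trained classifier rather than simple keyword matching.

```python
# Illustrative-only sketch of context-coding social mentions of an
# article: a keyword heuristic sorting tweet text into rough categories.
# The cue lists and examples are invented for illustration.
CATEGORY_CUES = {
    "question": ("?", "how does", "why"),
    "criticism": ("flawed", "disagree", "overstates"),
    "recommendation": ("must read", "recommend", "great paper"),
}

def code_mention(text: str) -> str:
    """Return the first category whose cues appear in the mention."""
    lowered = text.lower()
    for category, cues in CATEGORY_CUES.items():
        if any(cue in lowered for cue in cues):
            return category
    return "bare mention"  # e.g. a perfunctory retweet of the title

mentions = [
    "Great paper on altmetrics, recommend it!",
    "How does this control for bot accounts?",
    "New article: Context analysis of altmetrics data",
]
codes = [code_mention(m) for m in mentions]
print(codes)
```

Even this crude heuristic makes the point of the talk concrete: two mentions with identical counts can carry very different meanings once their context is coded.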