
portal: Libraries and the Academy, Vol. 16, No. 3 (2016), pp. 619–648. Copyright © 2016 by Johns Hopkins University Press, Baltimore, MD 21218.

Assessment for One-Shot Library Instruction: A Conceptual Approach

Rui Wang

abstract: The purpose of this study is to explore a conceptual approach to assessment for one-shot library instruction. This study develops a new assessment instrument based on Carol Kuhlthau’s information search process (ISP) model. The new instrument focuses on measuring and identifying changes in student readiness to do research along three dimensions—cognitive thoughts, affective feelings, and research actions—before and after the students receive one-shot library instruction at the beginning of their research. The development of the new instrument includes a validation process supported by statistical analyses. The results demonstrate that the students improved their research readiness after receiving one-shot library instruction.

Introduction

Course-specific single class visits, so-called one-shot library instruction, are a long-established form of library instruction. One-shot library instruction can be traced back to the late nineteenth century. In 1880, Otis H. Robinson, a librarian and professor of mathematics at the University of Rochester in New York, pioneered course-related library instruction. Robinson succeeded in “getting at least half his faculty, a large part of the students, and sometimes even the president, into the library to help students use the collections effectively.”1 Robinson’s philosophy was that “successful library instruction depended on librarians with high standards of scholarship who were able to command respect in their separate academic communities.”2 Since then, academic librarians have made great efforts to develop course-specific, one-shot library instruction, which “has become mainstream, occupying an accepted, respected, and expected place in librarianship” since the 1980s.3 Recently, two sequential reports by the Primary Research Group, which publishes research reports and benchmarking studies for businesses, educational institutions, and other organizations, revealed a dramatic increase in the number of instruction sessions: a 20 percent rise between the fall semester of 2006 and the fall semester of 2007, and a 23 percent growth from 2012 to 2013.4 Although academic librarians have invented or reinvented various types of library instruction, including noncredit or credit-bearing, disciplinary-specific library courses5 and programmatic approach instruction,6 the most popular form is still one-shot library instruction. Many librarians consider one-shot instruction to be a quintessential component of library instruction.7 Students receive the instruction at a point of need or a teachable moment, and Janice Jaguszewski and Karen Williams noted, “The one-shot model is familiar and easy for faculty to incorporate into the syllabus.”8 As Julie Rabine and Catherine Cardwell observed, “Librarians devote substantial time and energy to these sessions.”9


However, the longevity of one-shot library instruction does not make its assessment easy. Few publications mention an assessment process within the framework of one-shot instruction,10 although assessment, in general, has surged and become widespread in academic libraries. Many librarians have asserted that evaluation of one-shot library instruction is “too complex and too time consuming.”11 The reason for the complexity is that most one-shot sessions “are designed to help students in specific courses with specific assignments.”12 Because of disciplinary disparities, the design, contents, and pedagogies of individual faculty members vary. The results of a longitudinal survey sent to teaching faculty four times over a long period reveal a large variation even within a single discipline, inasmuch as individual faculty members changed their assignments from year to year.13 Consequently, librarians must constantly adjust to changes for different course objectives covering different library resources and skills for one-shot library instruction. Determining what to teach in a one-shot session can be daunting.14 Moreover, as Rabine and Cardwell say, “The librarians conducting these sessions function in the role of guest lecturer, performing a service for the teaching faculty member. The librarian has little control over the assignment created by the instructor.”15

In addition, assessment for one-shot library instruction requires collaboration from faculty and students, and it can be challenging to get them to fully cooperate and participate. Librarians’ assessment for one-shot library instruction is never the top priority for faculty and students, and researchers have frequently reported difficulties in collecting data about one-shot library instruction. Students’ lack of motivation creates additional challenges, not only for data collection but also for maintaining assessment as an ongoing practice.

The largest challenge comes from the assessment tools for one-shot library instruction. These tools focus on assessing prepackaged library skills that are mostly irrelevant to the students’ research process for a specific course assignment. They fit with the Association of College and Research Libraries (ACRL) Information Literacy Competency Standards for Higher Education (the Standards), “which provides a limited, almost formulaic approach” and valorizes the “‘information literate student’ as a construct of imagined accomplishment.”16 Meanwhile, librarians have repeatedly questioned one-shot library instruction, concluding that it has little value or effectiveness, which raises “more troubling questions regarding the level of engagement possible in one-shot library instruction.”17 Developing effective and practical instruments that are relevant to the student research process and that demonstrate the value and impact of one-shot library instruction in a meaningful way is urgent for the whole community of academic libraries.




Therefore, the purpose of this study is to develop a new assessment instrument for one-shot library instruction. Rather than focusing on testing prepackaged library skills, the new instrument aims to identify how students become better prepared to conduct research for their course assignments after receiving one-shot library instruction. The new instrument is referred to as research readiness-focused assessment (RRFA), as opposed to the more traditional approach, which might be called library skills-focused assessment. The purpose of the RRFA is to measure changes in students’ cognitive thoughts, affective feelings, and behavioral actions from their responses before and after one-shot library instruction. This study hypothesizes that:

Hypothesis 1: After one-shot library instruction, students’ cognitive thinking related to conducting research for their course assignment would improve. Specifically, students would understand more about their course assignment, and they would be clearer about their research topic, where to look for sources to develop their topic, and how to conduct library research.

Hypothesis 2: After one-shot library instruction, students would feel that the course assignment was less challenging, and they would have more confidence about completing course assignments successfully.

Hypothesis 3: After one-shot library instruction, students would be more willing to ask the librarian for help over their social networks, would plan to start their research earlier, and would appreciate the one-shot library session more.

Literature Review

The library literature has some reports of assessment for one-shot library instruction, although the number of such reports is not as significant as those for other types of library instruction. A popular method is the pretest and posttest to measure changes in students’ learning of library resource use after receiving one-shot library instruction. For example, Donald Barclay employed that method in the 1990s to evaluate the ability of freshman writing students to find books and articles before and after a one-shot session. He created essentially the same two questions for his pretest and posttest, along with two satisfaction-survey questions. The results showed that the students learned the two library skills and were satisfied with the one-shot session.18 His assessment was reviewed as a realistic effort to evaluate library instruction in the real world “in terms of time and resource limitations.”19 Chris Portmann and Adrienne Roush reported that they administered pre- and posttests to a convenience sample of 38 students enrolled in a 200-level sociology course to measure the influence of one-shot library sessions. Their data analysis indicated that the library instruction had a statistically significant positive influence on library use but not on library skills. Later, the pretest/posttest instrument evolved into large-scale tools. For example, two librarians at Indiana University in Bloomington administered their pretest and posttest with 26 questions to 404 students in a freshman composition course. They concluded that there was no increase in scores from the pretest to the posttest regarding students’ learning of library skills and services.20




The pretest/posttest assessment tool for one-shot library instruction has also been linked to the ACRL Standards. In 2010, two librarians at Rider University in Lawrence Township, New Jersey, and at State University of New York (SUNY) College at Oneonta published their report on how they followed the Standards to develop 15 questions to measure changes in learning outcomes for information literacy concepts after a one-shot session. The 15 questions included asking students how to find books, read citations, access full text articles from databases, use interlibrary loan, and employ search strategies such as Boolean operators and truncation. The results showed “a positive” but not “dramatic” impact on these defined learning outcomes. The authors claimed, “The pre- and posttest instrument was able to show specific strengths and weaknesses in the students’ comprehension of IL concepts.”21 Kevin Walker and Michael Pearce reported another example of the Standards approach in 2014. Their pre- and posttest instrument, which included 24 items, was modeled on the Standards and designed to meet the goals of the first-year writing program. Their results showed significant improvement from pretest to posttest but no statistically significant difference between two instructional methods. They concluded, “One-shot instructional sessions likely do not fulfill the information literacy needs of students.”22

Some experts criticized the pretest/posttest method as merely measuring short-term recall on a prescribed set of skills.23 A reference team at the Hong Kong University of Science and Technology (HKUST) in China determined that the test approach would not serve the purpose of program assessment, because the library instruction program at their institution had different teaching objectives and covered different library skills.24 Therefore, HKUST chose to use a perception survey to collect 466 student responses from 23 class-specific library instruction sessions to measure the lasting value of one-shot library instruction. To trace any enduring impact of library instruction, they administered the survey four to eight weeks after the one-shot session. The survey contained 14 questions, which included asking the students what they learned from the one-shot session, what library skills they acquired, whether they continued to use these skills, and how confident they felt about doing library research. The survey then asked the students to rate the library session and the library instructor. The results showed that most attendees remained positive about the usefulness of the one-shot sessions and claimed that they retained the skills learned.

Another method that differs from the pretest/posttest method is the one-minute paper. This technique collects students’ instant feedback during the last few minutes of class time. Elizabeth Choinski and Michelle Emanuel reported their use of the one-minute paper method to assess one-shot library instruction for a biology class and a Spanish class at the University of Mississippi Libraries in Oxford.25 The one-minute paper consisted of four reflection questions: (1) What is the difference between an indexing and abstracting database like BasicBIOSIS—or the MLA (Modern Language Association) International Bibliography or Health and Psychosocial Instruments (HaPI)—and a full-text database like EBSCOHost? (2) Which kind of database was more useful to your research and why? (3) How can you tell if an article is from a scholarly source or from a popular magazine? and (4) How do you know if a website is suitable for academic work? Forty one-minute papers were selected from 307 papers and graded by two other librarians. The authors concluded that the one-minute paper method was an objective, quantifiable, and flexible assessment tool for one-shot library instruction.



Evaluating student papers or bibliographies is another alternative to the pretest/posttest method. Researchers have reported on this method to assess one-shot library instruction since the 1980s. Three experimental studies used student papers and bibliographies to identify effects of instruction and to rate different instructional methods of one-shot library instruction for sociology, writing, and engineering classes.26 Librarians or readers graded the student work based on criteria developed in each study. One of the studies found that the group who received one-shot instruction scored statistically higher than the group who were not exposed to one-shot library instruction. Another study, which aimed to identify different effects of two instructional methods, found that the cognitive strategy (identifying tools appropriate for a research topic) increased the quality of student bibliographies, compared to the traditional tool-specific approach. The third study found little difference between traditional lectures and “computer-enhanced instruction.” These reports all revealed that only a small portion of student work and one-minute papers could be processed and analyzed, because maintaining consistency among different graders and converting texts into scores were time-consuming and tedious. Another problem is that the evaluation cannot separate the combined influence of one-shot library instruction, professors’ assistance, and student self-learning, because student work is an end product of research.

This literature review also shows that assessments for one-shot library instruction by testing student library use and skills were mostly conducted in large freshman writing or general education classes because of the uniformity of class assignments and research activities. Little assessment was done at upper undergraduate or graduate levels because instruction for these classes is often unique for discipline-specific assignments. For example, a psychology professor and a librarian’s pilot project evaluated a “super-size bibliographic instruction” (two class periods) for a psychology research class,27 and two librarians measured social work students’ information literacy skills after a Social Welfare Research Seminar.28 Because these discipline-specific assessments demonstrate deeply embedded instruction for specific course assignments, it is hard to use these unique measurements to replicate these assessments. Owing to the dynamic nature of one-shot library instruction and the limitations of various instruments, almost all assessments of one-shot instruction end up as a one-shot assessment. As Portmann and Roush pointed out, “Without continual repeat applications, these one-study approaches do little if anything in the way of establishing an enduring assessment program for library instruction.”29




Methods

The Research Readiness-Focused Assessment (RRFA) Instrument

The literature review reveals some flaws in assessment for one-shot library instruction. Various evaluation tools rely heavily on measuring general library skills. Such prepackaged tools test student library skills as learning outcomes but have little relevance to the early progress of student research for a specific course assignment. The evaluation of student papers or bibliographies may reveal how students utilize library skills in their coursework at the end of their research, but the process is tedious and can only assess a small amount of student work. Another limitation is that it is hard to single out the impact of one-shot library instruction on student work from influences of nonlibrary instruction. As librarians Christopher Bober, Sonia Poulin, and Luigina Vileno described two decades ago,

Basic library skills testing does not measure the students’ actual learning. Correct answers can often be given through short term recall. Although students may have mastered basic library skills, such as how to read a call number, they may not have acquired the conceptual knowledge necessary to adequately conduct their own research.30

Although the three librarians admitted that higher-order cognitive skills were difficult to evaluate, they advocated a conceptual approach as opposed to a library skills-based approach. However, they did not specifically define what a conceptual approach was.

Based on the information search process (ISP) model developed by Kuhlthau in the 1980s, the new RRFA instrument created in this present study is a conceptual approach to assessment for one-shot library instruction. The concept is research readiness. The approach aims to capture changes in student research readiness before and after receiving one-shot library instruction. The RRFA instrument consists of a pre-survey and a post-survey, both containing essentially the same 11 question items (see Appendix A and B). For practical purposes, the survey instrument must be short. Barclay called his pretest and posttests “brutally simple”: they only had two free-response questions to test how students find books and periodicals by subject.31 However, a “brutally simple” instrument does not necessarily yield meaningful results. A meaningful assessment depends on the validity and reliability of the instrument.

Validity

Validity is the extent to which an instrument measures what it is intended to measure. “Validity has long been regarded as the touchstone of educational and psychological measurement and has therefore been defined repeatedly with varying nuances in the assessment literature.”32 However, to date, validity of assessment for one-shot library instruction has not been defined in the professional research literature. This study attempts to define validity of assessment for one-shot library instruction by referencing the established practice in the social sciences for the validation process. As Edward Carmines and Richard Zeller stressed, “One validates not the measuring instrument itself but the measuring instrument in relation to the purpose for which it is being used.”33 Many psychological testing specialists have argued that all validity is essentially construct validity.34 As Carmines and Zeller described it:



Construct validity is woven into the theoretical fabric of the social sciences, and is thus central to the measurement of abstract theoretical concepts. Indeed, as we will see, construct validation must be conceived of within a theoretical context. Fundamentally, construct validity is concerned with the extent to which a particular measure relates to other measures consistent with theoretically derived hypotheses concerning the concepts (or constructs) that are being measured.35

The validation in developing the RRFA instrument for this study included defining the domain of the assessment for one-shot library instruction, constructing a theoretical framework for the process of developing the instrument, elaborating the theoretically derived hypotheses, and applying the statistical techniques of reliability tests and factor analysis to determine the validity of the results.

The first step in developing an instrument is to define what is being measured, which is called domain definition.36 The domain definition must be traced back to the purpose of one-shot library instruction. Evan Farber, a guru of bibliographic instruction in the 1980s, had a simple and insightful comment about the purpose of library instruction: “We must remember what most students are interested in is not learning how to use the library, but only in learning how to use the library to help them with their classes.”37 Practically speaking, one-shot library instruction is generated by a faculty member for a particular course assignment. The library session is just one episode in the whole semester of a class. Usually, the professor uses one-shot library instruction to initiate student research for a course assignment. In a single class period, the librarian is expected to lead students into disciplinary resources relevant to their course assignment, demonstrate searching techniques, retrieve results relevant to their topics, boost students’ confidence, and take students directly into the beginning of the research process. Hence, “Instruction in bibliographic resources is useless unless wedded to a course project in which students are simultaneously acquiring subject knowledge and direction from the professor and bibliographic skills from the librarian.”38 The course assignment is the origin of the faculty member’s expectation and the students’ motivation to learn library use and skills. The purpose of one-shot instruction is to help students become better prepared to adequately conduct research for course assignments. In this sense, student research readiness is the domain definition of the assessment for one-shot library instruction.



The literature review for this study revealed that most assessments for one-shot library instruction focus on testing library skills. An argument for library skills-focused assessment is that it is natural to measure what has been taught because one-shot instruction covers use of library services and skills. However, this argument does not address the purpose of one-shot library instruction. As a reference librarian explained three decades ago, library skills are not equivalent to research in a disciplinary field:

One could teach any man-on-the-street about Library of Congress subject headings, catalog cards, the nature and structure of indexing/abstracting systems and other bibliographies, the mechanics of reading citations, and so on. That same man-on-the-street could then pass a library skills test with flying colors. But he would not then be qualified to do research in anthropology.39

Validity is also inseparable from theory. As Carmines and Zeller say, “Construct validation must be conceived of within a theoretical context.”40 The RRFA instrument to measure research readiness in this study was constructed based on Kuhlthau’s ISP model.41 The model incorporates the three dimensions of cognitive (thoughts), affective (feelings), and physical (research) actions and divides the information search process into six stages: (1) task initiation, (2) topic selection, (3) prefocus exploration, (4) focus formulation, (5) information collection, and (6) search closure and presentation. In the first three stages, information seekers’ actions frequently involve beginning the task, investigating general information and identifying topics, and experiencing feelings of uncertainty, apprehension, anxiety, and confusion. The focus formulation stage is a turning point of the search process. After the topic becomes more focused and personalized, there is increased confidence and a sense of clarity. Feelings of relief with a sense of satisfaction eventually appear with a successful search process and completion of the task.

Later, information science researchers simplified Kuhlthau’s six stages into three categories: (1) prefocus, (2) focus, and (3) postfocus.42 Prefocus is the most critical stage for students to initiate their engagement with the course assignment before arriving at the focused stage. Because students have the least self-assurance at the beginning about their topics and information needs, they are vulnerable to developing false focus and “the fallacy of the perfect thirty-item online search.”43 According to Kuhlthau’s observation in one of her series of studies, 50 percent of the subjects showed no evidence of ever achieving focus.44 One-shot library instruction takes place exactly at the critical prefocus stage.

The most common reason for faculty to request one-shot library instruction for their classes is for a course assignment. The assignment initiates a research task, triggers users’ emotions, and generates research actions. In the ISP model, the task stands at the center of all three dimensions. Information seekers’ thoughts, feelings, and actions change depending on how much of the task is completed. Because all needs and motivations converge at the task, it is predictable that if one-shot library instruction is wedded with the course assignment, it will help students be better prepared for completing the assignment. In other words, a successful library session will equip students with research skills and resources for their cognitive learning of course assignments, will reduce their emotional barriers, and will mobilize their research actions. Thus, encompassing the course assignments, research readiness is a concrete concept aligned with the three dimensions of the ISP.




To increase student research readiness at the prefocus stage, three components of one-shot library instruction derived from the ISP model are needed. First, students need to know how to access disciplinary research literature, use appropriate search strategies, and effectively find and retrieve relevant and significant sources at the beginning of the research. Ideally, the instruction should “get students into the primary literature as quickly as possible, for it is here that subject knowledge and scholarly guidance will be found.”45 Second, students need to be informed that the research process is nonlinear and might take them in a number of different directions.46 Students also need to be advised about the common problems they will encounter, and strategies and resources to handle these problems. Third, a one-shot session is not an isolated or stand-alone episode but a floating event to transfer the students’ previous library experience and skills to their present needs and to escort them into the next research stage. As Kuhlthau explained, “The active process of forming meaning from information is the task of the user in the ISP model.”47 The ISP model regards user experience as a personalized process. The more a personalized topic is developed, the more successful the search process becomes. Kuhlthau’s longitudinal study exhibits that, after the students became more experienced, they showed a sense of ownership by referring to the research process as “my process” and describing their research strategy as “this is the way I do it.”48 A one-shot session is an opportunity for a librarian to establish a connection to help students personalize their research in the later research stage. Therefore, the design of the RRFA instrument is to assemble the cognitive, affective, and behavioral dimensions integrating students’ previous experience and post-research activities into the survey questionnaire.

In the pre- and post-surveys, the cognitive dimension is measured by four items on how clearly students understand (1) their course assignment, (2) their research topics, (3) library research, and (4) where to look for sources for their assignment. The cognitive dimension was not stretched to specific course subjects or research topics in this study because they differ between individuals and classes. In fact, most one-shot library instruction sessions are discipline-specific, and they often come from disciplines in the social sciences. Psychology, sociology, and education are commonly listed as the top academic departments that request the most library instructional sessions, in addition to English (composition classes).49 Empirical evidence showed that undergraduate social science majors engaged in more information-seeking behaviors and were more likely to request assistance from reference librarians than physical science majors.50 The disciplinary nature of the social sciences is dynamic. According to the psychologist Anthony Biglan, “The physical sciences are characterized by the existence of paradigms that specify the appropriate problems for study and the appropriate methods to be used. It appears that the social sciences and nonscience areas such as history do not have such clearly delineated paradigms.”51 As librarians observed in the 1970s,




The social sciences are less stable, lack clear boundaries, and use imprecise terminology . . . New fields, such as futuristics or social policy, continually emerge in the social sciences, and disciplines overlap in such a way as to make the subject matter dependent for classification more on the author’s area of expertise than on the content.52

This is the main reason that the four items constructed for the cognitive dimension were not designated for specific subjects or research topics in the short instrument.

The affective dimension is defined by two items asking how greatly students feel challenged by their course assignment and how much confidence they have about completing the assignment successfully. The behavioral dimension includes five items asking when students will start their research and requesting that they rate the usefulness of the library session. The behavioral dimension also includes how students seek research help. In one of Kuhlthau’s series of studies, 80 percent of the 26 students responded that they “almost always” or “often” talk to others about their topic. However, Kuhlthau’s research found that most responses about student perceptions of getting help from librarians or teachers fell into the categories of “sometimes,” “seldom,” or “almost never.”53 Asking for help can be a sociological dimension in the information search process, although the sociological dimension is not defined in the ISP model. For this reason, the instrument places “asking for help” into the behavioral dimension. Table 1 displays how the coded items in the RRFA instrument define each dimension.

As Table 1 shows, two questions from the pre-survey investigate students’ previous experience and preparations for their topics and research. The pre-survey not only functions as a baseline for comparing changes in student research readiness but also provides librarians “the opportunity to design relevant and authentic activities.”54 Such an investigation should be an important part of one-shot library instruction, as Andrea Brooks noted:

Approaching an information literacy session without pre-test is similar to walking into a classroom on the first day of the semester. The students and the instructor are strangers to each other. In a semester-long class, an instructor will gain knowledge about his or her students and adopt lesson plans and approaches to fit the class needs. For librarians teaching a one-shot session, this is not an option. However, pre-testing helps make a connection with the class ahead of time.55

[Figure 1. Student readiness defined in the research readiness-focused assessment (RRFA) instrument.]


Table 1. The three dimensions of the information search process (ISP) model defined by question items

Affective feelings
  How challenging is your class assignment to you? (scaled QI–6)
  How confident are you in completing your class assignment successfully? (scaled QI–7)

Cognitive thoughts
  How clearly do you understand your class assignment? (scaled QI–2)
  How clear is your research topic in your mind? (scaled QI–3)
  How clearly do you know where you should look for information for your research? (scaled QI–4)
  How clearly do you understand the library research for your class assignment? (scaled QI–5)

Physical actions
  When searching databases, you can utilize search techniques to maximize relevant results without losing a research focus. Please indicate which of the following techniques can be used to narrow or expand the numbers of search results. (multiple choice)
  If you run into a problem with your research, who are you most likely to ask for help? (scaled QI–8, 9, 10)
  When are you going to start your research for your class assignment? (multiple choice, scaled QI–11)
  How useful was the library session for you? (scaled QI–1)
  Why do you think the library session was or was not useful to you? (text)

Previous experience/preparation
  Did you attend any library session/course to learn how to conduct library research? (category)
  What research topic do you have for your class assignment now? (text)
  What resources are you most likely to use for research of your course assignment? (text)

Post-connection
  Who are you most likely to ask for help? (scaled QI–8, 9, 10)
  When are you going to start your research for your class assignment? (multiple choice/scaled QI–11)

* QI means question item, followed by the numbers of variables.



A common challenge of one-shot library instruction is redundancy.56 An observation by Amalia Monroe-Gulick and Julie Petr illustrates such a problem. In their study, a student described typical repetitiveness:

I have had one, two, three, four. Four or five and I really think that one was more than enough because they would bring us in, and I mean it’s very important to know how to use the library, to know how to use the resources that are available to you. And not to offend anyone, it is an important time in the sun for librarians because they are so unappreciated in academia that, whenever they’re given an opportunity to—this has been my experience—to speak, to give the incredible training that they usually have had to do their jobs, they get very excited. And sometimes encourage professors to do it lots of times and once was very informative. The second time cleared up some questions that I had. Three and four made me again want to toss myself out the nearest window (Interview 4, 83).57

A pre-survey can help librarians adjust their instruction to avoid redundancy.

The post-survey contains essentially the same questions as the pre-survey. One unique open-ended question in the post-survey collects students’ comments about the one-shot session. The open-ended question58 and the question asking students to indicate their strategies or techniques and disciplinary sources are free-text answers or multiple choices (question 3 in Appendix A and question 4 in Appendix B). The rest of the items use a five-point Likert scale, asking respondents to express how much they agree or disagree with particular statements. Research readiness is hence constructed as a quantifiable concept to measure changes in students’ cognitive thoughts, affective feelings, and intended research actions for the course assignment after one-shot library instruction.

Reliability

Reliability is generally accepted as a necessary precondition for validity.59 Cronbach’s alpha coefficient is a measure of internal consistency, which means the degree to which all the items in the instrument measure the same construct. The alpha coefficient for the 11 items in the post-survey was 0.69, nearly at the acceptable level of 0.70 for internal consistency in the social sciences. Two items, “Challenge [of] assignment” and “Network,” decreased the alpha value. First, the reverse coding of some items might have caused artifactual (nonsubstantive) problems with unidimensionality, the extent to which the instrument measured a single attribute at a time, because negatively worded items require people to reframe the statement. For example, “I dislike my job” and “I don’t like my job” might be answered slightly differently. In this case, because “Challenge [of] assignment” was next to “Confidence,” the reverse scale of “Challenge [of] assignment” might have affected some respondents. Second, “Network” proved more extraneous than relevant to the internal consistency of the RRFA instrument. If “Network” is deleted, the alpha value increases to 0.724 (see Table 2).




The author also separately calculated the alpha coefficients for items of the cognitive, affective, and behavioral dimensions (see Table 3), even though this study was not intended to test or confirm the ISP model, and the scale was developed to measure the single attribute of research readiness. The Cronbach’s alpha coefficient of the four items for the cognitive dimension was 0.803, suggesting that the items had high internal consistency. The alpha of the two items for the affective dimension was 0.259, and the alpha of the five items for the behavioral dimension was 0.418, suggesting that the items for the two dimensions had low internal consistency. A way to improve the alpha value is to add more items to measure the same dimension. As indicated in Jum Nunnally’s Psychometric Theory (1978), “The primary way to make tests more reliable is to make them longer.”60 Future research might develop a longer three-factor RRFA instrument to measure the three dimensions of the ISP accurately and consistently. Nonetheless, two main strengths of the instrument in the present study are its parsimony and unidimensionality.

Table 2. Reliability statistics: Cronbach’s alpha* if item deleted

Item                       Cronbach’s alpha if item deleted
Usefulness                 0.663
Clarity assignment         0.634
Clarity topics             0.644
Clarity sources            0.643
Clarity library research   0.638
Challenge assignment       0.716
Confidence                 0.661
Ask professor              0.683
Ask librarian              0.663
Network                    0.724
When                       0.679

Total items                0.690

* Cronbach’s alpha is a measure of internal consistency, the degree to which all the items in the instrument measure the same construct.
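To make the reliability analysis concrete, here is a minimal Python sketch of how Cronbach’s alpha and the “alpha if item deleted” values in Table 2 can be computed. The data frame, its column names, and the random responses are hypothetical stand-ins for the study’s actual SPSS data, not a reproduction of it.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: (k / (k - 1)) * (1 - sum of item variances / variance of summed scale)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical stand-in for the 129 post-survey responses: one row per student,
# one column per five-point Likert item (column names follow Table 2).
rng = np.random.default_rng(0)
post = pd.DataFrame(
    rng.integers(1, 6, size=(129, 11)),
    columns=["usefulness", "clarity_assignment", "clarity_topics", "clarity_sources",
             "clarity_library_research", "challenge_assignment", "confidence",
             "ask_professor", "ask_librarian", "network", "when_start"],
)

print(f"alpha, all 11 items: {cronbach_alpha(post):.3f}")

# "Alpha if item deleted" (Table 2): recompute alpha on the remaining 10 items.
for column in post.columns:
    print(f"alpha without {column}: {cronbach_alpha(post.drop(columns=column)):.3f}")
```

The per-dimension alphas in Table 3 follow the same pattern, applied to the column subset for each dimension.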



Table 3. Reliability statistics: all items and items of each dimension of the research readiness-focused assessment (RRFA) instrument

Scale                  Cronbach’s alpha*   Number of items
Total items            0.690               11
Cognitive dimension    0.803               4
Affective dimension    0.259               2
Physical dimension     0.418               5

* Cronbach’s alpha is a measure of internal consistency, the degree to which all the items in the instrument measure the same construct.

[Figure 2. The research readiness-focused assessment (RRFA) factor scree plot, showing how much variation in the data each factor can explain.]


To judge the validity of the instrument, the author used a set of statistical procedures called factor analysis. Factor analysis can be useful for determining the validity of empirical measures, as Nunnally explained in 1978.61 Specifically, factor analysis can show that each of the components that informed the original scale development (in this case, the cognitive, affective, and behavioral dimensions) was reflected in the current measure. The purpose of this exploratory factor analysis was to demonstrate adequate coverage of the construct and to provide evidence for content validity, how well the test measured what it was intended to measure.62 The author used principal axis factoring to assess the dimensionality of the 11 items of the RRFA. Three factors (dimensions) were extracted, explaining 54.61 percent of the variance. This was decided based on eigenvalues that were greater than 1, cumulative variance, and inspection of the scree plot (see Figure 2).

The pattern matrix (see Table 4) presents the three loading factors. The first factor includes five items. Four of the five items (clarity library research, clarity sources, clarity assignment, and clarity topics) were designed for the cognitive dimension. Although the factor analysis did not extract all three factors for the RRFA instrument, at least it partially captured one of the three dimensions (cognitive thoughts), demonstrating potential for the new instrument to be improved in the future. Therefore, this study used all 11 items to gather the data, analyze, and interpret the results instead of focusing on each dimension.

Table 4. Pattern matrix

Item                       Factor 1   Factor 2   Factor 3
clarity library research   0.847
clarity sources            0.830
clarity assignment         0.744
clarity topics             0.557
confidence                 0.420
ask librarian                         0.844
usefulness                            0.409
network
challenge assignment                             0.523
when
ask professor

Extraction method: principal axis factoring. Rotation method: oblimin with Kaiser normalization. Rotation converged in eight iterations. Blank cells indicate loadings suppressed below the display threshold.
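The exploratory factor analysis reported above can be reproduced in outline with the open-source factor_analyzer package. This is a sketch under the assumption that `post` is the same hypothetical 129 × 11 response frame used in the reliability sketch earlier, not the study’s actual data.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer  # pip install factor-analyzer

# Principal axis factoring with an oblique (oblimin) rotation, mirroring the
# extraction and rotation methods reported under Table 4.
fa = FactorAnalyzer(n_factors=3, rotation="oblimin", method="principal")
fa.fit(post)

# Eigenvalues of the correlation matrix drive the "greater than 1" retention
# rule and the scree plot in Figure 2.
eigenvalues, _ = fa.get_eigenvalues()
print("eigenvalues:", eigenvalues.round(2))

# The rotated loadings play the role of the pattern matrix in Table 4;
# loadings under a conventional cutoff (0.4 here) are blanked for readability.
loadings = pd.DataFrame(fa.loadings_, index=post.columns,
                        columns=["factor_1", "factor_2", "factor_3"])
print(loadings.where(loadings.abs() >= 0.4).round(3))
```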


Participants

The participants were the 240 students who attended the 10 one-shot sessions in the spring semester of 2014. There were 184 responses to the pre-survey, for an overall response rate of 77 percent. The post-survey had 129 responses for a rate of 54 percent, with relatively high variations of response rates for individual questions. The 10 sessions were given to classes at the undergraduate level, including four social work classes, two sociology classes, two psychology classes, one anthropology class, and one English composition class. The instruction librarian normally conducts library training for English composition courses. In this case, however, the English composition class asked the social sciences librarian (the author of this present study) to conduct the one-shot session because the students needed to learn to search the iPOLL Databank, a database of public opinion surveys, to incorporate statistical information in writing their papers. The course assignments for the 10 classes varied but were all related to writing papers within the disciplines. The author of this study conducted all 10 one-shot sessions. She designed the instruction for each session differently depending on the course assignments and on professors’ expectations and requirements, but the teaching was always based on the three components mentioned in the previous section of this report on “Validity.”

Procedure

The Institutional Review Board (IRB) granted an exemption for this study. The consent letter and the Web links for surveys were e-mailed to the faculty members who administered the surveys to the students. The pre-survey was administered in the class before the one-shot session, and the post-survey was given to the students in the next class after the one-shot session. Timing is important to collect data efficiently. If the pre-survey comes too early, the faculty and the students might not be ready to pay attention to the upcoming library session. If the post-survey is administered too late, faculty and students might have moved on to their next course topic, and their memories from the one-shot session might have faded.

Two selling points encouraged faculty and student participation: each survey is short, taking only about five minutes to complete, and the information collected from the pre-surveys can also benefit faculty. Faculty members were interested in knowing how their students prepared to conduct research for their course assignment. No incentive (such as extra credit or gifts) was given to students for taking the surveys.

Data were collected from each course separately in the spring semester of 2014. Because the RRFA instrument focused on measuring the perceived changes of student cognitive thoughts, affective feelings, and research actions, and was not designed for a particular discipline or course, this study merged all data files from the 10 sessions into two related large groups: the student responses before and after the library instruction. Data were entered in SPSS, a software package used for statistical analysis. The author used the paired sample t-test to determine whether the changes in student research readiness were statistically significant. This statistical method is appropriate for pretest/posttest situations because it repeatedly measures the same subjects over the course of a treatment or time, and it examines the mean of individual differences of the paired samples. Paired samples mean that each individual’s responses in the pre-survey should have been matched to the same individual’s responses in the post-survey. However, this study could not pair individual responses due to a lack of respondents’ identifications. Librarians Kate Zoellner, Sue Samson, and Samantha Hines reported a similar problem in their article “Continuing Assessment of Library Instruction to Undergraduates” in 2008.63 Zoellner, Samson, and Hines went ahead with their test because they believed that “the sample does meet the assumption of normality, enabling generalizations across the population.” Although the sample size (pre-survey N = 184, post-survey N = 129) of the present study was not as large as that reported in 2008 (pretest N = 214, posttest N = 212), it seemed large enough. Because of the similarity to the three librarians’ study in 2008, this present study continued to run the paired sample t-test for analyzing the data collected in the spring of 2014. This study also used the sign test to check the results of the paired sample t-test. The sign test is an alternative to the paired sample t-test for detecting a difference between two populations; it requires few assumptions about normal distribution and suits variables measured on ordinal scales.64
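As a sketch of the two significance tests, the snippet below runs a paired t-test with SciPy and implements the sign test as an exact binomial test on the signs of the paired differences. The simulated ratings are placeholders; note that the study itself could not match pre- and post-responses by respondent, so the true pairing shown here is an assumption a replication would need to satisfy.

```python
import numpy as np
from scipy import stats

# Hypothetical paired five-point ratings for one item (e.g., "clarity of library
# research") from the same respondents before and after the one-shot session.
rng = np.random.default_rng(42)
pre = rng.integers(1, 6, size=129)
post = np.clip(pre + rng.integers(-1, 3, size=129), 1, 5)

# Paired sample t-test on the individual pre/post differences.
t_statistic, t_pvalue = stats.ttest_rel(post, pre)
print(f"paired t-test: t = {t_statistic:.3f}, p = {t_pvalue:.4f}")

# Sign test: drop ties, then ask whether increases and decreases are equally
# likely under the null hypothesis of no change.
differences = post - pre
increases = int((differences > 0).sum())
decreases = int((differences < 0).sum())
sign_result = stats.binomtest(min(increases, decreases), increases + decreases, p=0.5)
print(f"sign test: {increases} increases, {decreases} decreases, p = {sign_result.pvalue:.4f}")
```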




Results

After the data were merged into two large groups from the pre-survey and post-survey, the author retrieved and coded 11 variables (see Table 1). These variables were all measured on a five-point Likert scale, except QI–11, which had five multiple choices: (1) “two days before the assignment is due,” (2) “one week before the assignment is due,” (3) “two weeks before the assignment is due,” (4) “right after the library session,” and (5) “I have started it.” The text responses were manually scaled from the latest date to the earliest date as 1 to 5.
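For illustration, here is a minimal sketch of that ordinal recoding, assuming hypothetical raw answer strings that match the five choices listed above:

```python
# Ordinal recode for QI-11: latest start date -> 1, earliest -> 5.
WHEN_START_SCALE = {
    "two days before the assignment is due": 1,
    "one week before the assignment is due": 2,
    "two weeks before the assignment is due": 3,
    "right after the library session": 4,
    "I have started it": 5,
}

responses = ["I have started it", "one week before the assignment is due"]
print([WHEN_START_SCALE[answer] for answer in responses])  # -> [5, 2]
```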

Table 5 provides the results of the paired sample t-test and the sign test, which include the means of the pre- and post-surveys; z scores, a test statistic that measures the difference between an observed statistic and its hypothesized population parameter; and p values, probabilities that measure the evidence against the null hypothesis.65 With the confidence interval at 95 percent, this study used 0.05 as the threshold of statistical significance. All 11 variables showed positive changes in the expected directions. Eight items in the paired sample t-test and seven items in the sign test demonstrated statistically significant differences from the pre-survey to the post-survey. The two tests yielded almost identical results, except one item, QI–6, “Challenge [of] assignment.”



Research hypothesis 1—“After one-shot library instruction, students’ cognitive thinking would improve related to conducting research for their course assignment”—was supported because the means and medians of the four variables that measured cognitive thoughts showed statistically significant increases from the pre-survey to the post-survey. The mean and median of student feelings about the challenge of their course assignments dropped significantly in the post-survey. The mean and median of students’ confidence increased in the post-survey, but the increase was not large enough to be statistically significant, so the evidence only partly supported research hypothesis 2—“After one-shot library instruction, students would feel that the course assignment was less challenging, and they would have more confidence about completing course assignments successfully.” Similarly, the increases in the means and medians for QI–9, “Ask librarian,” and QI–11, “When,” were statistically significant, but the increase for QI–8, “Ask professor,” and the decrease for QI–10, “Network,” were not large enough to support research hypothesis 3—“After one-shot library instruction, students would be more willing to ask the librarian for help over their social networks, would plan to start their research earlier, and would appreciate the one-shot library session more.”

Table 5. Results of the paired sample t-test and sign test

                                      Paired sample t-test               Sign test
Variable                           Pre-survey   Post-survey  p value     Z        Asymp. Sig.
                                   (mean)       (mean)       (2-tailed)           (2-tailed)
Usefulness of instruction          3.67         4.15         0.000*      –2.854   0.004*
Clarity of course assignment       2.98         3.67         0.000*      –4.1     0.000*
Clarity of research topic          2.79         3.33         0.000*      –3.98    0.000*
Clarity for finding sources        2.94         4.01         0.000*      –5.536   0.000*
Clarity of library research        2.90         3.90         0.000*      –6.109   0.000*
Challenge of course assignment     3.11         2.86         0.049*      –1.735   0.083
Confidence to complete assignment  4.06         4.11         0.609       –1.575   0.115
Ask professor for help             4.24         4.29         0.671       –0.434   0.664
Ask librarian for help             3.16         3.95         0.000*      –3.733   0.000*
Ask network for help               3.17         2.98         0.254       –1.547   0.112
When start to research             3.30         3.92         0.000*      –5.414   0.000*

*p value < 0.05.



Discussion

By using the RRFA instrument, this study demonstrated overall improvement in students’ research readiness after they received one-shot library instruction at the beginning of their research. The improvements in the cognitive dimension were statistically significant: after one-shot library instruction, the students understood more about their assignments, and they were clearer about their research topics, about where to look for information to develop their topics, and about library research. The students also changed psychologically: they considered the course assignments less challenging, and they felt more confident about their ability to complete the assignments successfully. After one-shot instruction, the students showed statistically significant changes in their willingness to ask the librarian for help rather than rely on their social networks, in their likelihood of starting their research earlier, and in their appreciation of the usefulness of the library session. The largest increases in means and medians among the 11 variables, “Clarity for finding sources” (from 2.94 in the pre-survey to 4.01 in the post-survey) and “Clarity of library research” (from 2.90 to 3.90), indicate that what the students learned most in the one-shot sessions was how to find sources and conduct library research for their assignments.

Even though the changes in the three variables “Network” (decreased), “Ask professor” (increased), and “Confidence” (increased) were not statistically significant, these moderate changes still imply improvements in the students’ research readiness. How should we explain these moderate changes? “Network” was defined as social networks including classmates, friends, parents, and social media, and was used as a comparison for the changes in “Ask librarian.” Apparently, the students still felt more comfortable, and thought it more convenient, to ask the people around them, even though the likelihood that they would ask the librarian for help increased substantially after one-shot library instruction. The two variables “Confidence” and “Ask professor” had the highest mean scores in the pre-survey (4.06 and 4.24 out of 5), but they both had the smallest gains (0.05) in the post-survey. Seemingly, compared with other sources of help, the students remained most likely to work with their professors on their course assignments. The insignificant gain in student confidence echoes the findings of previous studies about a tendency toward student overconfidence. Two studies, reported by Amy Gustavson and H. Clark Nall in 2011 and by Melissa Gross and Don Latham in 2012, observed that first-year college students tended to rate their confidence levels higher than their information literacy skills.66 The present study showed such a tendency in undergraduate students across all levels: they were inclined to rate their confidence in successfully completing their course assignments high before starting their research.



Such overconfidence could result from false optimism: the students did not know what they did not know. In other words, they did not anticipate the difficulties they would encounter until they actually began their research. False optimism can worsen students’ psychological experience, leading to the apprehension, anxiety, confusion, and frustration that commonly occur at the prefocus stage. For this study, the students’ false optimism not only explains the smaller gain in confidence but also indicates that one-shot library instruction can help students prepare more realistically for potential research difficulties.

Although the RRFA instrument has flaws, this study shows that it measures and identifies the immediate improvement in student research readiness after one-shot instruction in the critical prefocus research stage. The RRFA’s success comes mostly from its defined assessment domain, research readiness, which is theoretically supported by the ISP model and by the validation process in the present study, so that research readiness is quantifiable and interpretable. The time and effort needed to analyze and interpret the data are correspondingly shortened and simplified. However, the investigator could not quantify all items in the RRFA instrument. Two items were originally designed to identify the learning outcomes of using search strategies and disciplinary sources for the behavioral dimension of the ISP. These two items were:

1. When searching databases, utilizing search techniques can maximize relevant results without losing a research focus. Please indicate which of the following techniques can be used to narrow or expand the number of search results.

2. List a database that indexes [disciplinary] journal articles.

Because these items called for multiple-choice and written responses, they were difficult to quantify and so were omitted. Without the learning outcomes for using search strategies and disciplinary resources, the instrument’s ability to measure the behavioral dimension of the RRFA was weakened.

The effort to make the surveys short paid off. Most students took about five to ten minutes to complete each questionnaire. The shortened instrument evidently attracted more participants and contributed to higher response rates (77 percent in the pre-survey and 54 percent in the post-survey). With so few items, however, it is not possible to fully measure all three dimensions of the RRFA. Nevertheless, a short instrument can make assessment for one-shot library instruction sustainable.

In this study, the immediate impact of one-shot library instruction on student research readiness was largely isolated from the possible outside influences of professors’ teaching and students’ self-directed learning by controlling the timing of the surveys. The pre-survey was administered in the class prior to the library session, while the post-survey was given in the next class soon after the one-shot session. Administering surveys twice to the same group within a short time is challenging because assessment is “the last part of the teaching cycle and the one that no one has time for.”67 The timing of the surveys in two class sessions depended on the faculty’s support. Collecting data in a timely fashion also posed a challenge for the students: according to the faculty, some students felt they were taking the same survey twice because of the matching items in the two questionnaires. One way to minimize the influence of the pre-survey on the post-survey is to change the order and wording of the question items.


The number of respondents in the post-survey (N = 129) was smaller than that in the pre-survey (N = 184). The difficulty of securing professors’ help to administer two surveys within a short period, together with the similarity of the two questionnaires, was likely the major cause of the drop in post-survey respondents.

Although the RRFA used timing to limit the impact of professors’ assistance and student self-learning, other factors might jointly influence or intervene in students’ learning in the one-shot session—for example, the students’ previous library experience. As mentioned previously, the pre-survey included this question: “Did you attend any library session/course to learn how to conduct library research?” The purpose of this question was, to some extent, to avoid redundancies by learning how many students had received library instruction previously. A total of 193 students answered this question, of whom 112 (58 percent) had previous library training in a range of courses, including a credit-bearing library course, English composition, social sciences, sciences, and humanities. Eighty-one students (42 percent) had no such exposure. It would thus be possible in future work to identify the relationship between students’ previous library experience and their research readiness. The results of this study, however, suggest that one-shot library instruction can improve students’ research readiness across both populations.

There are various ways to refine the RRFA instrument. For example, after accumulating and benchmarking data over time, the RRFA instrument could be reduced to a post-only survey. The post-only survey could add more open-ended questions to collect students’ written responses; such texts would provide richer data for defining and analyzing research readiness qualitatively. Furthermore, the RRFA items could be varied for different course assignments, and new variables could be developed that better measure the three dimensions.

Assessment is an important part of instruction because it both guides and advocates for instruction. The newly developed RRFA instrument is not a perfect tool, but it opens many possibilities for applying the conceptual approach to assessment of one-shot library instruction. The concept of research readiness fits well with the threshold concepts described in the Framework for Information Literacy for Higher Education (Framework). The fit is not as good with the older Standards, which ignore “the vital aspect of attitudes, emotion.”68 As Megan Oakleaf observed, “At first glance, [Jan] Meyer and [Ray] Land [originators of the threshold concept] do not appear to support pedagogy or assessment based on learning outcomes . . . because, they say, it’s impossible to adequately describe a learning goal to students who haven’t yet achieved that goal.”69 Oakleaf also pointed out that Meyer and Land nevertheless recognized a need for assessment. In fact, they anticipated what they call



new modes of mapping, representing and forming estimations of students’ conceptual formation . . . a rich feedback environment offered at the point of conceptual difficulty (“stuckness,” the liminal state) as well as in the pre-, post- and subliminal states . . . a more nuanced discourse to clarify variation and experience and achievement through the various stages of the liminal journey . . . the possibility of an ontological (as well as conceptual) dimension of assessment.70

Oakleaf thus concluded:

In fact, threshold concepts are very well suited to learning outcomes assessment, as long as the assessments permit the use of authentic assessment approaches, provide useful feedback to students to help them over the “stuck places,” emphasize individual variation in the journey that students travel to achieve them, recognize that learners may redefine their sense of self.71

In this sense, the RRFA, which measures student research readiness in the liminal, or transitional, state of the students’ nonlinear research process, can be considered a departure from the Standards and a move in the direction of the Framework in the assessment of one-shot library instruction.

Acknowledgments

The author offers deepest thanks to Kim O’Brien, professor of psychology at Central Michigan University in Mount Pleasant, for her guidance on and review of the statistical methods in this report; to Priscilla Seaman, reference and instruction librarian at the University of Tennessee at Chattanooga; and to Katherine Mason, digital resources librarian, and Elizabeth Macleod, professor emerita and music librarian, both at Central Michigan University, for their thorough reviews and proofreading.

Rui Wang is a social sciences librarian and associate professor at Central Michigan University Libraries in Mount Pleasant; she may be reached by e-mail at: [email protected].


Appendix 1


Appendix 2


Notes

1. Deborah L. Lockwood, Library Instruction: A Bibliography (Westport, CT: Greenwood, 1979), 8.

2. Ibid., 7.

3. Lynne M. Martin and Trudi E. Jacobson, “Reflections on Maturity,” introduction to “Library Instruction Revisited: Bibliographic Instruction Comes of Age,” Reference Librarian 24, 51–52 (September 1995): 5–13.

4. Primary Research Group, College Information Literacy Efforts Benchmarks (New York: Primary Research Group, 2014), 38; Information Literacy Efforts Benchmarks, 2013 Edition (New York: Primary Research Group, 2014), 30.

5. Sarah Polkinghorne and Shauna Wilton, “Research Is a Verb: Exploring a New Information Literacy–Embedded Undergraduate Research Methods Course,” Canadian Journal of Information and Library Science 34, 4 (December 2010): 457–73.

6. Joyce Lindstrom and Diana D. Shonrock, “Faculty-Librarian Collaboration to Achieve Integration of Information Literacy,” Reference & User Services Quarterly 46, 1 (Fall 2006): 18–23.

7. Megan Oakleaf, Steven Hoover, Beth S. Woodard, Jennifer Corbin, Randy Hensley, Diana K. Wakimoto, Christopher V. Hollister, Debra Gilchrist, Michelle Millet, and Patricia A. Iannuzzi, “Notes from the Field: 10 Short Lessons on One-Shot Instruction,” Communications in Information Literacy 6, 1 (2012): 5–23.

8. Janice M. Jaguszewski and Karen Williams, New Roles for New Times: Transforming Liaison Roles in Research Libraries (Washington, DC: Association of Research Libraries, 2013), 7.

9. Julie Rabine and Catherine Cardwell, “Start Making Sense: Practical Approaches to Outcomes Assessment for Libraries,” Research Strategies 17, 4 (2000): 319–35.

10. Jacalyn E. Bryan and Elana Karshmer, “Assessment in the One-Shot Session: Using Pre- and Post-Tests to Measure Innovative Instructional Strategies among First-Year Students,” College & Research Libraries 74, 6 (November 2013): 574–86.

11. Donald A. Barclay, “Evaluating Library Instruction: Doing the Best You Can with What You Have,” RQ 33, 2 (Winter 1993): 195–202; Rabine and Cardwell, “Start Making Sense.”

12. Rabine and Cardwell, “Start Making Sense.”

13. Rachel Applegate, “Faculty Information Assignments: A Longitudinal Examination of Variations in Survey Results,” Journal of Academic Librarianship 32, 4 (July 2006): 355–63.

14. Bonnie J. M. Swoger, “Closing the Assessment Loop Using Pre- and Post-Assessment,” Reference Services Review 39, 2 (May 2011): 244–59.

15. Rabine and Cardwell, “Start Making Sense.”

16. Association of College and Research Libraries (ACRL), Framework for Information Literacy for Higher Education, draft 1, part 1 (February 2014), http://acrl.ala.org/ilstandards/wp-content/uploads/2014/02/Framework-for-IL-for-HE-Draft-1-Part-1.pdf.

17. Nancy Wootton Colborn and Rosanne M. Cordell, “Moving from Subjective to Objective Assessment of Your Instruction Program,” Reference Services Review 26, 3–4 (1998): 125–37; Chris A. Portmann and Adrienne Julius Roush, “Assessing the Effects of Library Instruction,” Journal of Academic Librarianship 30, 6 (November 2004): 461–65; Kevin W. Walker and Michael Pearce, “Student Engagement in One-Shot Library Instruction,” Journal of Academic Librarianship 40, 3–4 (May 2014): 281–90.

18. Barclay, “Evaluating Library Instruction.”

19. Mary Reichel, “Library Literacy,” RQ 33, 2 (Winter 1993): 195.

20. Colborn and Cordell, “Moving from Subjective to Objective Assessment of Your Instruction Program.”

21. Ma Lei Hsieh and Hugh A. Holden, “The Effectiveness of a University’s Single-Session Information Literacy Instruction,” Reference Services Review 38, 3 (2010): 458–73.

22. Walker and Pearce, “Student Engagement in One-Shot Library Instruction.”

23. Richard Hume Werking, “Evaluating Bibliographic Education: A Review and Critique,” Library Trends 29, 1 (1980): 153–72.


24. Gabrielle Wong, Diana Chan, and Sam Chu, “Assessing the Enduring Impact of Library Instruction Programs,” Journal of Academic Librarianship 32, 4 (July 2006): 384–95.

25. Elizabeth Choinski and Michelle Emanuel, “The One-Minute Paper and the One-Hour Class: Outcomes Assessment for One-Shot Library Instruction,” Reference Services Review 34, 1 (February 2006): 148–55.

26. Amy Dykeman and Barbara King, “Term Paper Analysis: A Proposal for Evaluating Bibliographic Instruction,” Research Strategies 1, 1 (1983): 14–21; David F. Kohl and Lizabeth A. Wilson, “Effectiveness of Course-Integrated Bibliographic Instruction in Improving Coursework,” RQ 26, 2 (January 1986): 206–11; Linda G. Ackerson and Virginia E. Young, “Evaluating the Impact of Library Instruction Methods on the Quality of Student Research,” Research Strategies 12, 3 (1994): 132–44.

27. Alison Paglia and Annie Donahue, “Collaboration Works: Integrating Information Competencies into the Psychology Curricula,” Reference Services Review 31, 4 (2003): 320–28.

28. Mary Jane Brustman and Deborah Bernnard, “Information Literacy for Social Workers: University at Albany Libraries Prepare MSW Students for Research and Practice,” Communications in Information Literacy 1, 2 (2007): 89–101.

29. Portmann and Roush, “Assessing the Effects of Library Instruction.”

30. Christopher Bober, Sonia Poulin, and Luigina Vileno, “Evaluating Library Instruction in Academic Libraries: A Critical Review of the Literature, 1980–1993,” Reference Librarian 24, 51–52 (1995): 53–71.

31. Barclay, “Evaluating Library Instruction.”

32. David B. Sawyer, Fundamental Aspects of Interpreter Education: Curriculum and Assessment (Amsterdam, Neth.: John Benjamins, 2004), 95.

33. Edward G. Carmines and Richard A. Zeller, Reliability and Validity Assessment (Beverly Hills, CA: Sage, 1979).

34. Stephen G. Sireci and Tia Sukin, “Test Validity,” in APA Handbook of Testing and Assessment in Psychology, vol. 1, Test Theory and Testing and Assessment in Industrial and Organizational Psychology, ed. Kurt F. Geisinger, Bruce A. Bracken, Janet F. Carlson, Jo-Ida C. Hansen, Nathan R. Kuncel, Steven P. Reise, and Michael C. Rodriguez (Washington, DC: American Psychological Association, 2013), 64.

35. Carmines and Zeller, Reliability and Validity Assessment, 23.

36. Ibid., 65.

37. Larry L. Hardesty, Jamie Hastreiter, David Henderson, and Evan Ira Farber, Bibliographic Instruction in Practice: A Tribute to the Legacy of Evan Ira Farber (Ann Arbor, MI: Pierian, 1993), 5.

38. Stephen K. Stoan, “Research and Library Skills: An Analysis and Interpretation,” College & Research Libraries 45, 2 (1984): 99–109.

39. Ibid.

40. Carmines and Zeller, Reliability and Validity Assessment, 23.

41. Carol Collier Kuhlthau, “Inside the Search Process: Information Seeking from the User’s Perspective,” Journal of the American Society for Information Science 42, 5 (June 1991): 361–71.

42. Lynn Kennedy, Charles Cole, and Susan Carter, “The False Focus in Online Searching: The Particular Case of Undergraduates Seeking Information for Course Assignments in the Humanities and Social Sciences,” Reference & User Services Quarterly 38, 3 (April 1999): 267–73.

43. Ibid. False focus is a situation in which a user settles on a focus too soon, in the prefocus stage. As Kennedy, Cole, and Carter described, if the student tries to zero in on a topic to avoid information overload, or if a librarian pushes the student toward a topic inappropriately or too soon, “the student might try to cut out the prefocus phase of the search process entirely.” Marcia J. Bates, “The Fallacy of the Perfect Thirty-Item Online Search,” RQ 24, 1 (1984): 43–50. As Bates analyzed, the fallacy occurs when one assumes that he or she can produce “a high-quality thirty-item output.” The searcher may inappropriately modify the “wrong” size output or stop immediately with the “right” size output.

44. Kuhlthau, “Inside the Search Process.”


45. Stoan, “Research and Library Skills.”

46. A number of scholars have observed the nonlinear research process. For example, Stoan, in “Research and Library Skills,” 102, summarized what University of Bath investigators in the United Kingdom discovered: “The research process is an extremely complex and personal one that cannot easily be defined or fit into a mechanistic search strategy.” Stoan cited the finding of Maurice Line, the investigator at the University of Bath:

The chronological order of each stage cannot be predetermined, for they vary with the individual researcher’s preference for organizing his work. Research is a process that does not allow for too formal organization . . . Serendipity plays an important role in research, and information that a researcher comes across merely by chance may cause him to channel his work along new lines.

Stoan also referred to the behavioral science philosopher Abraham Kaplan’s analysis to describe researchers’ nonlinear research process:

A new idea generated from one source, an original insight springing from another, may alter the direction of the quest and the kind of material being sought. What is needed next will be dictated by the intellectual evolution of the researcher up to that point. The final product of a research project may even be very different from what the investigator envisioned at the outset. In these circumstances, there can be no pat number of predetermined sources that the researcher will consult.

47. Kuhlthau, “Inside the Search Process.”

48. Ibid.

49. Primary Research Group, Information Literacy Efforts Benchmarks, 39.

50. Ethelene Whitmire, “Disciplinary Differences and Undergraduates’ Information-Seeking Behavior,” Journal of the American Society for Information Science and Technology 53, 8 (June 2002): 631–38.

51. Anthony Biglan, “The Characteristics of Subject Matter in Different Academic Areas,” Journal of Applied Psychology 57, 3 (1973): 195–203, doi:http://dx.doi.org/10.1037/h0034701.

52. Patricia Stenstrom and Ruth B. McBride, “Serial Use by Social Science Faculty: A Survey,” College & Research Libraries 40, 5 (1979): 426–31.

53. Carol Collier Kuhlthau, “Perceptions of the Information Search Process in Libraries: A Study of Changes from High School through College,” Information Processing & Management 24, 4 (January 1988): 419–27.

54. Andrea Brooks, “Maximizing One-Shot Impact: Using Pre-Test Responses in the Information Literacy Classroom,” Southeastern Librarian 61, 1 (Spring 2013): 41–43.

55. Ibid.

56. Evan Farber, “Bibliographic Instruction at Earlham College,” in Hardesty, Hastreiter, Henderson, and Farber, Bibliographic Instruction in Practice, 6.

57. Amalia Monroe-Gulick and Julie Petr, “Incoming Graduate Students in the Social Sciences: How Much Do They Really Know about Library Research?” portal: Libraries and the Academy 12, 3 (July 2012): 315–35.

58. The open-ended question is not included in Appendix 1 or Appendix 2 because it was not used in the research.

59. Kurt F. Geisinger, “Reliability,” in APA Handbook of Testing and Assessment in Psychology, ed. Geisinger, Bracken, Carlson, Hansen, Kuncel, Reise, and Rodriguez, 21–42.

60. Jum C. Nunnally, Psychometric Theory, 2nd ed., McGraw-Hill Series in Psychology (New York: McGraw-Hill, 1978), 243.

61. Ibid., 62–63.

62. Stephen N. Haynes, David C. S. Richard, and Edward S. Kubany, “Content Validity in Psychological Assessment: A Functional Approach to Concepts and Methods,” Psychological Assessment 7, 3 (1995): 238–47.

63. Kate Zoellner, Sue Samson, and Samantha Hines, “Continuing Assessment of Library Instruction to Undergraduates: A General Education Course Survey Research Project,” College & Research Libraries 69, 4 (2008): 370–83.


64. Yadolah Dodge, The Concise Encyclopedia of Statistics (New York: Springer, 2008), 376–77, 571–74.

65. Minitab.com, Minitab 17 Support, http://support.minitab.com/en-us/minitab/17/.

66. Amy Gustavson and H. Clark Nall, “Freshman Overconfidence and Library Research Skills: A Troubling Relationship?” College & Undergraduate Libraries 18, 4 (2011): 291–306; Melissa Gross and Don Latham, “What’s Skill Got to Do with It? Information Literacy Skills and Self-Views of Ability among First-Year College Students,” Journal of the American Society for Information Science and Technology 63, 3 (2012): 574–83.

67. Oakleaf, Hoover, Woodard, Corbin, Hensley, Wakimoto, Hollister, Gilchrist, Millet, and Iannuzzi, “Notes from the Field.”

68. ACRL, Framework for Information Literacy for Higher Education, draft 1, part 1, 3.

69. Megan Oakleaf, “A Roadmap for Assessing Student Learning Using the New Framework for Information Literacy for Higher Education,” Journal of Academic Librarianship 40, 5 (2014): 510–14, http://meganoakleaf.info/framework.pdf.

70. Ray Land and Jan H. F. Meyer, “Threshold Concepts and Troublesome Knowledge (5): Dynamics of Assessment,” in Threshold Concepts and Transformational Learning, ed. Jan H. F. Meyer, Ray Land, and Caroline Baillie (Rotterdam, Neth.: Sense, 2010), 76–77.

71. Oakleaf, “A Roadmap for Assessing Student Learning Using the New Framework for Information Literacy for Higher Education.”
