Anderson, R. P., & Wilson, S. P. (2009). Quantifying the effectiveness of interactive tutorials in medical library instruction. Medical Reference Services Quarterly, 28(1), 10–21. doi:10.1080/02763860802615815
Anderson and Wilson aim to fill a gap in the research by studying not only whether active versus passive learning tutorials affect student learning but also whether users prefer active or passive tutorials. The study tested only three learning outcomes, and for two of the three there was no statistically significant difference between the performance of students taking the active versus the passive tutorials; however, students overwhelmingly voiced a preference for active-learning tutorials.
Bobish, G. (2011). Participation and pedagogy: Connecting the social web to ACRL learning outcomes. Journal of Academic Librarianship, 37(1), 54–63. doi:10.1016/j.acalib.2010.10.007
After a brief explanation of the link between constructivism and information literacy, Bobish relates Web 2.0 tools to constructivism and touches on how these tools and this approach can enrich the teaching and learning of information literacy skills. He then lists each of ACRL’s Information Literacy Competency Standards for Higher Education along with each standard’s performance indicators and outcomes. Each outcome is matched with one or more of five Web 2.0 tools (blogs, media sharing, social bookmarking, social networking, and wikis), along with suggestions for incorporating the suggested tools into lessons.
Castonguay, R. (2008). Assessing library instruction through web usability and vocabulary studies. Journal of Web Librarianship, 2, 429–455. doi:10.1080/19322900802190753
Castonguay reports on a series of Web usability studies of a community college library’s web presence. By adopting techniques for task design from the field of human-computer interaction/usability studies, the author establishes correlations between the level of students’ previous library instruction and their performance on usability tasks. The study also demonstrates a positive correlation between library instruction and successful completion of information retrieval tasks. It provides ideas for assessment project design in which usability tasks are used to understand relationships between research performance and varied approaches to library instruction.
Friehs, C. G., & Craig, C. L. (2008). Assessing the effectiveness of online library instruction with finance students. Journal of Web Librarianship, 2(4), 493–509. doi:10.1080/19322900802484438
The authors use survey methods to determine whether university finance students find Camtasia-produced tutorials for online databases useful. This straightforward study establishes that such tutorials are a valuable and useful form of library instruction. Correlations are found between students’ goals and the perceived relevance of the resources. The article communicates a clear research design rooted in widely accepted survey methods and is a model for designing surveys that implement Likert scales and correlate results with biographical data.
Germek, G. (2012). Empowered library eLearning: Capturing assessment and reporting with ease, efficiency, and effectiveness. Reference Services Review, 40(1), 90–102. doi:10.1108/00907321211203658
Web-based library instruction is increasingly delivered through online tutorials, but in many cases their effectiveness is difficult to assess unless they are housed in a learning management system that offers instant scoring and electronic data archiving. Germek describes how to create and evaluate online tutorials with Adobe Captivate and Connect. Following his steps, the Adobe tools become a library eLearning platform that can be evaluated and updated based on usage reporting. This article is a valuable guide to building and assessing web-based instruction through Adobe tutorials.
Hensley, M. K., & Miller, R. E. (2010). Listening from a distance: A survey of University of Illinois distance learners and its implications for meaningful instruction. Journal of Library Administration, 50(5–6), 670–683. doi:10.1080/01930826.2010.488946
In a 2009 survey at the University of Illinois at Urbana-Champaign, distance learners were asked about their perceptions and use of the library. Hensley and Miller describe how the survey was designed and written to gain meaningful student feedback, and they explain the survey results and how those results were used to evaluate library instruction. The article gives helpful guidelines for soliciting meaningful evaluations from students. Librarians teaching online students will also benefit from the survey results, which provide a current snapshot of distance learners’ impressions of the library.
Hillyer, N., Maring, M., & Richards, D. (2008). Assessment in small bytes: Creating an online instrument to measure information literacy skills. In T. P. Mackey, & T. E. Jacobson (Eds.), Using technology to teach information literacy (pp. 165–192). New York: Neal-Schuman Publishers.
The University of Nebraska–Omaha library, in collaboration with the university’s English department, designed an online assessment instrument delivered through Blackboard to measure students’ mastery of information literacy skills before and after face-to-face library instruction delivered to first-year English classes. Students were asked to complete the post-instruction questionnaire within two weeks of the last library instruction session. The questions in the pre- and post-instruction questionnaires were identical and were mapped to standards 1, 2, 3, and 5 of ACRL’s Information Literacy Competency Standards for Higher Education. Post-instruction scores were higher than pre-instruction scores, and the authors found that questions mapped to concepts taught with active learning exercises showed the largest increase in correct answers. The study’s methodology and challenges are well explained, but those looking for ways to assess online learning rather than face-to-face learning may find only limited parts of the study helpful.
Hufford, J. R., & Paskel, A. K. (2010). Pre- and postassessment surveys for the distance section of LIBR 1100, Introduction to Library Research. Journal of Library Administration, 50, 693–711. doi:10.1080/01930826.2010.488956
In the fall of 2009, librarians at Texas Tech University Libraries decided to undertake pre- and post-assessment surveys of their online section of LIBR 1100, not only to determine objectively whether their students were indeed learning what was taught but also to share their assessment experience in an area lacking in the professional literature. Findings from the assessment of the first cohort in 2009 were strongly positive, indicating that the online students improved significantly on their initial assessment scores. Lessons learned include the importance of meeting learning outcomes through assessment and how assessing students helps instructors structure courses so that learning can be continually improved each time the course is taught.
Kontos, F., & Henkel, H. (2008). Live instruction for distance students: Development of synchronous online workshops. Public Services Quarterly, 4(1), 1–14. doi:10.1080/15228950802135657
This case study describes the use of Blackboard Wimba to produce synchronous library instruction sessions. The authors report on several considerations that other librarians may find relevant when working to replicate similar designs, among them the expectation of low student attendance rates, which may be tempered by increased participation by faculty. The study provides evidence that there is a market, particularly among younger students, for synchronous online approaches to instruction.
Lavoie, D., Rosman, A., & Sharma, S. (2011). Information literacy by design: Recalibrating graduate professional asynchronous online programs. In T. P. Mackey, & T. E. Jacobson (Eds.), Teaching information literacy online (pp. 133–158). New York: Neal-Schuman Publishers.
At the University of Connecticut, librarians turned the creation of an online master of science in accounting degree into an opportunity to fully integrate information literacy into the program’s curriculum. A team consisting of an instructional designer, a librarian, a professor, and a media specialist approached this challenge using a constructivist model of learning focused on student mastery of the process rather than on outcomes. The team named their approach the Resource-Enriched Learning Model (RELM). RELM focuses more on faculty development and less on the course itself; the instructional designer acts almost as a mentor, helping faculty learn and strengthen their own instructional design skills. This collaborative approach allows information literacy to be integrated throughout the curriculum, so skills can be scaffolded not only over several assignments in a course but also across the entire program. The case study is unique in its focus on the special information literacy needs of graduate students and is a good description of an ideal structure for integrating course design and information literacy.
McClure, R., & Cooke, R. (2011). The search for the Skunk Ape: Studying the impact of an online information literacy tutorial on student writing. Journal of Information Literacy, 5(2), 26–45. doi:10.11645/5.2.1638
McClure and Cooke investigate the impact of an online information literacy tutorial on English Composition students’ ability to select and evaluate sources, as well as to use them in essays. Findings indicate that although students improved in their ability to locate and use appropriate sources, they still lacked the ability to use in-text citations correctly, and there were discrepancies between students’ in-text citations and their bibliographies. McClure and Cooke conclude that librarians need to work with English Composition faculty to create a learning module on incorporating in-text citations into the writing process. The study underscores the importance of close collaboration between faculty and librarians when tailoring tutorials for a specific course or for specific content in a course.
Mestre, L. S. (2010). Matching up learning styles with learning objects: What’s effective? Journal of Library Administration, 50, 808–829. doi:10.1080/01930826.2010.488975
Mestre’s study examines whether learning objects are designed to meet the varied learning styles of culturally diverse student users. Mestre asserts that although most librarians do not take student preferences into account, most students prefer learning objects with multiple modalities, including images, sound, and interactive elements. Although the study’s limitations include a small student sample and a lack of diverse ethnic representation, Mestre’s findings underscore the importance of addressing learning object design from a pedagogical and user standpoint and stress that the success of student engagement and learning through these objects likely depends on these factors.
Mestre, L. S., Baures, L., Niedbala, M., Bishop, C., Cantrell, S., Perez, A., & Silfen, K. (2011). Learning objects as tools for teaching information literacy online: A survey of librarian usage. College & Research Libraries, 72(3), 236–252. doi:10.5860/crl-130rl
This article is based on a 2008 survey conducted by the Online Learning Research Committee of the ACRL Education and Behavioral Sciences Section (EBSS), which grew out of the committee’s discussion sessions. The survey aimed to determine which online teaching applications librarians are using and how they are designing learning objects embedded in course management systems. Pedagogical considerations, librarian expertise and training, and faculty/librarian relationships are discussed. The study found definite challenges relating to all three factors, prompting the creation of the Librarian’s Toolkit for Online Course Development to support librarians facing these challenges.
Schimming, L. M. (2008). Measuring medical student preference: A comparison of classroom versus online instruction for teaching PubMed. Journal of the Medical Library Association, 96(3), 217–222. doi:10.3163/1536-5050.96.3.007
The article describes a training program and follow-up test for a large group of medical students. The author reports on student participation and response to an online PubMed tutorial and skills assessment created by university librarians. The study’s findings indicate that students participating in a self-guided, online tutorial passed the PubMed skills assessment at the same high rate as students who attended training in person. Results also suggest that students may prefer the flexibility and control of self-guided online training. Most important, this article provides evidence of the positive attributes of asynchronous library instruction and a method of assessing similar programs.
Searing, S. (2012). In it for the long haul: Lessons from a decade of assessment. In T. Peters & J. Rundels (Eds.), The Fifteenth Distance Library Services Conference Proceedings (pp. 291–313). Mount Pleasant, MI: Central Michigan University.
Reviewing ten years of distance education student evaluations, Searing suggests assessment criteria for library orientation and instruction. Her work exemplifies how cyclical assessment processes can improve library instruction. In addition to her example, she provides helpful classifications for evaluating library instruction.
Smith, S. S. (2010). Evaluation, testing, and assessment. In Web-based instruction: A guide for libraries (3rd ed., pp. 177–186). Chicago: American Library Association.
Smith gives a succinct overview of evaluation and testing methods for assessing the effectiveness of software and instructional design processes, as well as the overall effectiveness of completed projects. She describes categories of evaluation (formative and summative) and evaluation methods (user, usability, and inquiry). Several techniques are described for each evaluation method, allowing readers to select the best assessment for their project. She concludes by touching on assessing content mastery. This chapter is a good starting point for those with little knowledge of assessment and evaluation methods, and the book’s appendix has an extensive list of resources to guide readers to additional, in-depth information.
Stagg, A., & Kimmins, L. (2012). Research skills development through collaborative virtual learning environments. Reference Services Review, 40(1), 61–74. doi:10.1108/00907321211203630
Stagg and Kimmins evaluated the successes and challenges of a virtual learning environment (VLE) created collaboratively by the Library, Learning and Teaching Support, and the Faculty of Business and Law at the University of Southern Queensland. The VLE, called My Business Study and Research, was built in a course management system and consisted of screencasts aimed at helping distance students deepen their understanding of the research process and of specific course content. Student feedback indicates that the VLE and screencasts were used heavily at “point of need” and were valued by students. Although the study was unable to determine whether deeper learning actually takes place in the VLE, the authors suggest that similar online learning spaces would be ideal places to provide consistent learning support to students throughout their academic careers.
Washburn, A. (2009). Finding the library in Blackboard: An assessment of library integration. Journal of Online Learning and Teaching, 4(3), 301–316. http://jolt.merlot.org/vol4no3/washburn_0908.pdf
Lee Library at Brigham Young University used survey methods to determine whether university students found the library’s efforts to infuse Blackboard with a section devoted to library research useful. This survey study established that Course Research Pages, when integrated into course management systems, can help students, but there are ongoing challenges to implementation. Over time, difficulties can arise in maintaining sustained awareness of these resources, and much depends on positive collaborations between librarians and faculty. This article is most relevant to libraries that are trying to integrate library resources into a course management system and that are exploring the processes and challenges of marketing and sustaining ongoing awareness of these services to university communities.
Weschke, B., & Canipe, S. (2010). The faculty evaluation process: The first step in fostering professional development in an online university. Journal of College Teaching & Learning, 7(1), 45–58.
Weschke and Canipe outline an extensive faculty evaluation program in a distance learning environment. They describe a culture of improvement that is created by compiling student course evaluations, faculty self-assessments, checklists of activity, and adherence to rubrics. This article will be helpful for those who teach a library course and are interested in developing a thorough evaluation program.