Title
Developing a review rubric for learning resources in digital libraries
Abstract
Over the past 10-15 years, educational digital libraries (DLs) have acquired online learning resources of varying levels of granularity (e.g., from images to entire lessons) and of varying sources of authorship (e.g., grant-funded subject matter experts, K-12 teachers, graduate students). The challenge is to balance collecting and providing access to online learning resources while maintaining a level of resource quality that distinguishes DLs from internet search engines. In response, many educational DL builders have established review rubrics. Although many rubrics have already been created, they are specific to each DL, with little room for re-use outside of the original context. As such, our goals were 1) to synthesize the various dimensions of existing DL rubrics in order to identify a standardized set of criteria that could potentially be used by any DL with online educational resources [1], and 2) to create a review rubric for Instructional Architect (IA; http://ia.usu.edu) projects. IA is a simple, web-based authoring service that supports K-12 teachers in finding and assembling online content into lessons for their classroom. To accomplish the second goal, we developed an IA-specific rubric based on prior literature; evaluated its utility and usability with middle school science and math teachers; tested its reliability; and explored how the rubric could foster teacher skills in designing learning resources. Ultimately, reviewed projects will be included in educational DLs, such as the NSDL. After creating the initial IA review rubric [2], we further modified it and conducted formative evaluations during Fall 2007-Spring 2008 with 25 participants, including K-12 teachers, researchers, school library media specialists, and administrators. In Fall 2008, we conducted a summative evaluation of the rubric [3]. Participants (N=28) were part of a cohort of U.S. K-12 teachers in an online graduate program who completed required activities as part of an online course. Complete data were received from 17 participants. The participants took part in an online learning module in the context of learning how to use the IA and the review rubric. The results of our evaluation indicate that participants found value in the review rubric as a means to improve the quality of their projects through completing and receiving reviews. Teachers reported that before the course module they evaluated an online resource for "fit with the curriculum," "accuracy," "ease of use," "currency," "text readability," and "recommendations by others." After participating in the module, they added to their evaluation criteria: "content quality," "distractions on the resource pages," "credibility of the site," and "will it engage participants." Many of the criteria they added were items listed in the review rubric. Thus, it appeared that use of the rubric helped refine participants' approach to designing learning resources. Participants reported that using the rubric and rating their peers' projects helped them be more thoughtful when creating their own online learning resources. Future work will include creating a workflow for using an external review committee to evaluate projects for inclusion in the National Science Digital Library.
Year
2009
DOI
10.1145/1555400.1555493
Venue
JCDL
Keywords
digital library, review rubric, online education, rubric, digital libraries, ease of use, search engine, subject matter expert, formative evaluation
Field
Online learning, Rubric, Computer science, Usability, Digital library, Multimedia
DocType
Conference
ISSN
2575-7865
Citations
2
PageRank
0.42
References
10
Authors
4
Name            Order   Citations   PageRank
Heather Leary   1       9           1.65
Sarah Giersch   2       68          10.92
Andrew Walker   3       45          14.93
Mimi Recker     4       108         54.75