ARC-REESE Criteria & Guidelines for Rating the Methodological Rigor of Educational Research in STEM
Description
ARC was asked by NSF to conduct a pilot project to review the research methodologies employed by a sample of projects funded by the REESE (Research and Evaluation on Education in Science and Engineering) program. ARC convened an expert panel in consultation with NSF to develop standards and a rubric for rating the rigor of REESE projects’ methodologies, with the ultimate goal of reporting on the methodologies employed in the REESE program overall.
Panelists concurred that the guidelines provided in the American Educational Research Association's (2006) Standards for Reporting on Empirical Social Science Research in AERA Publications offered a sound basis for developing standards for assessing methodological rigor. Recognizing, however, that good reporting does not guarantee good science, the panel took care to distinguish the two. Two additional panels commented extensively on preliminary guidelines for raters, drawing on their experiences assessing materials from a sample of 24 completed REESE projects.
Accordingly, while the following guidelines for raters quote extensively from the AERA Standards (in some instances with slight wording changes to aid meaning and understanding), the panelists recommended numerous modifications and elaborations to the AERA Standards to provide guidelines for evaluating the rigor of completed work. Examples have been included to illustrate the types of evidence that should lead a rater to conclude that specific rigor standards have been met.
Author and publisher information is provided below. Note that many publishers charge a fee or require membership for full access; permission or access must be requested directly from the publisher or author.