Resources

Resources included in these libraries were submitted by ITEST projects or STELAR and are relevant to the work of the NSF ITEST Program. PDFs and/or URLs to the original resource are included in the resource description whenever possible. In some cases, full-text publications are located behind publishers’ paywalls, and a fee or membership to the third-party site may be required for access. Permission for use must be requested through the publisher or author listed in each entry.


A Program Director's Guide to Evaluating STEM Education Programs: Lessons Learned from Local, State, and National Initiatives

Publication

In today's world of high accountability, strong evidence of intended outcomes is key to building the credibility and replicability of science, technology, engineering, and mathematics (STEM) programs. This primer, for program directors/managers, educators, and others responsible for developing and implementing STEM programs, provides evaluation guidelines and resources for program leaders who are implementing STEM programs in schools and community-based organizations, and reduces "evaluation anxiety" for individuals who are not professional evaluators by offering guidelines for good evaluation.


How Can Multi-Site Evaluations Be Participatory?

Publication

Multi-site evaluations are becoming increasingly common in federal funding portfolios. Although much thought has been given to multi-site evaluation, there has been little emphasis on how it might interact with participatory evaluation. Therefore, this paper reviews several National Science Foundation educational, multi-site evaluations for the purpose of examining the extent to which these evaluations are participatory. Based on this examination, the paper proposes a model for implementing multi-site, participatory evaluation.


A Guide to Evaluation Primers

Publication

Produced by the Association for the Study and Development of Community (ASDC) for the Robert Wood Johnson Foundation, this document offers an orientation to handbooks and basic primers on evaluation. These resources are designed to meet the needs of the non-expert, explaining some of the central issues in evaluation and why they are important. The Guide itself helps organizations to assess which evaluation primer might be most useful for their needs and what to expect, in terms of information, from each.


Lessons Learned About Science and Participation from Multisite Evaluations

Publication

Publicly funded multisite evaluations should help bring safe, effective interventions to the mental health and substance abuse treatment fields. Two principles, science-based practice and stakeholder participation, drive multisite evaluations of behavioral interventions. We examine the roles of these principles in the five programs described in this volume and draw lessons for future studies.


OERL Evaluation Reports for ITEST Projects

Publication

OERL, the Online Evaluation Resource Library, was developed for professionals seeking to design, conduct, document, or review project evaluations. The purpose of this system is to collect and make available evaluation plans, instruments, and reports for NSF projects that can be used as examples by Principal Investigators, project evaluators, and others outside the NSF community as they design proposals and projects. OERL also includes professional development modules that can be used to better understand and utilize the materials made available. This entry presents a collection of evaluation report resources for ITEST projects.


The ITEST Learning Resource Center's Online Evaluation Database: Examples from the Collection

Publication

The ITEST Learning Resource Center at EDC developed an online database of instruments developed by ITEST project evaluators and researchers from 2003 to 2007. This 2007 article details the purpose and development of that database and highlights three instruments from it that represent the kind of evaluation tools archived there. While the database is no longer available, the three instruments described in the article provide useful examples of project-developed evaluation tools. More information about the database and the instruments can be obtained by emailing stelar@edc.org.


Inclusion, Disabilities, and Informal Science Learning

Publication

Inclusion, Disabilities, and Informal Science Learning, a report by the CAISE Access Inquiry Group, offers a theoretical framework for thinking about inclusion of people with disabilities in informal science education (ISE), then reviews current practice in museums (broadly defined), in media and technology, and in youth and community programs.

