Module 5 - Evaluation

Instruments

Choosing instruments for both research and evaluation questions

We have included instrumentation in the Evaluation module, but many of these suggestions are equally applicable to instruments used to answer your research questions.

Should you design an instrument specific to the evaluation needs in your project?

  • Most ITEST projects do not have the budget or time to develop a quality new instrument; doing so can take more than a year and require multiple large samples of respondents.
  • Using an instrument that doesn’t have evidence of validity and reliability is like using a scale that hasn’t been calibrated – your measurements may be prone to error.
  • Always pilot an instrument with respondents from your target population, even one that has been validated previously. The language or format of an existing instrument may need to be revised to be appropriate, understandable, or culturally relevant for a different group of people; an instrument developed with one population may function very differently with another, and some instruments are harder to administer in some settings than others. Conducting a “think-aloud” exercise or a cognitive interview is a good way to check how respondents understand the items, and a simple internal-consistency check on the pilot responses (sketched below) can help flag items that are not working as intended.
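
To make the pilot-data check above concrete, here is a minimal sketch in Python (the language and everything in the example are our illustration, not part of the course materials) that computes Cronbach's alpha and corrected item-rest correlations for one scale of Likert-style items. The file name pilot_responses.csv, the item names q1-q5, and the 0.7-0.8 rule of thumb are assumptions for the example only.

```python
# Minimal sketch: internal-consistency check on pilot responses.
# Assumes a CSV where rows are respondents and columns q1..q5 are the items
# of a single scale, all scored in the same direction (hypothetical names).
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: (k/(k-1)) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

pilot = pd.read_csv("pilot_responses.csv")[["q1", "q2", "q3", "q4", "q5"]]

alpha = cronbach_alpha(pilot)
print(f"Cronbach's alpha: {alpha:.2f}")  # ~0.7-0.8 or above is a common rule of thumb, not a fixed standard

# Corrected item-rest correlations: unusually low values can flag items that
# your target population may be interpreting differently than the original sample.
for col in pilot.columns:
    rest = pilot.drop(columns=col).sum(axis=1)
    print(f"{col}: item-rest correlation = {pilot[col].corr(rest):.2f}")
```

A check like this supplements, but does not replace, the think-aloud or cognitive-interview feedback described above.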

How can you select from among existing instruments?

  • Does the instrument measure what you want to measure? Look at both the items and the description of the constructs as designed by the authors.
  • Is the instrument intended for the purpose for which you want to use it (for example, a summative evaluation tool may not work well for formative or diagnostic purposes)? Was the instrument developed for your target age group?
  • What evidence of validity and reliability is available for the instrument?
  • Have you identified an independent evaluator or expert who can review instruments for methodological rigor?

How can you find measurable outcomes for your evaluation?

  • SMART: Specific, Measurable, Achievable, Relevant, Timely
    • Specific: for example, identify the number of students you plan to reach.
    • Measurable: look for outcomes with existing instruments that meet the criteria described above.
    • Achievable: is the expected change realistic for your participants and intervention?
    • Relevant: the outcome should plausibly result from the intervention.
    • Timely: the outcome can be measured within the timeframe of the grant.

This course is being preserved for historical purposes. While the project has ended, the materials remain highly relevant for proposal development and can still serve as a valuable resource for NSF proposal writers. The course is no longer maintained, and some content may reference past initiatives or deadlines.