
PD Presenters Week: Monica Oliver and Krista Collins on Creating an Index for Measuring Fidelity of Implementation

Hi! We are Monica Oliver and Krista Collins, evaluation practitioners in Atlanta, GA. Through our extensive work with the US Department of Education (e.g., Investing in Innovation (i3), Race to the Top-District), we have constructed implementation fidelity frameworks for several complex, multi-level education programs. At the past two AEA annual conferences we, along with our colleagues, presented a step-by-step process for working with stakeholders to compute a fidelity index: an overall summative score that assesses the extent to which the program in reality aligns with the program in theory. We demonstrated how to work with program staff in various contexts and stages of program development to identify the program’s core components, select quantifiable fidelity criteria, and compute fidelity scores that give program staff concrete information to guide implementation decisions.
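To make that idea concrete, here is a minimal sketch in Python of one way a fidelity index might be computed. The component names, thresholds, and pass/fail scoring rule are illustrative assumptions for this post, not the framework presented in our sessions or workshop.

```python
# Minimal sketch of one way to compute a fidelity index.
# Component names, thresholds, and the scoring/rollup rules below are
# illustrative assumptions, not a prescribed framework.

# Each core component has a quantifiable fidelity criterion: the value observed
# during implementation and the threshold that counts as "implemented with fidelity."
components = {
    "coaching_sessions_delivered": {"observed": 9,    "threshold": 10},    # sessions per semester
    "teachers_trained_pct":        {"observed": 0.92, "threshold": 0.80},  # proportion trained
    "curriculum_units_completed":  {"observed": 7,    "threshold": 8},     # units per year
}

def component_score(observed, threshold):
    """Score a single component 1 if it meets its fidelity criterion, else 0."""
    return 1 if observed >= threshold else 0

# Score each component, then average the scores into an overall index (0 to 1).
scores = {
    name: component_score(c["observed"], c["threshold"])
    for name, c in components.items()
}
fidelity_index = sum(scores.values()) / len(scores)

for name, score in scores.items():
    print(f"{name}: {'criterion met' if score else 'criterion not met'}")
print(f"Overall fidelity index: {fidelity_index:.2f}")
```

In practice, teams may weight components differently or use graded thresholds (for example, adequate versus high fidelity) rather than a simple pass/fail rule; deciding among options like these with stakeholders is the kind of work the process above is meant to support.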

We’ve expanded this conversation into an interactive half-day workshop at this year’s conference. Our goal is for participants to walk away with the following information:

  • How measuring implementation fidelity can foster thorough process evaluation and inform programmatic decision-making
  • What a fidelity index is and how it can dovetail with an impact study
  • How to construct a fidelity index, including engaging stakeholders in the process
  • How to compute a fidelity index once it is constructed and data is collected
  • How to interpret fidelity indices and use them for program improvement, and
  • How fidelity indices need to be modified as programs develop and age.

Cool Trick: Approach fidelity assessment by first identifying your program’s key direct service components. This is harder than it sounds: not every program activity is a direct service, and not every component is key. Naming the key service components of your program at the outset will give you a running start toward assembling a comprehensive fidelity index.

Rad Resource: Look for our presentation materials in the AEA Public Library to get ready for the workshop.

  • 2013: Computing a Fidelity Index on a Program with Multiple Components
  • 2014: Assessing Program Fidelity across Multiple Contexts: The Fidelity Index, Part II

Rad Resource: The David P. Weikart Center for Youth Program Quality conducted a Youth Program Quality Intervention Study to understand best practices in youth program implementation. They describe a detailed institutional process for assessing implementation fidelity within an after-school setting to validate an effective and sustainable intervention model.

Rad Resource: Use this Tiered Fidelity Inventory, developed by the U.S. Department of Education’s Office of Special Education Programs Technical Assistance Center, to measure school-wide Positive Behavioral Interventions and Supports.

Want to learn more? Register for Creating an Index for Measuring Fidelity of Implementation at Evaluation 2015 in Chicago, IL.

This week, we’re featuring posts by people who will be presenting Professional Development workshops at Evaluation 2015 in Chicago, IL. Click here for a complete listing of Professional Development workshops offered at Evaluation 2015. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
