Hi, I am Paula Egelson and I am the director of research at the Southern Regional Education Board in Atlanta and a CREATE board member. Much of my current research and evaluation work centers on secondary career technical education (CTE) program effectiveness for teachers and students. The fidelity of implementation, or the degree to which an intervention is delivered as intended, for these programs is always a big issue.
Hot Tip: Pay Attention to Fidelity of Implementation as Programs Roll Out
What we have discovered over time is that factors affecting fidelity of implementation crop up later in the program development process more than we ever expected. For example, CTE programs are usually very equipment heavy. During the field-testing stage, we discovered that, due to a variety of vendor, district, and state ordering issues, participating schools were not able to get equipment into their CTE classrooms until much later in the school year. This impacted teachers’ ability to implement the program properly. In addition, the CTE curriculum is very rich and comprehensive, which we realized requires students to do extensive homework and ideally calls for a 90-minute class block. Finally, we discovered that many teachers who implemented early on were cherry-picking projects to teach rather than covering the entire curriculum.
Once these factors were recognized and addressed, we were able to incorporate them into initial teacher professional development and the school MOU. As a result, program outcomes have become more positive each year. This speaks to the power of acknowledging, emphasizing, and incorporating fidelity of implementation in program evaluations.
Rad Resource: Century, Rudnick, & Freeman’s (2010) American Journal of Evaluation article on Fidelity of Implementation provides a comprehensive framework for understanding the different components of Fidelity of Implementation.
The American Evaluation Association is celebrating Consortium for Research on Educational Assessment and Teaching (CREATE) week. The contributions all this week to aea365 come from members of CREATE. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Thank you for an enlightening post about your experiences with FOI in educational settings. As a front-line educator concurrently doing graduate work related to program evaluation, I appreciated your forthright statement that factors affecting fidelity crop up “more than we ever expected.” Certainly, in overburdened schools with competing interests and rigid bureaucracies, FOI issues can waylay the best of programs.
The article referenced in your blog entry presents an updated framework for evaluating fidelity of implementation by identifying its critical components. The framework is approachable through its explicit delineation of factors, including structural-procedural, structural-educative, instructional-pedagogical, and instructional-student engagement components. Century, Rudnick, and Freeman’s (2010) case example further highlights the extensive range of potential fidelity variables and their placement within the framework.
If this framework were applied to the case study you described in your blog, the issues affecting FOI could easily be summarized. For instance, the delay of materials and the difficulty of fulfilling the full 90-minute teaching block can be placed within the structural-procedural category. You also addressed a common issue I have observed with the implementation of educative programs: you wrote that teachers were “cherry picking” elements of the curriculum for instructional delivery. Initially, I wondered if this represented a structural-educative fidelity issue; that is, do teachers have adequate professional development to understand the necessity of adhering to the full curriculum? However, is it possible that this issue could also represent an instructional-pedagogical problem with program implementation if teachers are not facilitating the presentation of materials as prescribed?
Thank you for providing your rad resource; the article has helped with my own emerging understanding of fidelity of implementation for education-based programming and evaluation.
Century, J., Rudnick, M., & Freeman, C. (2010). A framework for measuring fidelity of implementation: A foundation for shared language and accumulation of knowledge. American Journal of Evaluation, 31(2), 199–218.
Dear Paula,
I am at a school that is implementing a new curriculum framework. Part of the program implementation also involves bringing in ICT. What resonated with me from your post was how unforeseen obstacles to program fidelity can crop up during program implementation or development. We had planned to bring in an interactive whiteboard and student iPads to enrich the resources available to students, and one year later, due to circumstances, only half of that order has been filled. These obstacles have made it difficult for teachers to integrate ICT when they lack adequate resources.
Your post reminds us of the importance of reflecting frequently on the program plan and program fidelity to make sure that participants have the means to carry out the intentions of the program.
Sincerely,
Jenny
Dear Paula Egelson,
Thank you for your post concerning fidelity in program implementation and evaluation. I am a student at Queens University, currently taking a course on program evaluation and your post inspired deeper consideration of fidelity as an aspect of evaluation.
I am involved in a project evaluating a program whose goal is to implement student-led inquiry in word study to increase students’ curiosity and motivation for learning. In evaluating a program aimed at changing teaching practice, fidelity will play a large role, as you experienced in the CTE program. Program effectiveness is clearly a result of the delivery of the program and the commitment of the teachers, in addition to the program itself.
While you emphasize the importance of taking fidelity into account, do you see it as a reflection following the evaluation or as a built-in part of the program theory? In your experience, would it be more effective to evaluate a program after teachers have more experience with student-led, inquiry-based learning? Fidelity would likely be a larger factor in the initial implementation phase than in the mature implementation stage. Although highly dependent on fidelity, an evaluation during initial implementation could provide inspiration and motivation and promote a change of practice if the program is implemented properly with support. In addition, if evaluation is affected by users’ attitudes and involvement, would fidelity be less of a factor in a participatory or utilization-focused evaluation? Any feedback to enhance my learning would be appreciated.
I will take from your post the necessity of acknowledging, emphasizing, and incorporating fidelity of implementation into my program evaluation. Awareness of implementation fidelity will help me foresee and address interfering factors. Thank you for providing a resource from the American Journal of Evaluation to further my understanding of the role of fidelity in evaluative practice.
Sincerely,
Cindy Charland