I’m Amy Gaumer Erickson, Ph.D., and I’m an assistant research professor at the University of Kansas. My work focuses on the evaluation of effective instructional practices within MTSS (multi-tiered system of supports) educational models. I’ve found that though we expect educators to implement interventions with fidelity, we don’t always evaluate the professional development provided to educators.
Hot Tip: Educational evaluation must ensure that professional development is of high quality before educators are held responsible for fidelity of implementation.
To evaluate the quality of training, the U.S. Department of Education Office of Special Education Programs has identified five indicators based on implementation science research:
- Accountability for delivery and quality monitoring of training is clear.
- Adult learning principles are used.
- Training is skill-based.
- Outcome data are collected and analyzed (through pre- and post-testing) to assess participants’ knowledge and skills.
- Trainers are trained, coached, and observed. Data are used to improve trainer skills and the content of training.
External evaluators on State Personnel Development Grants use these indicators to guide their evaluation of training. In practice, my colleagues and I have observed numerous trainings and workshops to evaluate their alignment with the projects’ learning objectives and to assess trainers’ use of adult learning principles and participant rehearsals. We’ve also supported trainers in developing and deploying knowledge-based pre-/post-assessments. Data collected from the assessments and observations provide trainers with evidence of their effectiveness and are used to coach them in areas where they could improve. The result is higher training fidelity.
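For evaluators who want a starting point, the pre-/post-assessment evidence described above can be summarized with a few lines of analysis. The sketch below is illustrative only: the scores are hypothetical, and the `summarize_gains` helper is an assumed name, not part of any published instrument.

```python
from statistics import mean

def summarize_gains(pre, post):
    """Summarize participant knowledge gains from paired pre-/post-assessment scores.

    pre, post: lists of scores for the same participants, in the same order.
    Returns mean pre-score, mean post-score, and mean gain per participant.
    """
    if len(pre) != len(post):
        raise ValueError("pre and post must contain scores for the same participants")
    gains = [after - before for before, after in zip(pre, post)]
    return {
        "mean_pre": mean(pre),
        "mean_post": mean(post),
        "mean_gain": mean(gains),
    }

# Hypothetical scores (percent correct) from a knowledge-based assessment
pre_scores = [55, 60, 48, 70]
post_scores = [78, 82, 66, 88]
print(summarize_gains(pre_scores, post_scores))
```

A summary like this, paired with observation data, gives trainers concrete evidence of whether participants’ knowledge actually grew during training.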
After observing educational trainings across a variety of content areas and conducting a review of research-identified indicators of effective professional development, we developed an observation tool that evaluates the use of adult learning principles through training practices relating to demonstration, engagement, evaluation, and mastery. This tool has supported trainers in developing their content and continuously improving training. Within train-the-trainer models, this tool has been used alongside content-specific pre- and post-testing to ensure that training fidelity is maintained throughout the scale-up process.
Rad Resource: Download the Observation Checklist for High-Quality Professional Development Training, and use it to observe the quality of the professional development provided to educators.
To learn more about the development and implementation of the Observation Checklist for High-Quality Professional Development Training, watch our AEA Ignite Session.
The American Evaluation Association is celebrating Ed Eval TIG Week with our colleagues in the PK12 Educational Evaluation Topical Interest Group. The contributions all this week to aea365 come from our Ed Eval TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to email@example.com. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.