
Using Developmental Evaluation to Design and Implement Public Health Programs During Times of Crisis by Emily Costello, Julia Bleser, Tracy Wharton, Shakiera Causey, and Oscar Espinosa

Hello, we are Emily Costello, Julia Bleser, Tracy Wharton, Shakiera Causey, and Oscar Espinosa of the National Network of Public Health Institutes’ Research and Evaluation Team. For the last few years, we have been collaborating with the Centers for Disease Control and Prevention to evaluate public health programs that address the many challenges and negative impacts the COVID-19 pandemic continues to have on the nation’s public health.

The pandemic demanded new programs to train staff on the use of personal protective equipment, to prepare for and respond to hurricanes, and to rapidly transfer knowledge from experienced disease investigators to early-career staff, among other needs. It required program developers to tailor program materials at lightning speed to get programs up and running as quickly as possible.

As we emerge from the daily grind of the pandemic, we are reflecting on the strategies we used to navigate heavy stress, punishing timelines, and a steady stream of urgent needs. The first cohorts of program participants have begun applying their newfound knowledge, and we now reflect on the evaluation methods, theory, and practice we used to help shape these critical programs during a period when time to think and plan was severely limited.

Lessons Learned:

Evaluators need not be intimidated by compressed timelines for project development; instead, they can use these timelines to highlight the benefits of evaluation. By prioritizing intentional planning and facilitated discussions with funders and program developers, evaluation can be leveraged to define program success and milestones early on. This is an essential first step to unpack the rationale for the program, identify what success looks like to those tasked with its implementation, and collectively draft a program logic model.

Rad Resource:

Michael Quinn Patton’s Utilization-Focused Evaluation outlines the importance of facilitation when designing an evaluation that is useful to stakeholders. His practical approach includes the Utilization-Focused Evaluation Checklist, which can help guide these critical conversations.

Hot Tip 1: 

Break facilitation sessions up across multiple meetings so you can understand the program at different points during its development. Getting everyone to articulate the program’s rationale and intended objectives will rarely be accomplished in the initial discussion. Evaluation should be a recurring agenda item during planning meetings, rather than an afterthought at the end of the program.

Hot Tip 2: 

Identify someone from your team or network who has experience with the program, topic, or skill(s) being taught and who can help the team understand what is truly valuable about the program.

Lesson Learned:

Rapid assessments following all program-related activities (recruitment, orientation, training materials, etc.) can give developers real-time information to inform improvements to both program processes and content.

Rad Resource:

Using a retrospective pretest design to measure knowledge change in a single assessment can be a practical alternative to administering multiple assessments before and after training sessions and can minimize respondent burden. Allen and Nimon’s 2007 review Retrospective Pretest: A Practical Technique for Professional Development Evaluation further outlines this design and its implications for evaluation.
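
If you tabulate the results yourself, the minimal sketch below shows one way to summarize retrospective pretest responses. The file and column names (retrospective_pretest.csv, before_rating, now_rating) are hypothetical, and the paired t-test is an illustrative analysis choice rather than a prescription from the resource above.

```python
# Minimal sketch: summarizing retrospective pretest data.
# Assumes a hypothetical CSV in which each row is one participant's single
# post-training survey, with 1-5 self-ratings of their knowledge before the
# training (recalled) and now (current).
import pandas as pd
from scipy import stats

responses = pd.read_csv("retrospective_pretest.csv")  # hypothetical file name

change = responses["now_rating"] - responses["before_rating"]
print(f"Mean 'before' rating: {responses['before_rating'].mean():.2f}")
print(f"Mean 'now' rating:    {responses['now_rating'].mean():.2f}")
print(f"Mean change:          {change.mean():.2f}")

# Both ratings come from the same respondent, so a paired test is appropriate.
t_stat, p_value = stats.ttest_rel(responses["now_rating"], responses["before_rating"])
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```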

Hot Tip 1:

Map all program activities and develop a list of the questions funders and program developers need answered. This will give you the key constructs to measure for each program activity or event. Assessing session timing, overall satisfaction, unclear or misunderstood concepts, and recommendations for future activities can generate many ideas for continuous quality improvement.
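
As a concrete illustration of this mapping, here is a minimal sketch in Python; the activities, questions, and constructs are invented placeholders rather than items drawn from our programs.

```python
# Minimal sketch: mapping program activities to the questions funders and
# developers need answered and to the constructs each rapid assessment
# should measure. All entries below are illustrative placeholders.
activity_map = {
    "recruitment": {
        "questions": ["Did we reach the intended audience?"],
        "constructs": ["applicant demographics", "referral source"],
    },
    "orientation": {
        "questions": ["Were expectations and logistics clear?"],
        "constructs": ["clarity of expectations", "overall satisfaction"],
    },
    "training sessions": {
        "questions": ["Which concepts were unclear or misunderstood?"],
        "constructs": ["session timing", "concept clarity",
                       "recommendations for future sessions"],
    },
}

# Each construct becomes one or more items on that activity's rapid assessment.
for activity, plan in activity_map.items():
    print(f"{activity}: {', '.join(plan['constructs'])}")
```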

Hot Tip 2:

Engage and integrate program participants’ experience and expertise as you develop constructs to add to the assessment.  


Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.
