Disclaimer: The opinions and reflections expressed here are those of the authors and do not necessarily represent the official position of the Centers for Disease Control and Prevention.
Hello! We are Bilquis Khan Jiwani, Omoshalewa Bamkole, and Erin Black from the Division of Workforce Development (DWD) in the National Center for State, Tribal, Local, Territorial Public Health Infrastructure and Workforce at the U.S. Centers for Disease Control and Prevention (CDC). Recently, DWD updated its strategic goals and approach to evaluation. This gave us an opportunity to create a logic model for DWD’s Office of Policy, Partnership, and Recruitment for the first time. We share our experience applying the Joint Committee on Standards for Educational Evaluation (JCSEE) standards to our logic model development to demonstrate how they can bring credibility to, and enhance the quality of, evaluation products.
Incorporating the Evaluation Standards into Logic Model Creation
We started by reviewing existing documentation (e.g., organizational strategic plans, webpages, annual reports, and available data) to understand the policy office’s work and its context. After the document review, we engaged policy office staff (i.e., interest holders) to understand the office’s vision and priorities in relation to DWD’s strategic goals. Over a 6-month period, we held one-on-one meetings and convened as a group in three strategic meetings. Here is how each evaluation standard was reflected in our process.
- Utility: We confirmed our understanding of each interest holder’s evaluation needs. Drawing on their knowledge and expertise, we created a practical, responsive, and valuable product reflecting their work. An “aha” moment came when some staff were pleasantly surprised to see how the logic model visualized their office’s strategies in connection to distinct activities and intended outcomes.
- Feasibility: We facilitated an environment in which each interest holder could contribute. We did this by setting clear expectations for discussions, providing timely verbal and email communications, and making documents accessible for effective and efficient collaboration.
- Accuracy: We gathered continuous, iterative feedback from our interest holders by holding regular touchpoints with them, resulting in a context-specific, reliable, and valid logic model that fulfilled the office’s overall needs.
- Propriety: To ensure the logic model could guide the policy office team, we intentionally focused on balancing the diverse perspectives of this group, which includes policy advisors, policy analysts, a data manager, and student interns. In our final strategic meeting, we presented the logic model to confirm that it clearly and fairly addressed the needs and purpose of each interest holder.
- Evaluation accountability: This experience helped our interest holders see the value of the logic model and its potential to strengthen their work. They agreed to revisit it annually during strategic meetings to renew their commitment and responsibility to program improvement.
Reflecting on the Experience
Our intentional focus on the evaluation standards while developing this logic model helped us more fully understand our interest holders and their needs. Our interest holders shared that they felt included and contributed meaningfully throughout the process. Moreover, it deepened their understanding of the interconnectedness among strategies, activities, and outcomes.
Takeaways
- Look at the big picture: Align your work with an existing organizational strategic framework.
- Engage your interest holders: They help to clarify thinking and underlying assumptions, and to build consensus regarding the program’s mission, activities, and goals.
- Encourage and incorporate diverse perspectives: Ensure perspectives are not only heard and understood, but also included in the final evaluation product.
Rad Resources
- Read the Program Evaluation Standards publication for a deeper dive into the standards.
- Reference the Program Evaluation Standards using this checklist from Western Michigan University’s Evaluation Center.
- Check out tips for involving others in logic model development from the University of Wisconsin-Madison’s Division of Extension.
The American Evaluation Association is hosting Gov Eval TIG Week with our colleagues in the Government Evaluation Topical Interest Group. The contributions all this week to AEA365 come from our Gov Eval TIG members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.