I’m Meklit Berhan Hailemeskal, an evaluation specialist with Deloitte Consulting LLP. I work with both US federal agencies and global development agencies to plan, design, and implement program evaluation and performance measurement initiatives. Working fluidly across federal and global health programs, I have learned specific lessons and found helpful resources from one side that can readily be useful for the other. Today I want to share a couple of useful resources from the global health evaluation space that I believe can be valuable for the federal evaluation space.
My work with federal health programs often involves supporting grantees in one form or another in implementing required performance measurement and evaluation activities. The questions I hear most often from grantees are how the data provided to the funding agency are going to be used and how they can be useful to the grantee. While I will not attempt to answer those questions here, I would like to share some examples of learning platforms used to facilitate evaluation data use in the global evaluation space.
Lesson Learned: Use of evaluation results requires an intentional and systematic approach to translate evaluation/performance measurement findings into realistic and meaningful programmatic recommendations, and a mechanism to work with program managers to monitor the implementation of those recommendations.
Rad Resource: The Independent Evaluation Group of the World Bank Group maintains a Management Action Record Database to document and monitor post-evaluation action. The database lists the key findings and recommendations that emerge from evaluations and tracks the progress of implementing these recommendations at the program level. The database serves as a tool to promote and build accountability for the use of evaluation results.
Lesson Learned: From time to time, it is necessary to reflect in a systematic way on what the impact of evaluation/performance measurement actually is – what have we collectively learned from our evaluation/performance measurement efforts and how has that influenced how we work and what we are able to achieve? Having a systematic process and standardized tools to facilitate this reflection helps hold evaluators and program teams accountable for incorporating evaluation findings and recommendations into program planning and implementation.
Rad Resource: Better Evaluation recently published Evaluations That Make a Difference – a collection of stories from eight countries about how evaluation results (and processes) have been used to influence change within organizations and the lives of people. These stories are great illustrative examples of the meaningful difference that evaluation can bring about when there is intentional and strategic reflection on evaluation results.
While the resources presented here come from a global context, I have found that their intent and use continue to inspire and influence domestic evaluations.
The American Evaluation Association is celebrating Deloitte Consulting LLP’s Program Evaluation Center of Excellence (PE CoE) week. The contributions all this week to aea365 come from PE CoE team members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to firstname.lastname@example.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.