Hello! This is Heidi Kahle and Lauren Toledo, evaluators with Deloitte Consulting, LLP. We work with U.S. federal health agencies to plan, design, and implement program evaluations. Today, we’re writing with our thoughts on ways to use outcome harvesting (OH) to evaluate interventions, programs, and policies implemented in rapidly changing environments where traditional evaluation methods are not the best fit for understanding effectiveness, relevance, and sustainability.
During recent public health emergencies – particularly in the face of the COVID-19 pandemic – we saw firsthand how interventions were rapidly created or adjusted to meet fast-changing needs. In some cases, traditional methods of program monitoring and evaluation that focus on measuring anticipated outcomes driven by a theory of change have not been feasible because of the rapid planning and implementation needed to respond to the crisis.
As we began to think about different methods that we could apply to help organizations evaluate their efforts when there is less time on the front-end to pre-determine benchmarks or build an extensive theory of change, OH emerged as a potential solution. Developed in 2002 by Ricardo Wilson-Grau and colleagues, outcome harvesting uses a retrospective approach to collect evidence and information and then works backwards to determine how an intervention has shaped and contributed to outcomes. The harvest is a highly participatory process that incorporates the perspectives of a broad set of partners and can answer questions about an intervention’s effectiveness, relevance, and sustainability. Because OH is highly participatory, it requires evaluators to consider which actors should be included in the harvesting process and provides an opportunity to develop an equity-driven process, which is a high priority in the face of public health emergencies.
Rad Resources:

- Outcome harvesting | BetterEvaluation
- Outcome harvesting: Principles, steps, and evaluation applications by Ricardo Wilson-Grau (2018)
As we explored examples of how OH has been applied, we found numerous cases in international contexts, usually evaluating media or educational campaigns. We were eager to understand if or how OH could be used to evaluate domestic capacity-building programs, programs with broad discretionary funding, or other policies and programs. We haven’t found many examples of using OH in this way, and we would love to hear examples of how others have used OH or other participatory and retrospective evaluation methods.
As we learned more about OH, we also began thinking of how to use OH in tandem with other highly participatory tools, such as human-centered design (HCD). HCD facilitates deep connection with users and partners to identify solutions and solve problems. Like OH, HCD examines problems from an end-user point of view and is useful in situations that have a high degree of uncertainty or numerous interdependent factors that contribute to outcomes. While HCD is often used to reframe problems and create innovative solutions, the tenets of the HCD approach, and especially its use of divergent and convergent thinking processes, overlap with the iterative nature of outcome harvesting.
Rad Resources: Human Centered Design (HCD) | NIST
We haven’t seen anyone use HCD principles in conjunction with OH, and we’re curious to hear others’ thoughts on applying HCD and other highly participatory tools to OH to enhance their work. We invite you to reply in the comments with your thoughts on using or adapting additional tools for OH.
The American Evaluation Association is hosting Health Evaluation TIG Week with our colleagues in the Health Evaluation Topical Interest Group. The contributions all this week to AEA365 come from our HE TIG members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.