Hello! I am Heather Britt, Senior M&E Specialist with Social Solutions International, the contractor for the United States Agency for International Development (USAID) Expanding Monitoring and Evaluation Capacities Building Task Order (MECap). I co-authored the Outcome Harvesting brief with Ricardo in 2012 and I have been supporting the use of Outcome Harvesting (OH) and other complexity-aware approaches in USAID programming since 2013.
Donors, evaluation commissioners, and managers play a critical role in creating an enabling environment for complexity-aware approaches such as OH. They structure evaluation processes and provide the resources that support participatory and iterative design and implementation.
The OH evaluator and primary intended user co-design the OH and collaborate on major decisions throughout the harvest. A launch workshop is often central to OH co-design. In the workshop, the primary intended user provides the OH evaluator with information about the evaluand, its context, and the decisions that will be informed by the OH. The evaluator coaches users about OH and facilitates the design to ensure that users make informed decisions. The co-design process ensures that the OH meets the specific information needs of the primary intended user.
Many institutionalized processes for planning, contracting, and managing evaluations are not well suited to participatory and adaptive approaches such as OH. In some organizations, processes for planning and procuring evaluations hinder co-design by locking in important decisions without the input of the OH evaluator or by setting aside insufficient resources and time for co-design. In others, the organization hands key decisions to the evaluator, thereby limiting organizational buy-in.
Hot Tip: Evaluation commissioners and managers support a successful OH by including participation and adaptive management as guiding principles in the evaluation scope of work, funding a co-design launch workshop, and ensuring primary intended users take part in co-design at launch and at key points during the harvest.
The harvest process will inevitably require adjustments during implementation. The evaluation plan should allow the evaluator and primary intended users to adapt it together. Large programs with multiple stakeholders may struggle to implement OH adaptively: decision making slows as the number of stakeholders and their degrees of separation from the evaluation increase. Evaluation managers should be proactive about facilitating effective and efficient decision making to keep the OH moving forward.
Hot Tip: Distinguish who will take an active role in evaluation decisions (primary intended users) from those consulted or informed. Convene an evaluation steering committee and clarify members’ roles.
Rad Resources:
Outcome Harvesting: Promises and Pitfalls of Participatory Design: This AEA presentation outlines ways that evaluation commissioners and evaluators can work together to overcome common challenges to commissioning and managing successful OH evaluations.
Eight Promising Practices for M&E in Support of Adaptive Management: This document presents early findings from research on USAID’s use of complexity-aware approaches (including OH). Consider findings #6 and #7 when drafting an OH work plan and schedule. Finding #5 is relevant when recruiting members of the evaluation team.
Hot Tip: Contact the OH forum to identify a qualified OH practitioner to lead or mentor you in the harvest.
The American Evaluation Association is celebrating Outcome Harvesting week. The contributions all this week to aea365 come from colleagues of the late Ricardo Wilson-Grau, originator of Outcome Harvesting, and these articles are written in his honor. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Dear Heather:
I have enjoyed reading your work on OH and have worked on an evaluation project where we used OH to evaluate the implementation of an educational innovation. Would you recommend that the harvester be separate from the actual project? In the project where we used OH, I was part of the implementation team, and I was also the primary lead to harvest artifacts from our project and develop outcome statements. The outcomes (which I developed) were substantiated by project stakeholders; however, I wonder whether the data would have been richer if the harvest had been done by an external evaluator. I acknowledge it would have been less biased as well. What is the standard for this? Thanks so much for all of your work on this innovative approach to evaluation!