Hey there, I’m Michael Moses, a Senior Monitoring, Evaluation, and Learning Specialist at EnCompass LLC. We work with U.S. government agencies, private foundations, multilateral organizations, and others that are tackling complex development challenges, from systemic corruption to human trafficking. We help our clients and their partners:
- Develop and implement strategies for addressing these challenges
- Collect and use data to assess progress
- Capture and apply lessons to adapt and improve
Those working in the social sector increasingly acknowledge that there are no out-of-the-box solutions for complex challenges. We know that simply replicating and scaling approaches across contexts is at best unhelpful and at worst actively harmful. Instead, those who seek change need to work collaboratively and adaptively, hand in hand with local partners, and iterate their way to strategies and solutions that best fit the contexts in which they’re working. A utilization-focused, developmental approach to evaluation, one that facilitates and connects just-in-time reflection, learning, and action, is essential to driving change in complex systems.
Lessons Learned
In practice, this sort of action-driven evaluation work looks radically different from client to client, and context to context. But over the years, we’ve seen that if you’re trying to help folks learn how to address complex challenges, the principles you bring to bear are just as important as the evaluation methods you use. Three principles seem especially important:
Function Over Form: Don’t be shy about picking and choosing from different methodologies and combining them in ways that strengthen their collective usefulness, so that you can generate the evidence and learning people need to make decisions. Adaptive bricolage (hat tip to Tom Aston and his great articulation of this sort of approach) may not result in traditional-looking evaluations, but combining different methods—from most significant change to outcome harvesting to participatory action research—can and often does make for a more useful evaluation.
Collective Learning at the Center: Evaluation is about enabling collective learning and action, not just about accountability. Written reports are important, but so are opportunities to process results and data, including as they emerge. Spaces for collective, reflective learning—workshops, dashboards, before- and after-action reviews, and more—help partners and clients link lessons to potential action and make informed choices about next steps.
Collaboration to Facilitate Inclusive Action: Evaluation decisions—including whether an evaluation is desirable in the first place, what it should focus on, and who its intended users are—should involve not just funders, but also the partners and colleagues with whom they work to bring about change. Involving a variety of stakeholders at every stage of the process helps ensure that our evaluation work meets the needs of those most likely to make a difference, not just those paying for the work.
Balancing these principles, and ensuring they’re consistently put into practice, is hard! It takes effort, and it often requires navigating tricky conversations with clients and partners. But when we get the balance right, we see just how much promise best-fit evaluation work holds. We’re excited to do more of this work in the future.
How do you balance form and function in your evaluation work? We’re looking for more partners as we chart the path toward more inclusive developmental evaluation, and we’d love to hear from you.
Rad Resources
Aston, Tom. (April 2020). Bricolage and Alchemy for Evaluation Gold: In this short blog post, Tom Aston argues that combining and adapting evaluation methods can make for a stronger, more useful evaluation approach.
Darling, Marilyn. (2018). How Complex Systems Learn and Adapt: In this brief, Marilyn Darling explains complex adaptive systems theory, and explores how learning can speed adaptation.
Burns, Danny, and Worsley, Stuart. (2015). Navigating Complexity in International Development. Practical Action Publishing: In this seminal book, Burns and Worsley explore how to embed learning in complex systems.
Falconer-Stout, Zachariah, and Jones, Jonathan. (2020). Utilization-Focused Evaluation: Recommendations: Two of my colleagues at EnCompass explain how to make evaluation recommendations useful and usable.