Hello! We are Kari Selander, Emma Jones, Soren Vester Haldrup and Claire Hutchings. We work at Oxford Policy Management, an international development consultancy with offices in Asia, Africa, Europe and North America. Here, we reflect on an evaluation we are undertaking – to share what we’re up to and to ask you to share your experience and insights.
As the demands on evaluation grow, the field itself is evolving from a group of evaluators into a profession of evaluative thinkers. That shift means pushing for more creative and responsive evaluation designs that can grapple with increasingly complex questions and issues while delivering rigour in new ways. We think this is a good thing. Do you?
We are undertaking an evaluation with the Open Government Partnership (OGP) that falls into this category. OGP is a partnership of governments and civil society that has coalesced to foster accountable, responsive and inclusive governance.
The evaluation of the OGP is a two-year process designed to be rigorous, to deliver ‘answers’ to high-level evaluation questions, but to do so in a way that puts learning front and centre. We have developed an evaluation approach that recognises the complexity of these change processes while delivering on rigour and real-time learning.
There is no one methodological or analytical approach that is going to get us the answers OGP would find most helpful, so we are bringing a bunch of approaches to bear, under a Developmental Evaluation umbrella:
- Using different evaluation designs to answer sub-questions rather than marrying the entire evaluation to a specific approach. This allows us to benefit from each method’s strengths without being confined by its limitations. For example, we are using contribution tracing to answer questions about causality, while we hope to use Qualitative Comparative Analysis (QCA) to look across heterogeneous cases and identify the combinations of factors that successfully influence change.
- Embedding researchers in the field so that they can have dynamic conversations with key stakeholders over the life of the evaluation. Researchers are documenting change processes in real time, staying open to following the story where it takes them.
- Safeguarding a flexible fund that will enable us to explore new questions, new change processes, new outcomes over the two years. This will allow us to lean into findings as they emerge without undermining our ability to deliver on the questions OGP have now.
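For readers less familiar with QCA, its core move can be sketched in a few lines of code: group cases by their combination of binary conditions into truth-table rows, then keep the combinations whose cases consistently show the outcome. This is a minimal sketch of the crisp-set idea only; the condition names and case data below are entirely invented for illustration and are not from the OGP evaluation.

```python
# Toy sketch of the crisp-set QCA idea: group cases by their combination of
# binary conditions (one truth-table row per combination), then keep the
# combinations whose cases all share the outcome.

def consistent_combinations(cases):
    """Return condition combinations whose cases consistently show the outcome."""
    rows = {}
    for conditions, outcome in cases:
        key = tuple(sorted(conditions.items()))  # one truth-table row per combination
        rows.setdefault(key, []).append(outcome)
    return [dict(key) for key, outcomes in rows.items() if all(outcomes)]

# Hypothetical cases: ({binary conditions}, binary outcome)
cases = [
    ({"civil_society_pressure": 1, "government_buy_in": 1}, 1),
    ({"civil_society_pressure": 1, "government_buy_in": 1}, 1),
    ({"civil_society_pressure": 1, "government_buy_in": 0}, 0),
    ({"civil_society_pressure": 0, "government_buy_in": 1}, 0),
]

print(consistent_combinations(cases))
# → [{'civil_society_pressure': 1, 'government_buy_in': 1}]
```

In a full QCA the consistent rows would then be minimised with Boolean algebra to find the simplest sufficient combinations, and dedicated QCA software handles that step; this sketch shows only the truth-table stage.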
It’s complicated, with a lot of moving parts. But we think it will be worth it. More of this work is being championed in some corners of the bilateral world (shout out to USAID’s Innovation Lab) and the philanthropic world. Understandably, programme teams are cautious – this new approach to evaluation takes a lot of their time, and requires a different kind of relationship between commissioner and evaluator. It’s too early to tell whether we’ll manage a true ‘developmental evaluation’ or happily settle for a utilisation-focused evaluation (see Tanya Beer’s excellent blog post on this). But we hope that by reflecting as we go, we will grow, and share the approach’s value and pitfalls along the way.
The American Evaluation Association is celebrating International and Cross-Cultural (ICCE) TIG Week with our colleagues in the International and Cross-Cultural Topical Interest Group. The contributions all this week to aea365 come from our ICCE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Dear Sheila Robinson,
I appreciate you sharing the work of Kari Selander, Emma Jones, Soren Vester Haldrup and Claire Hutchings and their innovative approach to evaluation design. I particularly appreciate their use of the term ‘evaluative thinking’, as I think those of us who are not part of an evaluation can also apply this type of thinking when looking to solve problems in our own settings. Further, by approaching evaluation in this more complex way, I feel that evaluators may be better able to process findings, successfully contributing to more positive and lasting change. I also appreciated the flexible funding component: too often, I imagine, new areas of need appear during an evaluative process, but these needs may not directly link to an evaluation’s guiding question. Providing this additional source of funds enables evaluators to honestly and effectively acknowledge research findings in whatever way they present themselves, leading to a more thorough investigation of an evaluation’s guiding question.
Thank you again for sharing this innovative shift in evaluation. When grappling with systemic issues, one can only assume that by continuing to innovate and move towards a more flexible and varied design approach, more enduring answers will appear.
Erin Rochfort