Greetings! We are Estelle Raimondo, an Evaluation Specialist at the World Bank, and Karol Olejniczak, an Associate Professor at the University of Warsaw. Like most of you, we are evaluation nerds, and we can’t wait to join thousands of you in DC in November to learn about “what works and why.” We had the opportunity to work with Prof. Newcomer on conceptualizing this year’s conference, so let us tell you how this particular strand came about and give you three “hot tips” for joining the conversation.
Lessons learned: The theme of “learning what works and why” is primarily a call for collective reflection on what we may call the “learning paradox” that Aristotle eloquently articulated in his time: “the more you know, the more you know you don’t know.” For decades, the evaluation community in its wide diversity has gathered evidence about the effectiveness of a vast array of interventions across sectors and contexts. The conference is the perfect arena to deliberate on (1) what we know now that we didn’t know, let’s say, 10 years ago; (2) missed opportunities for cumulative knowledge; and (3) how we can convey this evidence to policy makers and practitioners.
Hot Tip #1: Even if you are not a methods geek like us, you may want to attend a session on the latest thinking on causal inference. Whether through advances in systems thinking, experiments, or qualitative methods of causal inference, many of us are pushing methodological boundaries to crack the causal nut. For instance, Estelle has used process tracing to assess the impact of engaging citizens on the quality of public services in developing countries. If you are interested, you can join us in November for a demonstration session on the topic.
Rad Resource: A detailed guide on using QCA in evaluations
Hot tip #2: Attend a session that is not strictly in your field. If you are an education expert, why not join a session on what we have learned about effective service delivery in transportation or peace-building? That way we can test the generalizability of each other’s work simply by talking to one another. We bet that, given the common underlying behavioral and social mechanisms that shape interventions’ successes and failures, we have a lot to learn from each other.
Rad Resource: a professional network working on this
Hot tip #3: Learning what works and why is not useful if it doesn’t reach the ears of practitioners and decision-makers from different communities. Try to participate in a session that ponders this issue, or learn from other fields, for instance on how to use games to test proposals for new regulations in a safe environment.
Rad Resource: an insightful article on the topic
We’re looking forward to the Evaluation 2017 annual conference in November, alongside our colleagues in the Local Arrangements Working Group (LAWG). Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.